Jan 21 22:44:59 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 21 22:44:59 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 21 22:44:59 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 22:44:59 localhost kernel: BIOS-provided physical RAM map:
Jan 21 22:44:59 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 21 22:44:59 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 21 22:44:59 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 21 22:44:59 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 21 22:44:59 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 21 22:44:59 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 21 22:44:59 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 21 22:44:59 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 21 22:44:59 localhost kernel: NX (Execute Disable) protection: active
Jan 21 22:44:59 localhost kernel: APIC: Static calls initialized
Jan 21 22:44:59 localhost kernel: SMBIOS 2.8 present.
Jan 21 22:44:59 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 21 22:44:59 localhost kernel: Hypervisor detected: KVM
Jan 21 22:44:59 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 21 22:44:59 localhost kernel: kvm-clock: using sched offset of 3404942168 cycles
Jan 21 22:44:59 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 21 22:44:59 localhost kernel: tsc: Detected 2799.998 MHz processor
Jan 21 22:44:59 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 21 22:44:59 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 21 22:44:59 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 21 22:44:59 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 21 22:44:59 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 21 22:44:59 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 21 22:44:59 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 21 22:44:59 localhost kernel: Using GB pages for direct mapping
Jan 21 22:44:59 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 21 22:44:59 localhost kernel: ACPI: Early table checksum verification disabled
Jan 21 22:44:59 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 21 22:44:59 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 22:44:59 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 22:44:59 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 22:44:59 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 21 22:44:59 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 22:44:59 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 21 22:44:59 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 21 22:44:59 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 21 22:44:59 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 21 22:44:59 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 21 22:44:59 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 21 22:44:59 localhost kernel: No NUMA configuration found
Jan 21 22:44:59 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 21 22:44:59 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 21 22:44:59 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 21 22:44:59 localhost kernel: Zone ranges:
Jan 21 22:44:59 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 21 22:44:59 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 21 22:44:59 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 21 22:44:59 localhost kernel:   Device   empty
Jan 21 22:44:59 localhost kernel: Movable zone start for each node
Jan 21 22:44:59 localhost kernel: Early memory node ranges
Jan 21 22:44:59 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 21 22:44:59 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 21 22:44:59 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 21 22:44:59 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 21 22:44:59 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 21 22:44:59 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 21 22:44:59 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 21 22:44:59 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 21 22:44:59 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 21 22:44:59 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 21 22:44:59 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 21 22:44:59 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 21 22:44:59 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 21 22:44:59 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 21 22:44:59 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 21 22:44:59 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 21 22:44:59 localhost kernel: TSC deadline timer available
Jan 21 22:44:59 localhost kernel: CPU topo: Max. logical packages:   8
Jan 21 22:44:59 localhost kernel: CPU topo: Max. logical dies:       8
Jan 21 22:44:59 localhost kernel: CPU topo: Max. dies per package:   1
Jan 21 22:44:59 localhost kernel: CPU topo: Max. threads per core:   1
Jan 21 22:44:59 localhost kernel: CPU topo: Num. cores per package:     1
Jan 21 22:44:59 localhost kernel: CPU topo: Num. threads per package:   1
Jan 21 22:44:59 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 21 22:44:59 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 21 22:44:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 21 22:44:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 21 22:44:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 21 22:44:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 21 22:44:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 21 22:44:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 21 22:44:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 21 22:44:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 21 22:44:59 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 21 22:44:59 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 21 22:44:59 localhost kernel: Booting paravirtualized kernel on KVM
Jan 21 22:44:59 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 21 22:44:59 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 21 22:44:59 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 21 22:44:59 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 21 22:44:59 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 21 22:44:59 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 21 22:44:59 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 22:44:59 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 21 22:44:59 localhost kernel: random: crng init done
Jan 21 22:44:59 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 21 22:44:59 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 21 22:44:59 localhost kernel: Fallback order for Node 0: 0 
Jan 21 22:44:59 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 21 22:44:59 localhost kernel: Policy zone: Normal
Jan 21 22:44:59 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 21 22:44:59 localhost kernel: software IO TLB: area num 8.
Jan 21 22:44:59 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 21 22:44:59 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 21 22:44:59 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 21 22:44:59 localhost kernel: Dynamic Preempt: voluntary
Jan 21 22:44:59 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 21 22:44:59 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 21 22:44:59 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 21 22:44:59 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 21 22:44:59 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 21 22:44:59 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 21 22:44:59 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 21 22:44:59 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 21 22:44:59 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 22:44:59 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 22:44:59 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 21 22:44:59 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 21 22:44:59 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 21 22:44:59 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 21 22:44:59 localhost kernel: Console: colour VGA+ 80x25
Jan 21 22:44:59 localhost kernel: printk: console [ttyS0] enabled
Jan 21 22:44:59 localhost kernel: ACPI: Core revision 20230331
Jan 21 22:44:59 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 21 22:44:59 localhost kernel: x2apic enabled
Jan 21 22:44:59 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 21 22:44:59 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 21 22:44:59 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 21 22:44:59 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 21 22:44:59 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 21 22:44:59 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 21 22:44:59 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 21 22:44:59 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 21 22:44:59 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 21 22:44:59 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 21 22:44:59 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 21 22:44:59 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 21 22:44:59 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 21 22:44:59 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 21 22:44:59 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 21 22:44:59 localhost kernel: x86/bugs: return thunk changed
Jan 21 22:44:59 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 21 22:44:59 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 21 22:44:59 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 21 22:44:59 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 21 22:44:59 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 21 22:44:59 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 21 22:44:59 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 21 22:44:59 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 21 22:44:59 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 21 22:44:59 localhost kernel: landlock: Up and running.
Jan 21 22:44:59 localhost kernel: Yama: becoming mindful.
Jan 21 22:44:59 localhost kernel: SELinux:  Initializing.
Jan 21 22:44:59 localhost kernel: LSM support for eBPF active
Jan 21 22:44:59 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 21 22:44:59 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 21 22:44:59 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 21 22:44:59 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 21 22:44:59 localhost kernel: ... version:                0
Jan 21 22:44:59 localhost kernel: ... bit width:              48
Jan 21 22:44:59 localhost kernel: ... generic registers:      6
Jan 21 22:44:59 localhost kernel: ... value mask:             0000ffffffffffff
Jan 21 22:44:59 localhost kernel: ... max period:             00007fffffffffff
Jan 21 22:44:59 localhost kernel: ... fixed-purpose events:   0
Jan 21 22:44:59 localhost kernel: ... event mask:             000000000000003f
Jan 21 22:44:59 localhost kernel: signal: max sigframe size: 1776
Jan 21 22:44:59 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 21 22:44:59 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 21 22:44:59 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 21 22:44:59 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 21 22:44:59 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 21 22:44:59 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 21 22:44:59 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Jan 21 22:44:59 localhost kernel: node 0 deferred pages initialised in 10ms
Jan 21 22:44:59 localhost kernel: Memory: 7763820K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618356K reserved, 0K cma-reserved)
Jan 21 22:44:59 localhost kernel: devtmpfs: initialized
Jan 21 22:44:59 localhost kernel: x86/mm: Memory block size: 128MB
Jan 21 22:44:59 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 21 22:44:59 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 21 22:44:59 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 21 22:44:59 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 21 22:44:59 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 21 22:44:59 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 21 22:44:59 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 21 22:44:59 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 21 22:44:59 localhost kernel: audit: type=2000 audit(1769035497.322:1): state=initialized audit_enabled=0 res=1
Jan 21 22:44:59 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 21 22:44:59 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 21 22:44:59 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 21 22:44:59 localhost kernel: cpuidle: using governor menu
Jan 21 22:44:59 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 21 22:44:59 localhost kernel: PCI: Using configuration type 1 for base access
Jan 21 22:44:59 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 21 22:44:59 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 21 22:44:59 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 21 22:44:59 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 21 22:44:59 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 21 22:44:59 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 21 22:44:59 localhost kernel: Demotion targets for Node 0: null
Jan 21 22:44:59 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 21 22:44:59 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 21 22:44:59 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 21 22:44:59 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 21 22:44:59 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 21 22:44:59 localhost kernel: ACPI: Interpreter enabled
Jan 21 22:44:59 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 21 22:44:59 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 21 22:44:59 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 21 22:44:59 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 21 22:44:59 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 21 22:44:59 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 21 22:44:59 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [3] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [4] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [5] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [6] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [7] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [8] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [9] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [10] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [11] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [12] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [13] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [14] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [15] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [16] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [17] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [18] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [19] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [20] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [21] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [22] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [23] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [24] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [25] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [26] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [27] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [28] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [29] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [30] registered
Jan 21 22:44:59 localhost kernel: acpiphp: Slot [31] registered
Jan 21 22:44:59 localhost kernel: PCI host bridge to bus 0000:00
Jan 21 22:44:59 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 21 22:44:59 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 21 22:44:59 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 21 22:44:59 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 21 22:44:59 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 21 22:44:59 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 21 22:44:59 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 21 22:44:59 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 21 22:44:59 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 21 22:44:59 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 21 22:44:59 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 21 22:44:59 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 21 22:44:59 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 21 22:44:59 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 21 22:44:59 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 21 22:44:59 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 21 22:44:59 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 21 22:44:59 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 21 22:44:59 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 21 22:44:59 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 21 22:44:59 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 21 22:44:59 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 21 22:44:59 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 21 22:44:59 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 21 22:44:59 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 21 22:44:59 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 21 22:44:59 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 21 22:44:59 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 21 22:44:59 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 21 22:44:59 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 21 22:44:59 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 21 22:44:59 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 21 22:44:59 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 21 22:44:59 localhost kernel: iommu: Default domain type: Translated
Jan 21 22:44:59 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 21 22:44:59 localhost kernel: SCSI subsystem initialized
Jan 21 22:44:59 localhost kernel: ACPI: bus type USB registered
Jan 21 22:44:59 localhost kernel: usbcore: registered new interface driver usbfs
Jan 21 22:44:59 localhost kernel: usbcore: registered new interface driver hub
Jan 21 22:44:59 localhost kernel: usbcore: registered new device driver usb
Jan 21 22:44:59 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 21 22:44:59 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 21 22:44:59 localhost kernel: PTP clock support registered
Jan 21 22:44:59 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 21 22:44:59 localhost kernel: NetLabel: Initializing
Jan 21 22:44:59 localhost kernel: NetLabel:  domain hash size = 128
Jan 21 22:44:59 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 21 22:44:59 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 21 22:44:59 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 21 22:44:59 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 21 22:44:59 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 21 22:44:59 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 21 22:44:59 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 21 22:44:59 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 21 22:44:59 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 21 22:44:59 localhost kernel: vgaarb: loaded
Jan 21 22:44:59 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 21 22:44:59 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 21 22:44:59 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 21 22:44:59 localhost kernel: pnp: PnP ACPI init
Jan 21 22:44:59 localhost kernel: pnp 00:03: [dma 2]
Jan 21 22:44:59 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 21 22:44:59 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 21 22:44:59 localhost kernel: NET: Registered PF_INET protocol family
Jan 21 22:44:59 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 21 22:44:59 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 21 22:44:59 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 21 22:44:59 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 21 22:44:59 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 21 22:44:59 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 21 22:44:59 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 21 22:44:59 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 21 22:44:59 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 21 22:44:59 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 21 22:44:59 localhost kernel: NET: Registered PF_XDP protocol family
Jan 21 22:44:59 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 21 22:44:59 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 21 22:44:59 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 21 22:44:59 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 21 22:44:59 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 21 22:44:59 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 21 22:44:59 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 21 22:44:59 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 108542 usecs
Jan 21 22:44:59 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 21 22:44:59 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 21 22:44:59 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 21 22:44:59 localhost kernel: ACPI: bus type thunderbolt registered
Jan 21 22:44:59 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 21 22:44:59 localhost kernel: Initialise system trusted keyrings
Jan 21 22:44:59 localhost kernel: Key type blacklist registered
Jan 21 22:44:59 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 21 22:44:59 localhost kernel: zbud: loaded
Jan 21 22:44:59 localhost kernel: integrity: Platform Keyring initialized
Jan 21 22:44:59 localhost kernel: integrity: Machine keyring initialized
Jan 21 22:44:59 localhost kernel: Freeing initrd memory: 87956K
Jan 21 22:44:59 localhost kernel: NET: Registered PF_ALG protocol family
Jan 21 22:44:59 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 21 22:44:59 localhost kernel: Key type asymmetric registered
Jan 21 22:44:59 localhost kernel: Asymmetric key parser 'x509' registered
Jan 21 22:44:59 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 21 22:44:59 localhost kernel: io scheduler mq-deadline registered
Jan 21 22:44:59 localhost kernel: io scheduler kyber registered
Jan 21 22:44:59 localhost kernel: io scheduler bfq registered
Jan 21 22:44:59 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 21 22:44:59 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 21 22:44:59 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 21 22:44:59 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 21 22:44:59 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 21 22:44:59 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 21 22:44:59 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 21 22:44:59 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 21 22:44:59 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 21 22:44:59 localhost kernel: Non-volatile memory driver v1.3
Jan 21 22:44:59 localhost kernel: rdac: device handler registered
Jan 21 22:44:59 localhost kernel: hp_sw: device handler registered
Jan 21 22:44:59 localhost kernel: emc: device handler registered
Jan 21 22:44:59 localhost kernel: alua: device handler registered
Jan 21 22:44:59 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 21 22:44:59 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 21 22:44:59 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 21 22:44:59 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 21 22:44:59 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 21 22:44:59 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 21 22:44:59 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 21 22:44:59 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 21 22:44:59 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 21 22:44:59 localhost kernel: hub 1-0:1.0: USB hub found
Jan 21 22:44:59 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 21 22:44:59 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 21 22:44:59 localhost kernel: usbserial: USB Serial support registered for generic
Jan 21 22:44:59 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 21 22:44:59 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 21 22:44:59 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 21 22:44:59 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 21 22:44:59 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 21 22:44:59 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 21 22:44:59 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 21 22:44:59 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 21 22:44:59 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 21 22:44:59 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-21T22:44:58 UTC (1769035498)
Jan 21 22:44:59 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 21 22:44:59 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 21 22:44:59 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 21 22:44:59 localhost kernel: usbcore: registered new interface driver usbhid
Jan 21 22:44:59 localhost kernel: usbhid: USB HID core driver
Jan 21 22:44:59 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 21 22:44:59 localhost kernel: Initializing XFRM netlink socket
Jan 21 22:44:59 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 21 22:44:59 localhost kernel: Segment Routing with IPv6
Jan 21 22:44:59 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 21 22:44:59 localhost kernel: mpls_gso: MPLS GSO support
Jan 21 22:44:59 localhost kernel: IPI shorthand broadcast: enabled
Jan 21 22:44:59 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 21 22:44:59 localhost kernel: AES CTR mode by8 optimization enabled
Jan 21 22:44:59 localhost kernel: sched_clock: Marking stable (1380026973, 146857665)->(1667424221, -140539583)
Jan 21 22:44:59 localhost kernel: registered taskstats version 1
Jan 21 22:44:59 localhost kernel: Loading compiled-in X.509 certificates
Jan 21 22:44:59 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 21 22:44:59 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 21 22:44:59 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 21 22:44:59 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 21 22:44:59 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 21 22:44:59 localhost kernel: Demotion targets for Node 0: null
Jan 21 22:44:59 localhost kernel: page_owner is disabled
Jan 21 22:44:59 localhost kernel: Key type .fscrypt registered
Jan 21 22:44:59 localhost kernel: Key type fscrypt-provisioning registered
Jan 21 22:44:59 localhost kernel: Key type big_key registered
Jan 21 22:44:59 localhost kernel: Key type encrypted registered
Jan 21 22:44:59 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 21 22:44:59 localhost kernel: Loading compiled-in module X.509 certificates
Jan 21 22:44:59 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 21 22:44:59 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 21 22:44:59 localhost kernel: ima: No architecture policies found
Jan 21 22:44:59 localhost kernel: evm: Initialising EVM extended attributes:
Jan 21 22:44:59 localhost kernel: evm: security.selinux
Jan 21 22:44:59 localhost kernel: evm: security.SMACK64 (disabled)
Jan 21 22:44:59 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 21 22:44:59 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 21 22:44:59 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 21 22:44:59 localhost kernel: evm: security.apparmor (disabled)
Jan 21 22:44:59 localhost kernel: evm: security.ima
Jan 21 22:44:59 localhost kernel: evm: security.capability
Jan 21 22:44:59 localhost kernel: evm: HMAC attrs: 0x1
Jan 21 22:44:59 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 21 22:44:59 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 21 22:44:59 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 21 22:44:59 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 21 22:44:59 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 21 22:44:59 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 21 22:44:59 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 21 22:44:59 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 21 22:44:59 localhost kernel: Running certificate verification RSA selftest
Jan 21 22:44:59 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 21 22:44:59 localhost kernel: Running certificate verification ECDSA selftest
Jan 21 22:44:59 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 21 22:44:59 localhost kernel: clk: Disabling unused clocks
Jan 21 22:44:59 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 21 22:44:59 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 21 22:44:59 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 21 22:44:59 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 21 22:44:59 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 21 22:44:59 localhost kernel: Run /init as init process
Jan 21 22:44:59 localhost kernel:   with arguments:
Jan 21 22:44:59 localhost kernel:     /init
Jan 21 22:44:59 localhost kernel:   with environment:
Jan 21 22:44:59 localhost kernel:     HOME=/
Jan 21 22:44:59 localhost kernel:     TERM=linux
Jan 21 22:44:59 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 21 22:44:59 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 21 22:44:59 localhost systemd[1]: Detected virtualization kvm.
Jan 21 22:44:59 localhost systemd[1]: Detected architecture x86-64.
Jan 21 22:44:59 localhost systemd[1]: Running in initrd.
Jan 21 22:44:59 localhost systemd[1]: No hostname configured, using default hostname.
Jan 21 22:44:59 localhost systemd[1]: Hostname set to <localhost>.
Jan 21 22:44:59 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 21 22:44:59 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 21 22:44:59 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 21 22:44:59 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 21 22:44:59 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 21 22:44:59 localhost systemd[1]: Reached target Local File Systems.
Jan 21 22:44:59 localhost systemd[1]: Reached target Path Units.
Jan 21 22:44:59 localhost systemd[1]: Reached target Slice Units.
Jan 21 22:44:59 localhost systemd[1]: Reached target Swaps.
Jan 21 22:44:59 localhost systemd[1]: Reached target Timer Units.
Jan 21 22:44:59 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 21 22:44:59 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 21 22:44:59 localhost systemd[1]: Listening on Journal Socket.
Jan 21 22:44:59 localhost systemd[1]: Listening on udev Control Socket.
Jan 21 22:44:59 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 21 22:44:59 localhost systemd[1]: Reached target Socket Units.
Jan 21 22:44:59 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 21 22:44:59 localhost systemd[1]: Starting Journal Service...
Jan 21 22:44:59 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 21 22:44:59 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 21 22:44:59 localhost systemd[1]: Starting Create System Users...
Jan 21 22:44:59 localhost systemd[1]: Starting Setup Virtual Console...
Jan 21 22:44:59 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 21 22:44:59 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 21 22:44:59 localhost systemd[1]: Finished Create System Users.
Jan 21 22:44:59 localhost systemd-journald[307]: Journal started
Jan 21 22:44:59 localhost systemd-journald[307]: Runtime Journal (/run/log/journal/cf9153dca08f47ce9ee555ef01f58da9) is 8.0M, max 153.6M, 145.6M free.
Jan 21 22:44:59 localhost systemd-sysusers[312]: Creating group 'users' with GID 100.
Jan 21 22:44:59 localhost systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Jan 21 22:44:59 localhost systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 21 22:44:59 localhost systemd[1]: Started Journal Service.
Jan 21 22:44:59 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 21 22:44:59 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 21 22:44:59 localhost systemd[1]: Finished Setup Virtual Console.
Jan 21 22:44:59 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 21 22:44:59 localhost systemd[1]: Starting dracut cmdline hook...
Jan 21 22:44:59 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 21 22:44:59 localhost dracut-cmdline[326]: dracut-9 dracut-057-102.git20250818.el9
Jan 21 22:44:59 localhost dracut-cmdline[326]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 21 22:44:59 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 21 22:44:59 localhost systemd[1]: Finished dracut cmdline hook.
Jan 21 22:44:59 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 21 22:44:59 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 21 22:44:59 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 21 22:44:59 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 21 22:44:59 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 21 22:44:59 localhost kernel: RPC: Registered udp transport module.
Jan 21 22:44:59 localhost kernel: RPC: Registered tcp transport module.
Jan 21 22:44:59 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 21 22:44:59 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 21 22:45:00 localhost rpc.statd[442]: Version 2.5.4 starting
Jan 21 22:45:00 localhost rpc.statd[442]: Initializing NSM state
Jan 21 22:45:00 localhost rpc.idmapd[447]: Setting log level to 0
Jan 21 22:45:00 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 21 22:45:00 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 21 22:45:00 localhost systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Jan 21 22:45:00 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 21 22:45:00 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 21 22:45:00 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 21 22:45:00 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 21 22:45:00 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 21 22:45:00 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 21 22:45:00 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 21 22:45:00 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 22:45:00 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 21 22:45:00 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 21 22:45:00 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 21 22:45:00 localhost systemd[1]: Reached target Network.
Jan 21 22:45:00 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 21 22:45:00 localhost systemd[1]: Starting dracut initqueue hook...
Jan 21 22:45:00 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 21 22:45:00 localhost systemd[1]: Reached target System Initialization.
Jan 21 22:45:00 localhost systemd[1]: Reached target Basic System.
Jan 21 22:45:00 localhost kernel: libata version 3.00 loaded.
Jan 21 22:45:00 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 21 22:45:00 localhost systemd-udevd[492]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 22:45:00 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 21 22:45:00 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 21 22:45:00 localhost kernel:  vda: vda1
Jan 21 22:45:00 localhost kernel: scsi host0: ata_piix
Jan 21 22:45:00 localhost kernel: scsi host1: ata_piix
Jan 21 22:45:00 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 21 22:45:00 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 21 22:45:00 localhost kernel: ata1: found unknown device (class 0)
Jan 21 22:45:00 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 21 22:45:00 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 21 22:45:00 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 21 22:45:00 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 21 22:45:00 localhost systemd[1]: Reached target Initrd Root Device.
Jan 21 22:45:00 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 21 22:45:00 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 21 22:45:00 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 21 22:45:00 localhost systemd[1]: Finished dracut initqueue hook.
Jan 21 22:45:00 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 21 22:45:00 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 21 22:45:00 localhost systemd[1]: Reached target Remote File Systems.
Jan 21 22:45:00 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 21 22:45:00 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 21 22:45:00 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 21 22:45:00 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Jan 21 22:45:00 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 21 22:45:00 localhost systemd[1]: Mounting /sysroot...
Jan 21 22:45:01 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 21 22:45:01 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 21 22:45:01 localhost kernel: XFS (vda1): Ending clean mount
Jan 21 22:45:01 localhost systemd[1]: Mounted /sysroot.
Jan 21 22:45:01 localhost systemd[1]: Reached target Initrd Root File System.
Jan 21 22:45:01 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 21 22:45:01 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 21 22:45:01 localhost systemd[1]: Reached target Initrd File Systems.
Jan 21 22:45:01 localhost systemd[1]: Reached target Initrd Default Target.
Jan 21 22:45:01 localhost systemd[1]: Starting dracut mount hook...
Jan 21 22:45:01 localhost systemd[1]: Finished dracut mount hook.
Jan 21 22:45:01 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 21 22:45:01 localhost rpc.idmapd[447]: exiting on signal 15
Jan 21 22:45:01 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 21 22:45:01 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 21 22:45:01 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Network.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Timer Units.
Jan 21 22:45:01 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 21 22:45:01 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Basic System.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Path Units.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Remote File Systems.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Slice Units.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Socket Units.
Jan 21 22:45:01 localhost systemd[1]: Stopped target System Initialization.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Local File Systems.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Swaps.
Jan 21 22:45:01 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped dracut mount hook.
Jan 21 22:45:01 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 21 22:45:01 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 21 22:45:01 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 21 22:45:01 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 21 22:45:01 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 21 22:45:01 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 21 22:45:01 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 21 22:45:01 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 21 22:45:01 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 21 22:45:01 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 21 22:45:01 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 21 22:45:01 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Closed udev Control Socket.
Jan 21 22:45:01 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Closed udev Kernel Socket.
Jan 21 22:45:01 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 21 22:45:01 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 21 22:45:01 localhost systemd[1]: Starting Cleanup udev Database...
Jan 21 22:45:01 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 21 22:45:01 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 21 22:45:01 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Stopped Create System Users.
Jan 21 22:45:01 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 21 22:45:01 localhost systemd[1]: Finished Cleanup udev Database.
Jan 21 22:45:01 localhost systemd[1]: Reached target Switch Root.
Jan 21 22:45:01 localhost systemd[1]: Starting Switch Root...
Jan 21 22:45:01 localhost systemd[1]: Switching root.
Jan 21 22:45:01 localhost systemd-journald[307]: Journal stopped
Jan 21 22:45:02 localhost systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Jan 21 22:45:02 localhost kernel: audit: type=1404 audit(1769035501.881:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 21 22:45:02 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 22:45:02 localhost kernel: SELinux:  policy capability open_perms=1
Jan 21 22:45:02 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 22:45:02 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 21 22:45:02 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 22:45:02 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 22:45:02 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 22:45:02 localhost kernel: audit: type=1403 audit(1769035501.998:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 21 22:45:02 localhost systemd[1]: Successfully loaded SELinux policy in 119.819ms.
Jan 21 22:45:02 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.142ms.
Jan 21 22:45:02 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 21 22:45:02 localhost systemd[1]: Detected virtualization kvm.
Jan 21 22:45:02 localhost systemd[1]: Detected architecture x86-64.
Jan 21 22:45:02 localhost systemd-rc-local-generator[638]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 22:45:02 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 21 22:45:02 localhost systemd[1]: Stopped Switch Root.
Jan 21 22:45:02 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 21 22:45:02 localhost systemd[1]: Created slice Slice /system/getty.
Jan 21 22:45:02 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 21 22:45:02 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 21 22:45:02 localhost systemd[1]: Created slice User and Session Slice.
Jan 21 22:45:02 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 21 22:45:02 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 21 22:45:02 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 21 22:45:02 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 21 22:45:02 localhost systemd[1]: Stopped target Switch Root.
Jan 21 22:45:02 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 21 22:45:02 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 21 22:45:02 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 21 22:45:02 localhost systemd[1]: Reached target Path Units.
Jan 21 22:45:02 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 21 22:45:02 localhost systemd[1]: Reached target Slice Units.
Jan 21 22:45:02 localhost systemd[1]: Reached target Swaps.
Jan 21 22:45:02 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 21 22:45:02 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 21 22:45:02 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 21 22:45:02 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 21 22:45:02 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 21 22:45:02 localhost systemd[1]: Listening on udev Control Socket.
Jan 21 22:45:02 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 21 22:45:02 localhost systemd[1]: Mounting Huge Pages File System...
Jan 21 22:45:02 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 21 22:45:02 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 21 22:45:02 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 21 22:45:02 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 21 22:45:02 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 21 22:45:02 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 21 22:45:02 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 21 22:45:02 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 21 22:45:02 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 21 22:45:02 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 21 22:45:02 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 21 22:45:02 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 21 22:45:02 localhost systemd[1]: Stopped Journal Service.
Jan 21 22:45:02 localhost systemd[1]: Starting Journal Service...
Jan 21 22:45:02 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 21 22:45:02 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 21 22:45:02 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 22:45:02 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 21 22:45:02 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 21 22:45:02 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 21 22:45:02 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 21 22:45:02 localhost systemd-journald[679]: Journal started
Jan 21 22:45:02 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 21 22:45:02 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 21 22:45:02 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 21 22:45:02 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 21 22:45:02 localhost systemd[1]: Started Journal Service.
Jan 21 22:45:02 localhost kernel: fuse: init (API version 7.37)
Jan 21 22:45:02 localhost systemd[1]: Mounted Huge Pages File System.
Jan 21 22:45:02 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 21 22:45:02 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 21 22:45:02 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 21 22:45:02 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 21 22:45:02 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 22:45:02 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 21 22:45:02 localhost kernel: ACPI: bus type drm_connector registered
Jan 21 22:45:02 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 21 22:45:02 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 21 22:45:02 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 21 22:45:02 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 21 22:45:02 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 21 22:45:02 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 21 22:45:02 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 21 22:45:02 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 21 22:45:02 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 21 22:45:02 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 21 22:45:02 localhost systemd[1]: Mounting FUSE Control File System...
Jan 21 22:45:02 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 21 22:45:02 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 21 22:45:02 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 21 22:45:02 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 21 22:45:02 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 21 22:45:02 localhost systemd[1]: Starting Create System Users...
Jan 21 22:45:02 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 21 22:45:02 localhost systemd-journald[679]: Received client request to flush runtime journal.
Jan 21 22:45:02 localhost systemd[1]: Mounted FUSE Control File System.
Jan 21 22:45:02 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 21 22:45:02 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 21 22:45:02 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 21 22:45:02 localhost systemd[1]: Finished Create System Users.
Jan 21 22:45:02 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 21 22:45:02 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 21 22:45:02 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 21 22:45:02 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 21 22:45:02 localhost systemd[1]: Reached target Local File Systems.
Jan 21 22:45:02 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 21 22:45:02 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 21 22:45:02 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 21 22:45:02 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 21 22:45:02 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 21 22:45:02 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 21 22:45:02 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 21 22:45:02 localhost bootctl[696]: Couldn't find EFI system partition, skipping.
Jan 21 22:45:02 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 21 22:45:02 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 21 22:45:02 localhost systemd[1]: Starting Security Auditing Service...
Jan 21 22:45:02 localhost systemd[1]: Starting RPC Bind...
Jan 21 22:45:02 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 21 22:45:02 localhost auditd[702]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 21 22:45:02 localhost auditd[702]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 21 22:45:02 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 21 22:45:02 localhost systemd[1]: Started RPC Bind.
Jan 21 22:45:02 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 21 22:45:02 localhost augenrules[707]: /sbin/augenrules: No change
Jan 21 22:45:02 localhost augenrules[722]: No rules
Jan 21 22:45:02 localhost augenrules[722]: enabled 1
Jan 21 22:45:02 localhost augenrules[722]: failure 1
Jan 21 22:45:02 localhost augenrules[722]: pid 702
Jan 21 22:45:02 localhost augenrules[722]: rate_limit 0
Jan 21 22:45:02 localhost augenrules[722]: backlog_limit 8192
Jan 21 22:45:02 localhost augenrules[722]: lost 0
Jan 21 22:45:02 localhost augenrules[722]: backlog 0
Jan 21 22:45:02 localhost augenrules[722]: backlog_wait_time 60000
Jan 21 22:45:02 localhost augenrules[722]: backlog_wait_time_actual 0
Jan 21 22:45:02 localhost systemd[1]: Started Security Auditing Service.
Jan 21 22:45:02 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 21 22:45:02 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 21 22:45:03 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 21 22:45:03 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 21 22:45:03 localhost systemd[1]: Starting Update is Completed...
Jan 21 22:45:03 localhost systemd[1]: Finished Update is Completed.
Jan 21 22:45:03 localhost systemd-udevd[730]: Using default interface naming scheme 'rhel-9.0'.
Jan 21 22:45:03 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 21 22:45:03 localhost systemd[1]: Reached target System Initialization.
Jan 21 22:45:03 localhost systemd[1]: Started dnf makecache --timer.
Jan 21 22:45:03 localhost systemd[1]: Started Daily rotation of log files.
Jan 21 22:45:03 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 21 22:45:03 localhost systemd[1]: Reached target Timer Units.
Jan 21 22:45:03 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 21 22:45:03 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 21 22:45:03 localhost systemd[1]: Reached target Socket Units.
Jan 21 22:45:03 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 21 22:45:03 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 22:45:03 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 21 22:45:03 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 21 22:45:03 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 21 22:45:03 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 21 22:45:03 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 21 22:45:03 localhost systemd-udevd[732]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 22:45:03 localhost systemd[1]: Reached target Basic System.
Jan 21 22:45:03 localhost dbus-broker-lau[768]: Ready
Jan 21 22:45:03 localhost systemd[1]: Starting NTP client/server...
Jan 21 22:45:03 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 21 22:45:03 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 21 22:45:03 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 21 22:45:03 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 21 22:45:03 localhost systemd[1]: Started irqbalance daemon.
Jan 21 22:45:03 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 21 22:45:03 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 21 22:45:03 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 21 22:45:03 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 21 22:45:03 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 22:45:03 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 22:45:03 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 22:45:03 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 21 22:45:03 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 21 22:45:03 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 21 22:45:03 localhost systemd[1]: Starting User Login Management...
Jan 21 22:45:03 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 21 22:45:03 localhost chronyd[786]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 21 22:45:03 localhost chronyd[786]: Loaded 0 symmetric keys
Jan 21 22:45:03 localhost chronyd[786]: Using right/UTC timezone to obtain leap second data
Jan 21 22:45:03 localhost chronyd[786]: Loaded seccomp filter (level 2)
Jan 21 22:45:03 localhost systemd[1]: Started NTP client/server.
Jan 21 22:45:03 localhost systemd-logind[784]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 21 22:45:03 localhost systemd-logind[784]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 21 22:45:03 localhost systemd-logind[784]: New seat seat0.
Jan 21 22:45:03 localhost systemd[1]: Started User Login Management.
Jan 21 22:45:03 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 21 22:45:03 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 21 22:45:03 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 21 22:45:03 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 21 22:45:03 localhost kernel: Console: switching to colour dummy device 80x25
Jan 21 22:45:03 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 21 22:45:03 localhost kernel: [drm] features: -context_init
Jan 21 22:45:03 localhost kernel: [drm] number of scanouts: 1
Jan 21 22:45:03 localhost kernel: [drm] number of cap sets: 0
Jan 21 22:45:03 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 21 22:45:03 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 21 22:45:03 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 21 22:45:03 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 21 22:45:03 localhost kernel: kvm_amd: TSC scaling supported
Jan 21 22:45:03 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 21 22:45:03 localhost kernel: kvm_amd: Nested Paging enabled
Jan 21 22:45:03 localhost kernel: kvm_amd: LBR virtualization supported
Jan 21 22:45:03 localhost iptables.init[778]: iptables: Applying firewall rules: [  OK  ]
Jan 21 22:45:03 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 21 22:45:03 localhost cloud-init[840]: Cloud-init v. 24.4-8.el9 running 'init-local' at Wed, 21 Jan 2026 22:45:03 +0000. Up 6.59 seconds.
Jan 21 22:45:03 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 21 22:45:03 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 21 22:45:03 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpmyohisiy.mount: Deactivated successfully.
Jan 21 22:45:04 localhost systemd[1]: Starting Hostname Service...
Jan 21 22:45:04 localhost systemd[1]: Started Hostname Service.
Jan 21 22:45:04 np0005591283.novalocal systemd-hostnamed[854]: Hostname set to <np0005591283.novalocal> (static)
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Reached target Preparation for Network.
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Starting Network Manager...
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3296] NetworkManager (version 1.54.3-2.el9) is starting... (boot:a457267e-aaec-4d55-ae32-e78b7b5bf63f)
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3299] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3364] manager[0x55e2621e8000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3417] hostname: hostname: using hostnamed
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3417] hostname: static hostname changed from (none) to "np0005591283.novalocal"
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3421] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3615] manager[0x55e2621e8000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3616] manager[0x55e2621e8000]: rfkill: WWAN hardware radio set enabled
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3654] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3656] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3656] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3657] manager: Networking is enabled by state file
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3658] settings: Loaded settings plugin: keyfile (internal)
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3667] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3686] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3696] dhcp: init: Using DHCP client 'internal'
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3699] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3711] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3717] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3725] device (lo): Activation: starting connection 'lo' (2e99d35a-31da-47ef-9f44-21c4df97b7a3)
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3733] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3737] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3764] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3768] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3770] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3772] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3774] device (eth0): carrier: link connected
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3778] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3783] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3789] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3794] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3795] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3797] manager: NetworkManager state is now CONNECTING
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3799] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3805] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.3807] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Started Network Manager.
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Reached target Network.
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.4100] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.4103] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.4110] device (lo): Activation: successful, device activated.
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Reached target NFS client services.
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Reached target Remote File Systems.
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.5915] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.5928] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.5947] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.5988] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.5989] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.5991] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.5993] device (eth0): Activation: successful, device activated.
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.5996] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 22:45:04 np0005591283.novalocal NetworkManager[858]: <info>  [1769035504.6001] manager: startup complete
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 21 22:45:04 np0005591283.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: Cloud-init v. 24.4-8.el9 running 'init' at Wed, 21 Jan 2026 22:45:05 +0000. Up 7.87 seconds.
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: ++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: | Device |  Up  |           Address           |      Mask     | Scope  |     Hw-Address    |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.204        | 255.255.255.0 | global | fa:16:3e:6b:0d:fb |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fe6b:dfb/64 |       .       |  link  | fa:16:3e:6b:0d:fb |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1          |   255.0.0.0   |  host  |         .         |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: |   lo   | True |           ::1/128           |       .       |  host  |         .         |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: +--------+------+-----------------------------+---------------+--------+-------------------+
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 21 22:45:05 np0005591283.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 21 22:45:05 np0005591283.novalocal useradd[988]: new group: name=cloud-user, GID=1001
Jan 21 22:45:05 np0005591283.novalocal useradd[988]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 21 22:45:05 np0005591283.novalocal useradd[988]: add 'cloud-user' to group 'adm'
Jan 21 22:45:05 np0005591283.novalocal useradd[988]: add 'cloud-user' to group 'systemd-journal'
Jan 21 22:45:05 np0005591283.novalocal useradd[988]: add 'cloud-user' to shadow group 'adm'
Jan 21 22:45:05 np0005591283.novalocal useradd[988]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: Generating public/private rsa key pair.
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: The key fingerprint is:
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: SHA256:ft2Nv57KAoFaSGE9iowJhStc1e+E/AhcgWjvms1DJWM root@np0005591283.novalocal
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: The key's randomart image is:
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: +---[RSA 3072]----+
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: | o. o+=..        |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |o  +.o =         |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |o.* = = =        |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |o+ o E * +       |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |.   o B S .      |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |     + o + . . o |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |    *   . o . o .|
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |   o +   . ..  ..|
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |      .     .oo+o|
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: +----[SHA256]-----+
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: Generating public/private ecdsa key pair.
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: The key fingerprint is:
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: SHA256:mqckKcbriAtphDyHnp/vK0iAINNlYU+bN/KnGU9kGNk root@np0005591283.novalocal
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: The key's randomart image is:
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: +---[ECDSA 256]---+
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: | . .=.. .o       |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |+ .o o o.oE      |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |+.    = + o      |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |= .    + +       |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |o= .    S o      |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |o++  . o B       |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |++= o + + .      |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |++.+.o o         |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |+ooo++o          |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: +----[SHA256]-----+
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: Generating public/private ed25519 key pair.
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: The key fingerprint is:
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: SHA256:m+wg/ZIk9Q1lHfOQ6+1djlk9I/izM83Rsy1nsg7myak root@np0005591283.novalocal
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: The key's randomart image is:
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: +--[ED25519 256]--+
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |           .+o   |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |          o o+   |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |         o   ..  |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |      . .   .    |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |     . .So ...  o|
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |    .....o.....==|
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |    .oo.+   +.+*O|
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |     .o+   + O*+*|
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: |       .o E.=oB* |
Jan 21 22:45:06 np0005591283.novalocal cloud-init[922]: +----[SHA256]-----+
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Reached target Network is Online.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Starting System Logging Service...
Jan 21 22:45:06 np0005591283.novalocal sm-notify[1004]: Version 2.5.4 starting
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Starting Permit User Sessions...
Jan 21 22:45:06 np0005591283.novalocal sshd[1006]: Server listening on 0.0.0.0 port 22.
Jan 21 22:45:06 np0005591283.novalocal sshd[1006]: Server listening on :: port 22.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Finished Permit User Sessions.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Started Command Scheduler.
Jan 21 22:45:06 np0005591283.novalocal crond[1009]: (CRON) STARTUP (1.5.7)
Jan 21 22:45:06 np0005591283.novalocal crond[1009]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Started Getty on tty1.
Jan 21 22:45:06 np0005591283.novalocal crond[1009]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 8% if used.)
Jan 21 22:45:06 np0005591283.novalocal crond[1009]: (CRON) INFO (running with inotify support)
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 21 22:45:06 np0005591283.novalocal rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Jan 21 22:45:06 np0005591283.novalocal rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Reached target Login Prompts.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Started System Logging Service.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Reached target Multi-User System.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 21 22:45:06 np0005591283.novalocal rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 22:45:06 np0005591283.novalocal kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Jan 21 22:45:06 np0005591283.novalocal kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 21 22:45:06 np0005591283.novalocal cloud-init[1132]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Wed, 21 Jan 2026 22:45:06 +0000. Up 9.45 seconds.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 21 22:45:06 np0005591283.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 21 22:45:06 np0005591283.novalocal dracut[1265]: dracut-057-102.git20250818.el9
Jan 21 22:45:07 np0005591283.novalocal cloud-init[1283]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Wed, 21 Jan 2026 22:45:07 +0000. Up 9.84 seconds.
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 21 22:45:07 np0005591283.novalocal cloud-init[1297]: #############################################################
Jan 21 22:45:07 np0005591283.novalocal cloud-init[1302]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 21 22:45:07 np0005591283.novalocal cloud-init[1309]: 256 SHA256:mqckKcbriAtphDyHnp/vK0iAINNlYU+bN/KnGU9kGNk root@np0005591283.novalocal (ECDSA)
Jan 21 22:45:07 np0005591283.novalocal cloud-init[1317]: 256 SHA256:m+wg/ZIk9Q1lHfOQ6+1djlk9I/izM83Rsy1nsg7myak root@np0005591283.novalocal (ED25519)
Jan 21 22:45:07 np0005591283.novalocal cloud-init[1325]: 3072 SHA256:ft2Nv57KAoFaSGE9iowJhStc1e+E/AhcgWjvms1DJWM root@np0005591283.novalocal (RSA)
Jan 21 22:45:07 np0005591283.novalocal cloud-init[1330]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 21 22:45:07 np0005591283.novalocal cloud-init[1337]: #############################################################
Jan 21 22:45:07 np0005591283.novalocal cloud-init[1283]: Cloud-init v. 24.4-8.el9 finished at Wed, 21 Jan 2026 22:45:07 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.06 seconds
Jan 21 22:45:07 np0005591283.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 21 22:45:07 np0005591283.novalocal systemd[1]: Reached target Cloud-init target.
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 21 22:45:07 np0005591283.novalocal dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: memstrack is not available
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: memstrack is not available
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 21 22:45:08 np0005591283.novalocal sshd-session[1847]: Unable to negotiate with 38.102.83.114 port 55404: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 21 22:45:08 np0005591283.novalocal sshd-session[1857]: Connection reset by 38.102.83.114 port 55410 [preauth]
Jan 21 22:45:08 np0005591283.novalocal sshd-session[1865]: Unable to negotiate with 38.102.83.114 port 55420: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: *** Including module: systemd ***
Jan 21 22:45:08 np0005591283.novalocal sshd-session[1881]: Unable to negotiate with 38.102.83.114 port 55432: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 21 22:45:08 np0005591283.novalocal sshd-session[1838]: Connection closed by 38.102.83.114 port 55402 [preauth]
Jan 21 22:45:08 np0005591283.novalocal sshd-session[1886]: Connection closed by 38.102.83.114 port 55440 [preauth]
Jan 21 22:45:08 np0005591283.novalocal sshd-session[1904]: Unable to negotiate with 38.102.83.114 port 55460: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 21 22:45:08 np0005591283.novalocal sshd-session[1915]: Unable to negotiate with 38.102.83.114 port 55468: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 21 22:45:08 np0005591283.novalocal sshd-session[1894]: Connection closed by 38.102.83.114 port 55452 [preauth]
Jan 21 22:45:08 np0005591283.novalocal dracut[1267]: *** Including module: fips ***
Jan 21 22:45:09 np0005591283.novalocal dracut[1267]: *** Including module: systemd-initrd ***
Jan 21 22:45:09 np0005591283.novalocal dracut[1267]: *** Including module: i18n ***
Jan 21 22:45:09 np0005591283.novalocal dracut[1267]: *** Including module: drm ***
Jan 21 22:45:09 np0005591283.novalocal chronyd[786]: Selected source 167.160.187.179 (2.centos.pool.ntp.org)
Jan 21 22:45:09 np0005591283.novalocal chronyd[786]: System clock TAI offset set to 37 seconds
Jan 21 22:45:09 np0005591283.novalocal dracut[1267]: *** Including module: prefixdevname ***
Jan 21 22:45:09 np0005591283.novalocal dracut[1267]: *** Including module: kernel-modules ***
Jan 21 22:45:10 np0005591283.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]: *** Including module: kernel-modules-extra ***
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]: *** Including module: qemu ***
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]: *** Including module: fstab-sys ***
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]: *** Including module: rootfs-block ***
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]: *** Including module: terminfo ***
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]: *** Including module: udev-rules ***
Jan 21 22:45:10 np0005591283.novalocal chronyd[786]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]: Skipping udev rule: 91-permissions.rules
Jan 21 22:45:10 np0005591283.novalocal dracut[1267]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]: *** Including module: virtiofs ***
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]: *** Including module: dracut-systemd ***
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]: *** Including module: usrmount ***
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]: *** Including module: base ***
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]: *** Including module: fs-lib ***
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]: *** Including module: kdumpbase ***
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:   microcode_ctl module: mangling fw_dir
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: configuration "intel" is ignored
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 21 22:45:11 np0005591283.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]: *** Including module: openssl ***
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]: *** Including module: shutdown ***
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]: *** Including module: squash ***
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]: *** Including modules done ***
Jan 21 22:45:12 np0005591283.novalocal dracut[1267]: *** Installing kernel module dependencies ***
Jan 21 22:45:13 np0005591283.novalocal dracut[1267]: *** Installing kernel module dependencies done ***
Jan 21 22:45:13 np0005591283.novalocal dracut[1267]: *** Resolving executable dependencies ***
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: IRQ 35 affinity is now unmanaged
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: IRQ 33 affinity is now unmanaged
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: IRQ 31 affinity is now unmanaged
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: IRQ 28 affinity is now unmanaged
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: IRQ 34 affinity is now unmanaged
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: IRQ 32 affinity is now unmanaged
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: IRQ 30 affinity is now unmanaged
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 21 22:45:13 np0005591283.novalocal irqbalance[779]: IRQ 29 affinity is now unmanaged
Jan 21 22:45:14 np0005591283.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 22:45:14 np0005591283.novalocal dracut[1267]: *** Resolving executable dependencies done ***
Jan 21 22:45:14 np0005591283.novalocal dracut[1267]: *** Generating early-microcode cpio image ***
Jan 21 22:45:14 np0005591283.novalocal dracut[1267]: *** Store current command line parameters ***
Jan 21 22:45:14 np0005591283.novalocal dracut[1267]: Stored kernel commandline:
Jan 21 22:45:14 np0005591283.novalocal dracut[1267]: No dracut internal kernel commandline stored in the initramfs
Jan 21 22:45:14 np0005591283.novalocal dracut[1267]: *** Install squash loader ***
Jan 21 22:45:15 np0005591283.novalocal dracut[1267]: *** Squashing the files inside the initramfs ***
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: *** Squashing the files inside the initramfs done ***
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: *** Hardlinking files ***
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: Mode:           real
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: Files:          50
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: Linked:         0 files
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: Compared:       0 xattrs
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: Compared:       0 files
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: Saved:          0 B
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: Duration:       0.000447 seconds
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: *** Hardlinking files done ***
Jan 21 22:45:16 np0005591283.novalocal dracut[1267]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 21 22:45:17 np0005591283.novalocal kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Jan 21 22:45:17 np0005591283.novalocal kdumpctl[1018]: kdump: Starting kdump: [OK]
Jan 21 22:45:17 np0005591283.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 21 22:45:17 np0005591283.novalocal systemd[1]: Startup finished in 1.830s (kernel) + 2.843s (initrd) + 15.502s (userspace) = 20.176s.
Jan 21 22:45:23 np0005591283.novalocal sshd-session[4302]: Accepted publickey for zuul from 38.102.83.114 port 55814 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 21 22:45:23 np0005591283.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 21 22:45:23 np0005591283.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 21 22:45:23 np0005591283.novalocal systemd-logind[784]: New session 1 of user zuul.
Jan 21 22:45:23 np0005591283.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 21 22:45:23 np0005591283.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 21 22:45:23 np0005591283.novalocal systemd[4306]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:45:23 np0005591283.novalocal systemd[4306]: Queued start job for default target Main User Target.
Jan 21 22:45:23 np0005591283.novalocal systemd[4306]: Created slice User Application Slice.
Jan 21 22:45:23 np0005591283.novalocal systemd[4306]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 22:45:23 np0005591283.novalocal systemd[4306]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 22:45:23 np0005591283.novalocal systemd[4306]: Reached target Paths.
Jan 21 22:45:23 np0005591283.novalocal systemd[4306]: Reached target Timers.
Jan 21 22:45:23 np0005591283.novalocal systemd[4306]: Starting D-Bus User Message Bus Socket...
Jan 21 22:45:24 np0005591283.novalocal systemd[4306]: Starting Create User's Volatile Files and Directories...
Jan 21 22:45:24 np0005591283.novalocal systemd[4306]: Finished Create User's Volatile Files and Directories.
Jan 21 22:45:24 np0005591283.novalocal systemd[4306]: Listening on D-Bus User Message Bus Socket.
Jan 21 22:45:24 np0005591283.novalocal systemd[4306]: Reached target Sockets.
Jan 21 22:45:24 np0005591283.novalocal systemd[4306]: Reached target Basic System.
Jan 21 22:45:24 np0005591283.novalocal systemd[4306]: Reached target Main User Target.
Jan 21 22:45:24 np0005591283.novalocal systemd[4306]: Startup finished in 111ms.
Jan 21 22:45:24 np0005591283.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 21 22:45:24 np0005591283.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 21 22:45:24 np0005591283.novalocal sshd-session[4302]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:45:24 np0005591283.novalocal python3[4389]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 22:45:30 np0005591283.novalocal sshd-session[4394]: Invalid user postgres from 188.166.69.60 port 50448
Jan 21 22:45:31 np0005591283.novalocal sshd-session[4394]: Connection closed by invalid user postgres 188.166.69.60 port 50448 [preauth]
Jan 21 22:45:33 np0005591283.novalocal python3[4419]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 22:45:34 np0005591283.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 22:45:42 np0005591283.novalocal python3[4479]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 22:45:43 np0005591283.novalocal python3[4519]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 21 22:45:45 np0005591283.novalocal python3[4545]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2e4yttZDYBqdG8LApHzgUrKnJJhokPjy46000EGKrecg+C8A4mLQflJ0D/xvugtt/H91C3VfRJbQOPQ7hZmStaqICNoXl/C8gc+eNroWZE+yY/wlWIxUH08XS6asYrTpDpg5UmpvUaYUK+3UMHnBY7Ito24+Jty+rd2YwCphABstuMfb1NJAx6Jml5CgCMob2n9WNcySPRTJ7JEA45egnysW3zGHGsS6qA8z8KP4tsp0oqBu1cfczB2RxnOXPhXZSJcS+3lww8bkb/wmQh1+Ho5qQEILiO5sxZGE4T9giN9XH2aveWWK0ttofy63F0tFxrl4uVBOtPYvY+GFt+GJuAwQK/wFmObp8yFqj8YU0HrxwXaVGLO6bfltMq8+k+/sDcwLSVGsCR6kw70L44MXX4znyZuRO7aEx+rAOMmL9ZfrVMgF7BEKlJG7ZldriZuFA1dpyF07UOpUN5wDaKC0EUC9s9ANBhs/JzmSBbA66LTl3G+2zXPfjQLBU99msPhs= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:45:45 np0005591283.novalocal python3[4569]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:46 np0005591283.novalocal python3[4668]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:45:46 np0005591283.novalocal python3[4739]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769035545.9108999-251-3121361348898/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=3005863ccc544e3a9d90dbd38e9aa500_id_rsa follow=False checksum=232cfc4771d49d01feffe7bca174ec959890bb55 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:47 np0005591283.novalocal python3[4862]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:45:47 np0005591283.novalocal python3[4933]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769035547.0097063-306-111446575606117/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=3005863ccc544e3a9d90dbd38e9aa500_id_rsa.pub follow=False checksum=0a660c0f8e508883780892e7228376ef7bc415eb backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:49 np0005591283.novalocal python3[4981]: ansible-ping Invoked with data=pong
Jan 21 22:45:50 np0005591283.novalocal python3[5005]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 22:45:52 np0005591283.novalocal python3[5063]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 21 22:45:53 np0005591283.novalocal python3[5095]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:54 np0005591283.novalocal python3[5119]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:54 np0005591283.novalocal python3[5143]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:54 np0005591283.novalocal python3[5167]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:54 np0005591283.novalocal python3[5191]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:55 np0005591283.novalocal python3[5215]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:56 np0005591283.novalocal sudo[5239]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guegbpyfpkfnfdecarqwhurvarhiasmx ; /usr/bin/python3'
Jan 21 22:45:56 np0005591283.novalocal sudo[5239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:45:56 np0005591283.novalocal python3[5241]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:56 np0005591283.novalocal sudo[5239]: pam_unix(sudo:session): session closed for user root
Jan 21 22:45:57 np0005591283.novalocal sudo[5317]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqtbdzwvtnmzepkqihgjbimrjnyrzvic ; /usr/bin/python3'
Jan 21 22:45:57 np0005591283.novalocal sudo[5317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:45:57 np0005591283.novalocal python3[5319]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:45:57 np0005591283.novalocal sudo[5317]: pam_unix(sudo:session): session closed for user root
Jan 21 22:45:57 np0005591283.novalocal sudo[5390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epqkqzqlaorqeljygjyyzvlcvbkndkih ; /usr/bin/python3'
Jan 21 22:45:57 np0005591283.novalocal sudo[5390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:45:58 np0005591283.novalocal python3[5392]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769035557.056456-31-52395079042554/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:45:58 np0005591283.novalocal sudo[5390]: pam_unix(sudo:session): session closed for user root
Jan 21 22:45:58 np0005591283.novalocal python3[5440]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:45:58 np0005591283.novalocal python3[5464]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:45:59 np0005591283.novalocal python3[5488]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:45:59 np0005591283.novalocal python3[5512]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:45:59 np0005591283.novalocal python3[5536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:00 np0005591283.novalocal python3[5560]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:00 np0005591283.novalocal python3[5584]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:00 np0005591283.novalocal python3[5608]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:00 np0005591283.novalocal python3[5632]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:01 np0005591283.novalocal python3[5656]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:01 np0005591283.novalocal python3[5680]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:01 np0005591283.novalocal python3[5704]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:01 np0005591283.novalocal python3[5728]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:02 np0005591283.novalocal python3[5752]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:02 np0005591283.novalocal python3[5776]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:02 np0005591283.novalocal python3[5800]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:02 np0005591283.novalocal python3[5824]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:03 np0005591283.novalocal python3[5848]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:03 np0005591283.novalocal python3[5872]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:03 np0005591283.novalocal python3[5896]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:04 np0005591283.novalocal python3[5920]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:04 np0005591283.novalocal python3[5944]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:04 np0005591283.novalocal python3[5968]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:04 np0005591283.novalocal python3[5992]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:05 np0005591283.novalocal python3[6016]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:05 np0005591283.novalocal python3[6040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:46:08 np0005591283.novalocal sudo[6064]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdsbhbyhxeibpimwxkcoevvropfwbciz ; /usr/bin/python3'
Jan 21 22:46:08 np0005591283.novalocal sudo[6064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:08 np0005591283.novalocal python3[6066]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 21 22:46:08 np0005591283.novalocal systemd[1]: Starting Time & Date Service...
Jan 21 22:46:08 np0005591283.novalocal systemd[1]: Started Time & Date Service.
Jan 21 22:46:08 np0005591283.novalocal systemd-timedated[6068]: Changed time zone to 'UTC' (UTC).
Jan 21 22:46:08 np0005591283.novalocal sudo[6064]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:08 np0005591283.novalocal sudo[6095]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnzqszxezvgjrldgcevhyutehkzwqwpo ; /usr/bin/python3'
Jan 21 22:46:08 np0005591283.novalocal sudo[6095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:08 np0005591283.novalocal python3[6097]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:08 np0005591283.novalocal sudo[6095]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:09 np0005591283.novalocal python3[6173]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:46:09 np0005591283.novalocal python3[6244]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769035569.199093-251-142491036117334/source _original_basename=tmp536e2v5g follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:10 np0005591283.novalocal python3[6344]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:46:10 np0005591283.novalocal python3[6415]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769035570.0862827-301-78139146040812/source _original_basename=tmp0g_s9ep3 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:11 np0005591283.novalocal sudo[6515]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmzxlxnjyfbemqjuvlagcoybbhuqxngv ; /usr/bin/python3'
Jan 21 22:46:11 np0005591283.novalocal sudo[6515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:11 np0005591283.novalocal python3[6517]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:46:11 np0005591283.novalocal sudo[6515]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:11 np0005591283.novalocal sudo[6588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luueehljrkcqqcvkyjpcgpxoogkkxlld ; /usr/bin/python3'
Jan 21 22:46:11 np0005591283.novalocal sudo[6588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:12 np0005591283.novalocal python3[6590]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769035571.4308918-381-24422587679461/source _original_basename=tmp3qxu18g0 follow=False checksum=01954034105cdb65b42722894a5c1036808c70c7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:12 np0005591283.novalocal sudo[6588]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:12 np0005591283.novalocal python3[6638]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:46:13 np0005591283.novalocal python3[6664]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:46:13 np0005591283.novalocal sudo[6742]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lprlqxiezmheahpkvyekujxecimiawlb ; /usr/bin/python3'
Jan 21 22:46:13 np0005591283.novalocal sudo[6742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:13 np0005591283.novalocal python3[6744]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:46:13 np0005591283.novalocal sudo[6742]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:13 np0005591283.novalocal sudo[6815]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piqpzjhsvdgwzjwrmynfhizjfrfjzabk ; /usr/bin/python3'
Jan 21 22:46:13 np0005591283.novalocal sudo[6815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:13 np0005591283.novalocal python3[6817]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769035573.2241967-451-241072195944205/source _original_basename=tmp4myg53i5 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:13 np0005591283.novalocal sudo[6815]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:14 np0005591283.novalocal sudo[6866]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaiypeleqxjpwxsycbfcidekgsoysmgc ; /usr/bin/python3'
Jan 21 22:46:14 np0005591283.novalocal sudo[6866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:14 np0005591283.novalocal python3[6868]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-2a6d-faa9-00000000001f-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:46:14 np0005591283.novalocal sudo[6866]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:15 np0005591283.novalocal python3[6896]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163ec2-ffbe-2a6d-faa9-000000000020-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 21 22:46:16 np0005591283.novalocal sshd-session[6901]: Invalid user postgres from 188.166.69.60 port 48890
Jan 21 22:46:16 np0005591283.novalocal sshd-session[6901]: Connection closed by invalid user postgres 188.166.69.60 port 48890 [preauth]
Jan 21 22:46:16 np0005591283.novalocal python3[6926]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:26 np0005591283.novalocal sshd-session[6927]: Invalid user ubuntu from 38.67.240.124 port 61428
Jan 21 22:46:26 np0005591283.novalocal sshd-session[6927]: Received disconnect from 38.67.240.124 port 61428:11:  [preauth]
Jan 21 22:46:26 np0005591283.novalocal sshd-session[6927]: Disconnected from invalid user ubuntu 38.67.240.124 port 61428 [preauth]
Jan 21 22:46:37 np0005591283.novalocal sudo[6952]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytzhdjkftrsptixktrnpqaolhzrcqrwr ; /usr/bin/python3'
Jan 21 22:46:37 np0005591283.novalocal sudo[6952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:46:37 np0005591283.novalocal python3[6954]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:46:37 np0005591283.novalocal sudo[6952]: pam_unix(sudo:session): session closed for user root
Jan 21 22:46:38 np0005591283.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 21 22:47:01 np0005591283.novalocal sshd-session[6957]: Invalid user postgres from 188.166.69.60 port 37780
Jan 21 22:47:01 np0005591283.novalocal sshd-session[6957]: Connection closed by invalid user postgres 188.166.69.60 port 37780 [preauth]
Jan 21 22:47:18 np0005591283.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 21 22:47:18 np0005591283.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 21 22:47:18 np0005591283.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 21 22:47:18 np0005591283.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 21 22:47:18 np0005591283.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 21 22:47:18 np0005591283.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 21 22:47:18 np0005591283.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 21 22:47:18 np0005591283.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 21 22:47:18 np0005591283.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 21 22:47:18 np0005591283.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 21 22:47:18 np0005591283.novalocal NetworkManager[858]: <info>  [1769035638.9867] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 22:47:18 np0005591283.novalocal systemd-udevd[6962]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 22:47:19 np0005591283.novalocal NetworkManager[858]: <info>  [1769035639.0012] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 22:47:19 np0005591283.novalocal NetworkManager[858]: <info>  [1769035639.0034] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 21 22:47:19 np0005591283.novalocal NetworkManager[858]: <info>  [1769035639.0036] device (eth1): carrier: link connected
Jan 21 22:47:19 np0005591283.novalocal NetworkManager[858]: <info>  [1769035639.0038] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 21 22:47:19 np0005591283.novalocal NetworkManager[858]: <info>  [1769035639.0042] policy: auto-activating connection 'Wired connection 1' (ec059f52-b25e-31f6-9b9f-6e854f0ee9a8)
Jan 21 22:47:19 np0005591283.novalocal NetworkManager[858]: <info>  [1769035639.0046] device (eth1): Activation: starting connection 'Wired connection 1' (ec059f52-b25e-31f6-9b9f-6e854f0ee9a8)
Jan 21 22:47:19 np0005591283.novalocal NetworkManager[858]: <info>  [1769035639.0046] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 22:47:19 np0005591283.novalocal NetworkManager[858]: <info>  [1769035639.0049] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 22:47:19 np0005591283.novalocal NetworkManager[858]: <info>  [1769035639.0051] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 22:47:19 np0005591283.novalocal NetworkManager[858]: <info>  [1769035639.0055] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:47:19 np0005591283.novalocal sshd-session[6959]: Received disconnect from 45.148.10.151 port 18680:11:  [preauth]
Jan 21 22:47:19 np0005591283.novalocal sshd-session[6959]: Disconnected from authenticating user root 45.148.10.151 port 18680 [preauth]
Jan 21 22:47:19 np0005591283.novalocal python3[6988]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-4168-cfbf-000000000128-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:47:26 np0005591283.novalocal sudo[7066]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwyrwokosyrfyjbaxuhfbovtyufhvpku ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 22:47:26 np0005591283.novalocal sudo[7066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:47:26 np0005591283.novalocal python3[7068]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:47:26 np0005591283.novalocal sudo[7066]: pam_unix(sudo:session): session closed for user root
Jan 21 22:47:27 np0005591283.novalocal sudo[7139]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aynaedrnhfureeuczcqurhbcodvccrlm ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 22:47:27 np0005591283.novalocal sudo[7139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:47:27 np0005591283.novalocal python3[7141]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769035646.5816994-104-254798281154903/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=428e9c58cff9141b732d3093ebf93d2797ebe319 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:47:27 np0005591283.novalocal sudo[7139]: pam_unix(sudo:session): session closed for user root
Jan 21 22:47:27 np0005591283.novalocal sudo[7189]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rycpkplmkpamaylcvpunxikgznkjklwj ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 22:47:27 np0005591283.novalocal sudo[7189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:47:27 np0005591283.novalocal python3[7191]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 22:47:27 np0005591283.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 21 22:47:27 np0005591283.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 21 22:47:27 np0005591283.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 21 22:47:27 np0005591283.novalocal systemd[1]: Stopping Network Manager...
Jan 21 22:47:27 np0005591283.novalocal NetworkManager[858]: <info>  [1769035647.9299] caught SIGTERM, shutting down normally.
Jan 21 22:47:27 np0005591283.novalocal NetworkManager[858]: <info>  [1769035647.9310] dhcp4 (eth0): canceled DHCP transaction
Jan 21 22:47:27 np0005591283.novalocal NetworkManager[858]: <info>  [1769035647.9311] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:47:27 np0005591283.novalocal NetworkManager[858]: <info>  [1769035647.9311] dhcp4 (eth0): state changed no lease
Jan 21 22:47:27 np0005591283.novalocal NetworkManager[858]: <info>  [1769035647.9315] manager: NetworkManager state is now CONNECTING
Jan 21 22:47:27 np0005591283.novalocal NetworkManager[858]: <info>  [1769035647.9489] dhcp4 (eth1): canceled DHCP transaction
Jan 21 22:47:27 np0005591283.novalocal NetworkManager[858]: <info>  [1769035647.9490] dhcp4 (eth1): state changed no lease
Jan 21 22:47:27 np0005591283.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 22:47:27 np0005591283.novalocal NetworkManager[858]: <info>  [1769035647.9559] exiting (success)
Jan 21 22:47:27 np0005591283.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 22:47:27 np0005591283.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 21 22:47:27 np0005591283.novalocal systemd[1]: Stopped Network Manager.
Jan 21 22:47:27 np0005591283.novalocal systemd[1]: Starting Network Manager...
Jan 21 22:47:27 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035647.9950] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:a457267e-aaec-4d55-ae32-e78b7b5bf63f)
Jan 21 22:47:27 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035647.9952] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0001] manager[0x560e272a8000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 22:47:28 np0005591283.novalocal systemd[1]: Starting Hostname Service...
Jan 21 22:47:28 np0005591283.novalocal systemd[1]: Started Hostname Service.
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0796] hostname: hostname: using hostnamed
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0797] hostname: static hostname changed from (none) to "np0005591283.novalocal"
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0803] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0808] manager[0x560e272a8000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0809] manager[0x560e272a8000]: rfkill: WWAN hardware radio set enabled
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0840] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0840] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0841] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0841] manager: Networking is enabled by state file
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0844] settings: Loaded settings plugin: keyfile (internal)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0848] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0876] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0885] dhcp: init: Using DHCP client 'internal'
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0888] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0895] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0900] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0910] device (lo): Activation: starting connection 'lo' (2e99d35a-31da-47ef-9f44-21c4df97b7a3)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0917] device (eth0): carrier: link connected
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0922] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0927] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0927] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0934] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0942] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0949] device (eth1): carrier: link connected
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0953] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0959] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (ec059f52-b25e-31f6-9b9f-6e854f0ee9a8) (indicated)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0959] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0964] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0971] device (eth1): Activation: starting connection 'Wired connection 1' (ec059f52-b25e-31f6-9b9f-6e854f0ee9a8)
Jan 21 22:47:28 np0005591283.novalocal systemd[1]: Started Network Manager.
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0978] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0982] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0985] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0987] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0992] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.0996] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1000] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1002] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1006] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1013] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1016] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1026] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1029] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1046] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1052] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1057] device (lo): Activation: successful, device activated.
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1065] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1072] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1131] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1160] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1162] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1165] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1167] device (eth0): Activation: successful, device activated.
Jan 21 22:47:28 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035648.1173] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 22:47:28 np0005591283.novalocal sudo[7189]: pam_unix(sudo:session): session closed for user root
Jan 21 22:47:28 np0005591283.novalocal python3[7275]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-4168-cfbf-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:47:38 np0005591283.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 22:47:46 np0005591283.novalocal sshd-session[7278]: Invalid user postgres from 188.166.69.60 port 55592
Jan 21 22:47:46 np0005591283.novalocal sshd-session[7278]: Connection closed by invalid user postgres 188.166.69.60 port 55592 [preauth]
Jan 21 22:47:58 np0005591283.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2060] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 22:48:13 np0005591283.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 22:48:13 np0005591283.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2361] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2367] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2376] device (eth1): Activation: successful, device activated.
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2384] manager: startup complete
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2387] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <warn>  [1769035693.2394] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2414] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 21 22:48:13 np0005591283.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2508] dhcp4 (eth1): canceled DHCP transaction
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2508] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2509] dhcp4 (eth1): state changed no lease
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2521] policy: auto-activating connection 'ci-private-network' (ca895a5d-b6dc-5a65-bf50-75a15530a096)
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2524] device (eth1): Activation: starting connection 'ci-private-network' (ca895a5d-b6dc-5a65-bf50-75a15530a096)
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2524] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2526] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2530] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2536] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2569] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2570] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 22:48:13 np0005591283.novalocal NetworkManager[7199]: <info>  [1769035693.2574] device (eth1): Activation: successful, device activated.
Jan 21 22:48:21 np0005591283.novalocal systemd[4306]: Starting Mark boot as successful...
Jan 21 22:48:21 np0005591283.novalocal systemd[4306]: Finished Mark boot as successful.
Jan 21 22:48:23 np0005591283.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 22:48:28 np0005591283.novalocal sshd-session[4316]: Received disconnect from 38.102.83.114 port 55814:11: disconnected by user
Jan 21 22:48:28 np0005591283.novalocal sshd-session[4316]: Disconnected from user zuul 38.102.83.114 port 55814
Jan 21 22:48:28 np0005591283.novalocal sshd-session[4302]: pam_unix(sshd:session): session closed for user zuul
Jan 21 22:48:28 np0005591283.novalocal systemd-logind[784]: Session 1 logged out. Waiting for processes to exit.
Jan 21 22:48:32 np0005591283.novalocal sshd-session[7306]: Invalid user postgres from 188.166.69.60 port 39290
Jan 21 22:48:32 np0005591283.novalocal sshd-session[7306]: Connection closed by invalid user postgres 188.166.69.60 port 39290 [preauth]
Jan 21 22:49:13 np0005591283.novalocal sshd-session[7308]: Accepted publickey for zuul from 38.102.83.114 port 51070 ssh2: RSA SHA256:LSN8GeK+nwfQgAdzsG9Fx0/CGGktcUeOM8rFlOBs7zo
Jan 21 22:49:13 np0005591283.novalocal systemd-logind[784]: New session 3 of user zuul.
Jan 21 22:49:13 np0005591283.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 21 22:49:13 np0005591283.novalocal sshd-session[7308]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:49:14 np0005591283.novalocal sudo[7387]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flpcwchszzvfickctdrsrluipvprbwsk ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 22:49:14 np0005591283.novalocal sudo[7387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:49:14 np0005591283.novalocal python3[7389]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:49:14 np0005591283.novalocal sudo[7387]: pam_unix(sudo:session): session closed for user root
Jan 21 22:49:14 np0005591283.novalocal sudo[7460]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iahjbquznbkybvubkdhxawuctotwosqe ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 21 22:49:14 np0005591283.novalocal sudo[7460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:49:14 np0005591283.novalocal python3[7462]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769035753.892338-365-26381892177619/source _original_basename=tmp44b7xbi0 follow=False checksum=9be2ac127257c76b31f8acdef7104cc3c2481547 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:49:14 np0005591283.novalocal sudo[7460]: pam_unix(sudo:session): session closed for user root
Jan 21 22:49:18 np0005591283.novalocal sshd-session[7487]: Invalid user postgres from 188.166.69.60 port 41370
Jan 21 22:49:18 np0005591283.novalocal sshd-session[7487]: Connection closed by invalid user postgres 188.166.69.60 port 41370 [preauth]
Jan 21 22:49:18 np0005591283.novalocal sshd-session[7311]: Connection closed by 38.102.83.114 port 51070
Jan 21 22:49:18 np0005591283.novalocal sshd-session[7308]: pam_unix(sshd:session): session closed for user zuul
Jan 21 22:49:18 np0005591283.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 21 22:49:18 np0005591283.novalocal systemd-logind[784]: Session 3 logged out. Waiting for processes to exit.
Jan 21 22:49:18 np0005591283.novalocal systemd-logind[784]: Removed session 3.
Jan 21 22:50:03 np0005591283.novalocal sshd-session[7489]: Invalid user postgres from 188.166.69.60 port 34812
Jan 21 22:50:03 np0005591283.novalocal sshd-session[7489]: Connection closed by invalid user postgres 188.166.69.60 port 34812 [preauth]
Jan 21 22:50:48 np0005591283.novalocal sshd-session[7492]: Invalid user postgres from 188.166.69.60 port 44690
Jan 21 22:50:48 np0005591283.novalocal sshd-session[7492]: Connection closed by invalid user postgres 188.166.69.60 port 44690 [preauth]
Jan 21 22:51:21 np0005591283.novalocal systemd[4306]: Created slice User Background Tasks Slice.
Jan 21 22:51:21 np0005591283.novalocal systemd[4306]: Starting Cleanup of User's Temporary Files and Directories...
Jan 21 22:51:21 np0005591283.novalocal systemd[4306]: Finished Cleanup of User's Temporary Files and Directories.
Jan 21 22:51:33 np0005591283.novalocal sshd-session[7497]: Invalid user postgres from 188.166.69.60 port 49500
Jan 21 22:51:33 np0005591283.novalocal sshd-session[7497]: Connection closed by invalid user postgres 188.166.69.60 port 49500 [preauth]
Jan 21 22:52:16 np0005591283.novalocal sshd-session[7500]: Invalid user pi from 188.166.69.60 port 57108
Jan 21 22:52:16 np0005591283.novalocal sshd-session[7500]: Connection closed by invalid user pi 188.166.69.60 port 57108 [preauth]
Jan 21 22:53:00 np0005591283.novalocal sshd-session[7502]: Invalid user pi from 188.166.69.60 port 46074
Jan 21 22:53:01 np0005591283.novalocal sshd-session[7502]: Connection closed by invalid user pi 188.166.69.60 port 46074 [preauth]
Jan 21 22:53:46 np0005591283.novalocal sshd-session[7504]: Invalid user pi from 188.166.69.60 port 50330
Jan 21 22:53:46 np0005591283.novalocal sshd-session[7504]: Connection closed by invalid user pi 188.166.69.60 port 50330 [preauth]
Jan 21 22:54:30 np0005591283.novalocal sshd-session[7506]: Received disconnect from 45.148.10.151 port 23392:11:  [preauth]
Jan 21 22:54:30 np0005591283.novalocal sshd-session[7506]: Disconnected from authenticating user root 45.148.10.151 port 23392 [preauth]
Jan 21 22:54:32 np0005591283.novalocal sshd-session[7508]: Invalid user pi from 188.166.69.60 port 57562
Jan 21 22:54:32 np0005591283.novalocal sshd-session[7508]: Connection closed by invalid user pi 188.166.69.60 port 57562 [preauth]
Jan 21 22:54:44 np0005591283.novalocal sshd-session[7511]: Accepted publickey for zuul from 38.102.83.114 port 37786 ssh2: RSA SHA256:LSN8GeK+nwfQgAdzsG9Fx0/CGGktcUeOM8rFlOBs7zo
Jan 21 22:54:45 np0005591283.novalocal systemd-logind[784]: New session 4 of user zuul.
Jan 21 22:54:45 np0005591283.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 21 22:54:45 np0005591283.novalocal sshd-session[7511]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:54:45 np0005591283.novalocal sudo[7538]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlkhpiafbofvayiuevnubtspqtsawusf ; /usr/bin/python3'
Jan 21 22:54:45 np0005591283.novalocal sudo[7538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:45 np0005591283.novalocal python3[7540]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163ec2-ffbe-51e3-fd82-000000000ca4-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:45 np0005591283.novalocal sudo[7538]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:45 np0005591283.novalocal sudo[7566]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xywzgknpdlubjynnqrxiakztekekgcec ; /usr/bin/python3'
Jan 21 22:54:45 np0005591283.novalocal sudo[7566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:45 np0005591283.novalocal python3[7568]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:45 np0005591283.novalocal sudo[7566]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:45 np0005591283.novalocal sudo[7592]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yddxpvsworleoritinixnanhnqltniuu ; /usr/bin/python3'
Jan 21 22:54:45 np0005591283.novalocal sudo[7592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:45 np0005591283.novalocal python3[7595]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:45 np0005591283.novalocal sudo[7592]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:45 np0005591283.novalocal sudo[7619]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyzbqtqyeycpouigckwaydjwguenxxxw ; /usr/bin/python3'
Jan 21 22:54:45 np0005591283.novalocal sudo[7619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:46 np0005591283.novalocal python3[7621]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:46 np0005591283.novalocal sudo[7619]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:46 np0005591283.novalocal sudo[7645]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evbgtopvzkpbjtggkwtapbldpkftdkmu ; /usr/bin/python3'
Jan 21 22:54:46 np0005591283.novalocal sudo[7645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:46 np0005591283.novalocal python3[7647]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:46 np0005591283.novalocal sudo[7645]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:46 np0005591283.novalocal sudo[7671]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eazsbwclaetgkiewzklmrjdvwerapife ; /usr/bin/python3'
Jan 21 22:54:46 np0005591283.novalocal sudo[7671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:46 np0005591283.novalocal python3[7673]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:46 np0005591283.novalocal sudo[7671]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:47 np0005591283.novalocal sudo[7749]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktyjwgvpyihjzwnfqkfgvqpebdwywedm ; /usr/bin/python3'
Jan 21 22:54:47 np0005591283.novalocal sudo[7749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:47 np0005591283.novalocal python3[7751]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:54:47 np0005591283.novalocal sudo[7749]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:47 np0005591283.novalocal sudo[7822]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbqfdjlptduufzpdgwxwlchdoaonyoyb ; /usr/bin/python3'
Jan 21 22:54:47 np0005591283.novalocal sudo[7822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:47 np0005591283.novalocal python3[7824]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769036087.1907952-364-14045200778803/source _original_basename=tmpamfrjzo3 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:54:47 np0005591283.novalocal sudo[7822]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:49 np0005591283.novalocal sudo[7872]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udquucsciuyftbtzobgmvyamhdaaxgyx ; /usr/bin/python3'
Jan 21 22:54:49 np0005591283.novalocal sudo[7872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:49 np0005591283.novalocal python3[7874]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 22:54:49 np0005591283.novalocal systemd[1]: Reloading.
Jan 21 22:54:49 np0005591283.novalocal systemd-rc-local-generator[7894]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 22:54:49 np0005591283.novalocal sudo[7872]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:51 np0005591283.novalocal sudo[7928]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jknjelrjqqonttewdahevjtyqtweizkn ; /usr/bin/python3'
Jan 21 22:54:51 np0005591283.novalocal sudo[7928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:51 np0005591283.novalocal python3[7930]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 21 22:54:51 np0005591283.novalocal sudo[7928]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:54 np0005591283.novalocal sudo[7954]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlnsrcxyslheswspfizgptpsqmmxbzdw ; /usr/bin/python3'
Jan 21 22:54:54 np0005591283.novalocal sudo[7954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:54 np0005591283.novalocal python3[7956]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:54 np0005591283.novalocal sudo[7954]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:55 np0005591283.novalocal sudo[7982]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbhjybghgkmjhspfbgscaploxirxuqjf ; /usr/bin/python3'
Jan 21 22:54:55 np0005591283.novalocal sudo[7982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:55 np0005591283.novalocal python3[7984]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:55 np0005591283.novalocal sudo[7982]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:55 np0005591283.novalocal sudo[8010]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxbtiazwkoolxnjwcwuwtshvjrrmdjkk ; /usr/bin/python3'
Jan 21 22:54:55 np0005591283.novalocal sudo[8010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:55 np0005591283.novalocal python3[8012]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:55 np0005591283.novalocal sudo[8010]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:55 np0005591283.novalocal sudo[8038]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfyjmrlgpymsjgdyixsfhsvdskirtcwh ; /usr/bin/python3'
Jan 21 22:54:55 np0005591283.novalocal sudo[8038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:54:55 np0005591283.novalocal python3[8040]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:55 np0005591283.novalocal sudo[8038]: pam_unix(sudo:session): session closed for user root
Jan 21 22:54:56 np0005591283.novalocal python3[8067]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-51e3-fd82-000000000cab-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:54:57 np0005591283.novalocal python3[8097]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 21 22:55:00 np0005591283.novalocal sshd-session[7514]: Connection closed by 38.102.83.114 port 37786
Jan 21 22:55:00 np0005591283.novalocal sshd-session[7511]: pam_unix(sshd:session): session closed for user zuul
Jan 21 22:55:00 np0005591283.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 21 22:55:00 np0005591283.novalocal systemd[1]: session-4.scope: Consumed 3.999s CPU time.
Jan 21 22:55:00 np0005591283.novalocal systemd-logind[784]: Session 4 logged out. Waiting for processes to exit.
Jan 21 22:55:00 np0005591283.novalocal systemd-logind[784]: Removed session 4.
Jan 21 22:55:02 np0005591283.novalocal sshd-session[8102]: Accepted publickey for zuul from 38.102.83.114 port 41458 ssh2: RSA SHA256:LSN8GeK+nwfQgAdzsG9Fx0/CGGktcUeOM8rFlOBs7zo
Jan 21 22:55:02 np0005591283.novalocal systemd-logind[784]: New session 5 of user zuul.
Jan 21 22:55:02 np0005591283.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 21 22:55:02 np0005591283.novalocal sshd-session[8102]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:55:02 np0005591283.novalocal sudo[8129]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxfepfbpzjhkqtfsmkyovpdleekjduhj ; /usr/bin/python3'
Jan 21 22:55:02 np0005591283.novalocal sudo[8129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:55:02 np0005591283.novalocal python3[8131]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 21 22:55:10 np0005591283.novalocal setsebool[8175]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 21 22:55:10 np0005591283.novalocal setsebool[8175]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 21 22:55:17 np0005591283.novalocal sshd-session[8184]: Invalid user pi from 188.166.69.60 port 57400
Jan 21 22:55:17 np0005591283.novalocal sshd-session[8184]: Connection closed by invalid user pi 188.166.69.60 port 57400 [preauth]
Jan 21 22:55:22 np0005591283.novalocal kernel: SELinux:  Converting 385 SID table entries...
Jan 21 22:55:22 np0005591283.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 22:55:22 np0005591283.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 21 22:55:22 np0005591283.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 22:55:22 np0005591283.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 21 22:55:22 np0005591283.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 22:55:22 np0005591283.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 22:55:22 np0005591283.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 22:55:33 np0005591283.novalocal kernel: SELinux:  Converting 388 SID table entries...
Jan 21 22:55:33 np0005591283.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 22:55:33 np0005591283.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 21 22:55:33 np0005591283.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 22:55:33 np0005591283.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 21 22:55:33 np0005591283.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 22:55:33 np0005591283.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 22:55:33 np0005591283.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 22:55:50 np0005591283.novalocal dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 21 22:55:50 np0005591283.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 22:55:50 np0005591283.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 21 22:55:50 np0005591283.novalocal systemd[1]: Reloading.
Jan 21 22:55:50 np0005591283.novalocal systemd-rc-local-generator[8947]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 22:55:50 np0005591283.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 22:55:51 np0005591283.novalocal sudo[8129]: pam_unix(sudo:session): session closed for user root
Jan 21 22:55:53 np0005591283.novalocal python3[11632]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ec2-ffbe-17c1-e6f1-00000000000c-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 22:55:54 np0005591283.novalocal kernel: evm: overlay not supported
Jan 21 22:55:54 np0005591283.novalocal systemd[4306]: Starting D-Bus User Message Bus...
Jan 21 22:55:54 np0005591283.novalocal dbus-broker-launch[12590]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 21 22:55:54 np0005591283.novalocal dbus-broker-launch[12590]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 21 22:55:54 np0005591283.novalocal systemd[4306]: Started D-Bus User Message Bus.
Jan 21 22:55:54 np0005591283.novalocal dbus-broker-lau[12590]: Ready
Jan 21 22:55:54 np0005591283.novalocal systemd[4306]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 21 22:55:54 np0005591283.novalocal systemd[4306]: Created slice Slice /user.
Jan 21 22:55:54 np0005591283.novalocal systemd[4306]: podman-12498.scope: unit configures an IP firewall, but not running as root.
Jan 21 22:55:54 np0005591283.novalocal systemd[4306]: (This warning is only shown for the first unit using IP firewalling.)
Jan 21 22:55:54 np0005591283.novalocal systemd[4306]: Started podman-12498.scope.
Jan 21 22:55:54 np0005591283.novalocal systemd[4306]: Started podman-pause-e118ee19.scope.
Jan 21 22:55:55 np0005591283.novalocal sudo[13723]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejgejiswttdfsrsvojouyzprumwboaco ; /usr/bin/python3'
Jan 21 22:55:55 np0005591283.novalocal sudo[13723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:55:55 np0005591283.novalocal python3[13742]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.27:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.27:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:55:55 np0005591283.novalocal python3[13742]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 21 22:55:55 np0005591283.novalocal sudo[13723]: pam_unix(sudo:session): session closed for user root
Jan 21 22:55:56 np0005591283.novalocal sshd-session[8105]: Connection closed by 38.102.83.114 port 41458
Jan 21 22:55:56 np0005591283.novalocal sshd-session[8102]: pam_unix(sshd:session): session closed for user zuul
Jan 21 22:55:56 np0005591283.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 21 22:55:56 np0005591283.novalocal systemd[1]: session-5.scope: Consumed 43.541s CPU time.
Jan 21 22:55:56 np0005591283.novalocal systemd-logind[784]: Session 5 logged out. Waiting for processes to exit.
Jan 21 22:55:56 np0005591283.novalocal systemd-logind[784]: Removed session 5.
Jan 21 22:56:03 np0005591283.novalocal sshd-session[16630]: Invalid user pi from 188.166.69.60 port 43348
Jan 21 22:56:04 np0005591283.novalocal sshd-session[16630]: Connection closed by invalid user pi 188.166.69.60 port 43348 [preauth]
Jan 21 22:56:14 np0005591283.novalocal sshd-session[22699]: Unable to negotiate with 38.102.83.151 port 32768: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 21 22:56:14 np0005591283.novalocal sshd-session[22701]: Connection closed by 38.102.83.151 port 60994 [preauth]
Jan 21 22:56:14 np0005591283.novalocal sshd-session[22703]: Connection closed by 38.102.83.151 port 60978 [preauth]
Jan 21 22:56:14 np0005591283.novalocal sshd-session[22700]: Unable to negotiate with 38.102.83.151 port 32770: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 21 22:56:14 np0005591283.novalocal sshd-session[22706]: Unable to negotiate with 38.102.83.151 port 32786: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 21 22:56:20 np0005591283.novalocal sshd-session[24702]: Accepted publickey for zuul from 38.102.83.114 port 57714 ssh2: RSA SHA256:LSN8GeK+nwfQgAdzsG9Fx0/CGGktcUeOM8rFlOBs7zo
Jan 21 22:56:20 np0005591283.novalocal systemd-logind[784]: New session 6 of user zuul.
Jan 21 22:56:20 np0005591283.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 21 22:56:20 np0005591283.novalocal sshd-session[24702]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 22:56:20 np0005591283.novalocal python3[24818]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLb1E5HwmlMjFU9nv6wd+VHV9J1rtO+UWxPZpEjo1oVR+Rls9TFII1iFAeK4/68neaHhE2B9Qc0dAUKPbHC0hoM= zuul@np0005591282.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:56:20 np0005591283.novalocal sudo[25061]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfmrwutbdlaleptulabdmunfnnfdhmub ; /usr/bin/python3'
Jan 21 22:56:20 np0005591283.novalocal sudo[25061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:20 np0005591283.novalocal python3[25073]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLb1E5HwmlMjFU9nv6wd+VHV9J1rtO+UWxPZpEjo1oVR+Rls9TFII1iFAeK4/68neaHhE2B9Qc0dAUKPbHC0hoM= zuul@np0005591282.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:56:20 np0005591283.novalocal sudo[25061]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:21 np0005591283.novalocal sudo[25642]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dipkhwblrosxxrahhzohfdqsedwpwbnn ; /usr/bin/python3'
Jan 21 22:56:21 np0005591283.novalocal sudo[25642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:21 np0005591283.novalocal python3[25652]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005591283.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 21 22:56:21 np0005591283.novalocal useradd[25742]: new group: name=cloud-admin, GID=1002
Jan 21 22:56:21 np0005591283.novalocal useradd[25742]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 21 22:56:22 np0005591283.novalocal sudo[25642]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:22 np0005591283.novalocal sudo[26036]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-badkfewfwygpkpkoopoyrnlsbvbylgsc ; /usr/bin/python3'
Jan 21 22:56:22 np0005591283.novalocal sudo[26036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:22 np0005591283.novalocal python3[26045]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLb1E5HwmlMjFU9nv6wd+VHV9J1rtO+UWxPZpEjo1oVR+Rls9TFII1iFAeK4/68neaHhE2B9Qc0dAUKPbHC0hoM= zuul@np0005591282.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 21 22:56:22 np0005591283.novalocal sudo[26036]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:22 np0005591283.novalocal sudo[26342]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gymshtxvacspwnhomvjpmgmoygxourti ; /usr/bin/python3'
Jan 21 22:56:22 np0005591283.novalocal sudo[26342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:23 np0005591283.novalocal python3[26349]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 22:56:23 np0005591283.novalocal sudo[26342]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:23 np0005591283.novalocal sudo[26649]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vabpmssdbiilybrsjbnueyvsggxleuvh ; /usr/bin/python3'
Jan 21 22:56:23 np0005591283.novalocal sudo[26649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:23 np0005591283.novalocal python3[26658]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769036182.8107994-167-52024087436987/source _original_basename=tmp_qyoe0z5 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 22:56:23 np0005591283.novalocal sudo[26649]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:24 np0005591283.novalocal sudo[27023]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtbwsmiwotwammzvybiugjdqaolgbnam ; /usr/bin/python3'
Jan 21 22:56:24 np0005591283.novalocal sudo[27023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 22:56:24 np0005591283.novalocal python3[27032]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Jan 21 22:56:24 np0005591283.novalocal systemd[1]: Starting Hostname Service...
Jan 21 22:56:24 np0005591283.novalocal systemd[1]: Started Hostname Service.
Jan 21 22:56:24 np0005591283.novalocal systemd-hostnamed[27159]: Changed pretty hostname to 'compute-0'
Jan 21 22:56:24 compute-0 systemd-hostnamed[27159]: Hostname set to <compute-0> (static)
Jan 21 22:56:24 compute-0 NetworkManager[7199]: <info>  [1769036184.5787] hostname: static hostname changed from "np0005591283.novalocal" to "compute-0"
Jan 21 22:56:24 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 22:56:24 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 22:56:24 compute-0 sudo[27023]: pam_unix(sudo:session): session closed for user root
Jan 21 22:56:26 compute-0 sshd-session[24753]: Connection closed by 38.102.83.114 port 57714
Jan 21 22:56:26 compute-0 sshd-session[24702]: pam_unix(sshd:session): session closed for user zuul
Jan 21 22:56:26 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Jan 21 22:56:26 compute-0 systemd[1]: session-6.scope: Consumed 2.158s CPU time.
Jan 21 22:56:26 compute-0 systemd-logind[784]: Session 6 logged out. Waiting for processes to exit.
Jan 21 22:56:26 compute-0 systemd-logind[784]: Removed session 6.
Jan 21 22:56:31 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 22:56:31 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 22:56:31 compute-0 systemd[1]: man-db-cache-update.service: Consumed 50.677s CPU time.
Jan 21 22:56:31 compute-0 systemd[1]: run-r59572fc28dad42ec8b458a05704c6b30.service: Deactivated successfully.
Jan 21 22:56:34 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 22:56:45 compute-0 sshd-session[29961]: Invalid user pi from 188.166.69.60 port 36480
Jan 21 22:56:45 compute-0 sshd-session[29961]: Connection closed by invalid user pi 188.166.69.60 port 36480 [preauth]
Jan 21 22:56:54 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 22:57:29 compute-0 sshd-session[29966]: Invalid user pi from 188.166.69.60 port 49348
Jan 21 22:57:29 compute-0 sshd-session[29966]: Connection closed by invalid user pi 188.166.69.60 port 49348 [preauth]
Jan 21 22:58:04 compute-0 sshd-session[29970]: Connection closed by 203.83.238.251 port 32916
Jan 21 22:58:11 compute-0 sshd-session[29971]: Invalid user pi from 188.166.69.60 port 50630
Jan 21 22:58:11 compute-0 sshd-session[29971]: Connection closed by invalid user pi 188.166.69.60 port 50630 [preauth]
Jan 21 22:58:53 compute-0 sshd-session[29974]: Invalid user administrator from 188.166.69.60 port 34034
Jan 21 22:58:53 compute-0 sshd-session[29974]: Connection closed by invalid user administrator 188.166.69.60 port 34034 [preauth]
Jan 21 22:59:39 compute-0 sshd-session[29976]: Invalid user administrator from 188.166.69.60 port 49710
Jan 21 22:59:40 compute-0 sshd-session[29976]: Connection closed by invalid user administrator 188.166.69.60 port 49710 [preauth]
Jan 21 23:00:11 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 21 23:00:11 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 21 23:00:11 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 21 23:00:11 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 21 23:00:23 compute-0 sshd-session[29983]: Accepted publickey for zuul from 38.102.83.151 port 48400 ssh2: RSA SHA256:LSN8GeK+nwfQgAdzsG9Fx0/CGGktcUeOM8rFlOBs7zo
Jan 21 23:00:23 compute-0 systemd-logind[784]: New session 7 of user zuul.
Jan 21 23:00:23 compute-0 systemd[1]: Started Session 7 of User zuul.
Jan 21 23:00:23 compute-0 sshd-session[29983]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:00:23 compute-0 sshd-session[29982]: Invalid user administrator from 188.166.69.60 port 40008
Jan 21 23:00:23 compute-0 sshd-session[29982]: Connection closed by invalid user administrator 188.166.69.60 port 40008 [preauth]
Jan 21 23:00:23 compute-0 python3[30060]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:00:25 compute-0 sudo[30174]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypsnzqjvdjtwdpfrfvyftosbqdbxobua ; /usr/bin/python3'
Jan 21 23:00:25 compute-0 sudo[30174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:25 compute-0 python3[30176]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:25 compute-0 sudo[30174]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:26 compute-0 sudo[30247]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yecmdpnoqnueoeqtpczvktgmfihothnn ; /usr/bin/python3'
Jan 21 23:00:26 compute-0 sudo[30247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:26 compute-0 python3[30249]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.6734278-34005-236179086198711/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:26 compute-0 sudo[30247]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:26 compute-0 sudo[30273]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lngasthpkcqiuhorzrbdomgwnpnaevnk ; /usr/bin/python3'
Jan 21 23:00:26 compute-0 sudo[30273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:26 compute-0 python3[30275]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:26 compute-0 sudo[30273]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:26 compute-0 sudo[30346]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjjeosohttpsvjygyjkxfyutawsnybly ; /usr/bin/python3'
Jan 21 23:00:26 compute-0 sudo[30346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:26 compute-0 python3[30348]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.6734278-34005-236179086198711/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:26 compute-0 sudo[30346]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:27 compute-0 sudo[30372]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbilqsvcpepkbaatjwkdjakyplilaqij ; /usr/bin/python3'
Jan 21 23:00:27 compute-0 sudo[30372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:27 compute-0 python3[30374]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:27 compute-0 sudo[30372]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:27 compute-0 sudo[30445]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olchtzwbhwzhmwxrfirwmhidlcnbzpcn ; /usr/bin/python3'
Jan 21 23:00:27 compute-0 sudo[30445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:27 compute-0 python3[30447]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.6734278-34005-236179086198711/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:27 compute-0 sudo[30445]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:27 compute-0 sudo[30471]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeidjkjdbtosxfbzibideblmtryqhvqv ; /usr/bin/python3'
Jan 21 23:00:27 compute-0 sudo[30471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:27 compute-0 python3[30473]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:27 compute-0 sudo[30471]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:27 compute-0 sudo[30544]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udnxerftuwhdlhktvhehqpmzwldeyozk ; /usr/bin/python3'
Jan 21 23:00:27 compute-0 sudo[30544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:28 compute-0 python3[30546]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.6734278-34005-236179086198711/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:28 compute-0 sudo[30544]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:28 compute-0 sudo[30570]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdwiqakytmkhsxleusjobctjjcqlleky ; /usr/bin/python3'
Jan 21 23:00:28 compute-0 sudo[30570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:28 compute-0 python3[30572]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:28 compute-0 sudo[30570]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:28 compute-0 sudo[30643]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwjthfvnwsvizpkrkxxjejlvpxsjlxxx ; /usr/bin/python3'
Jan 21 23:00:28 compute-0 sudo[30643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:28 compute-0 python3[30645]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.6734278-34005-236179086198711/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:28 compute-0 sudo[30643]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:28 compute-0 sudo[30669]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fekdwpboamquafzgiorezhzruhtugtux ; /usr/bin/python3'
Jan 21 23:00:28 compute-0 sudo[30669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:28 compute-0 python3[30671]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:28 compute-0 sudo[30669]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:29 compute-0 sudo[30742]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igbhxuknrtblljnpdmixpjyukeszafux ; /usr/bin/python3'
Jan 21 23:00:29 compute-0 sudo[30742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:29 compute-0 python3[30744]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.6734278-34005-236179086198711/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:29 compute-0 sudo[30742]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:29 compute-0 sudo[30768]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipjvisawbnhymgdqkfsqrmfaartkgeis ; /usr/bin/python3'
Jan 21 23:00:29 compute-0 sudo[30768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:29 compute-0 python3[30770]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 21 23:00:29 compute-0 sudo[30768]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:29 compute-0 sudo[30841]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plcybqsxuehblgcppooiheiflbafpzfm ; /usr/bin/python3'
Jan 21 23:00:29 compute-0 sudo[30841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:00:29 compute-0 python3[30843]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769036425.6734278-34005-236179086198711/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:00:29 compute-0 sudo[30841]: pam_unix(sudo:session): session closed for user root
Jan 21 23:00:33 compute-0 sshd-session[30868]: Connection closed by 192.168.122.11 port 33606 [preauth]
Jan 21 23:00:33 compute-0 sshd-session[30870]: Connection closed by 192.168.122.11 port 33616 [preauth]
Jan 21 23:00:33 compute-0 sshd-session[30869]: Unable to negotiate with 192.168.122.11 port 33644: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 21 23:00:33 compute-0 sshd-session[30871]: Unable to negotiate with 192.168.122.11 port 33628: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 21 23:00:33 compute-0 sshd-session[30872]: Unable to negotiate with 192.168.122.11 port 33632: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 21 23:00:40 compute-0 python3[30901]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:01:01 compute-0 CROND[30904]: (root) CMD (run-parts /etc/cron.hourly)
Jan 21 23:01:01 compute-0 run-parts[30907]: (/etc/cron.hourly) starting 0anacron
Jan 21 23:01:01 compute-0 anacron[30915]: Anacron started on 2026-01-21
Jan 21 23:01:01 compute-0 anacron[30915]: Will run job `cron.daily' in 24 min.
Jan 21 23:01:01 compute-0 anacron[30915]: Will run job `cron.weekly' in 44 min.
Jan 21 23:01:01 compute-0 anacron[30915]: Will run job `cron.monthly' in 64 min.
Jan 21 23:01:01 compute-0 anacron[30915]: Jobs will be executed sequentially
Jan 21 23:01:01 compute-0 run-parts[30917]: (/etc/cron.hourly) finished 0anacron
Jan 21 23:01:01 compute-0 CROND[30903]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 21 23:01:09 compute-0 sshd-session[30918]: Invalid user administrator from 188.166.69.60 port 52616
Jan 21 23:01:09 compute-0 sshd-session[30918]: Connection closed by invalid user administrator 188.166.69.60 port 52616 [preauth]
Jan 21 23:01:55 compute-0 sshd-session[30921]: Invalid user administrator from 188.166.69.60 port 42060
Jan 21 23:01:55 compute-0 sshd-session[30921]: Connection closed by invalid user administrator 188.166.69.60 port 42060 [preauth]
Jan 21 23:02:04 compute-0 sshd-session[30923]: Received disconnect from 91.224.92.78 port 60834:11:  [preauth]
Jan 21 23:02:04 compute-0 sshd-session[30923]: Disconnected from authenticating user root 91.224.92.78 port 60834 [preauth]
Jan 21 23:02:40 compute-0 sshd-session[30925]: Invalid user administrator from 188.166.69.60 port 50376
Jan 21 23:02:40 compute-0 sshd-session[30925]: Connection closed by invalid user administrator 188.166.69.60 port 50376 [preauth]
Jan 21 23:03:24 compute-0 sshd-session[30927]: Invalid user administrator from 188.166.69.60 port 50302
Jan 21 23:03:24 compute-0 sshd-session[30927]: Connection closed by invalid user administrator 188.166.69.60 port 50302 [preauth]
Jan 21 23:04:08 compute-0 sshd-session[30929]: Invalid user administrator from 188.166.69.60 port 52640
Jan 21 23:04:08 compute-0 sshd-session[30929]: Connection closed by invalid user administrator 188.166.69.60 port 52640 [preauth]
Jan 21 23:04:52 compute-0 sshd-session[30931]: Invalid user administrator from 188.166.69.60 port 35718
Jan 21 23:04:53 compute-0 sshd-session[30931]: Connection closed by invalid user administrator 188.166.69.60 port 35718 [preauth]
Jan 21 23:05:37 compute-0 sshd-session[30933]: Invalid user ftpuser from 188.166.69.60 port 58706
Jan 21 23:05:37 compute-0 sshd-session[30933]: Connection closed by invalid user ftpuser 188.166.69.60 port 58706 [preauth]
Jan 21 23:05:39 compute-0 sshd-session[29987]: Received disconnect from 38.102.83.151 port 48400:11: disconnected by user
Jan 21 23:05:39 compute-0 sshd-session[29987]: Disconnected from user zuul 38.102.83.151 port 48400
Jan 21 23:05:39 compute-0 sshd-session[29983]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:05:39 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Jan 21 23:05:39 compute-0 systemd[1]: session-7.scope: Consumed 4.685s CPU time.
Jan 21 23:05:39 compute-0 systemd-logind[784]: Session 7 logged out. Waiting for processes to exit.
Jan 21 23:05:39 compute-0 systemd-logind[784]: Removed session 7.
Jan 21 23:06:21 compute-0 sshd-session[30936]: Invalid user ftpuser from 188.166.69.60 port 46892
Jan 21 23:06:21 compute-0 sshd-session[30936]: Connection closed by invalid user ftpuser 188.166.69.60 port 46892 [preauth]
Jan 21 23:07:05 compute-0 sshd-session[30939]: Invalid user ftpuser from 188.166.69.60 port 57910
Jan 21 23:07:05 compute-0 sshd-session[30939]: Connection closed by invalid user ftpuser 188.166.69.60 port 57910 [preauth]
Jan 21 23:07:51 compute-0 sshd-session[30941]: Invalid user ftpuser from 188.166.69.60 port 49504
Jan 21 23:07:52 compute-0 sshd-session[30941]: Connection closed by invalid user ftpuser 188.166.69.60 port 49504 [preauth]
Jan 21 23:08:37 compute-0 sshd-session[30944]: Invalid user ftpuser from 188.166.69.60 port 53752
Jan 21 23:08:37 compute-0 sshd-session[30944]: Connection closed by invalid user ftpuser 188.166.69.60 port 53752 [preauth]
Jan 21 23:09:17 compute-0 sshd-session[30946]: Received disconnect from 91.224.92.78 port 50286:11:  [preauth]
Jan 21 23:09:17 compute-0 sshd-session[30946]: Disconnected from authenticating user root 91.224.92.78 port 50286 [preauth]
Jan 21 23:09:22 compute-0 sshd-session[30948]: Invalid user ftpuser from 188.166.69.60 port 36732
Jan 21 23:09:22 compute-0 sshd-session[30948]: Connection closed by invalid user ftpuser 188.166.69.60 port 36732 [preauth]
Jan 21 23:10:07 compute-0 sshd-session[30950]: Invalid user ftpuser from 188.166.69.60 port 58272
Jan 21 23:10:07 compute-0 sshd-session[30950]: Connection closed by invalid user ftpuser 188.166.69.60 port 58272 [preauth]
Jan 21 23:10:50 compute-0 sshd-session[30952]: Invalid user ftpuser from 188.166.69.60 port 41582
Jan 21 23:10:50 compute-0 sshd-session[30952]: Connection closed by invalid user ftpuser 188.166.69.60 port 41582 [preauth]
Jan 21 23:11:36 compute-0 sshd-session[30954]: Invalid user ftpuser from 188.166.69.60 port 42944
Jan 21 23:11:36 compute-0 sshd-session[30954]: Connection closed by invalid user ftpuser 188.166.69.60 port 42944 [preauth]
Jan 21 23:12:20 compute-0 sshd-session[30957]: Invalid user mysql from 188.166.69.60 port 35834
Jan 21 23:12:21 compute-0 sshd-session[30957]: Connection closed by invalid user mysql 188.166.69.60 port 35834 [preauth]
Jan 21 23:13:06 compute-0 sshd-session[30960]: Invalid user mysql from 188.166.69.60 port 41330
Jan 21 23:13:07 compute-0 sshd-session[30960]: Connection closed by invalid user mysql 188.166.69.60 port 41330 [preauth]
Jan 21 23:13:51 compute-0 sshd-session[30962]: Invalid user mysql from 188.166.69.60 port 38220
Jan 21 23:13:52 compute-0 sshd-session[30962]: Connection closed by invalid user mysql 188.166.69.60 port 38220 [preauth]
Jan 21 23:14:20 compute-0 sshd-session[30964]: Connection reset by 198.235.24.226 port 64630 [preauth]
Jan 21 23:14:36 compute-0 sshd-session[30966]: Invalid user mysql from 188.166.69.60 port 52796
Jan 21 23:14:36 compute-0 sshd-session[30966]: Connection closed by invalid user mysql 188.166.69.60 port 52796 [preauth]
Jan 21 23:15:21 compute-0 systemd[1]: Starting dnf makecache...
Jan 21 23:15:21 compute-0 sshd-session[30968]: Invalid user mysql from 188.166.69.60 port 44736
Jan 21 23:15:21 compute-0 dnf[30970]: Failed determining last makecache time.
Jan 21 23:15:21 compute-0 sshd-session[30968]: Connection closed by invalid user mysql 188.166.69.60 port 44736 [preauth]
Jan 21 23:15:21 compute-0 dnf[30970]: delorean-openstack-barbican-42b4c41831408a8e323 361 kB/s |  13 kB     00:00
Jan 21 23:15:21 compute-0 dnf[30970]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 3.1 MB/s |  65 kB     00:00
Jan 21 23:15:21 compute-0 dnf[30970]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.5 MB/s |  32 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-python-stevedore-c4acc5639fd2329372142 4.9 MB/s | 131 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.5 MB/s |  32 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-os-refresh-config-9bfc52b5049be2d8de61  14 MB/s | 349 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 2.2 MB/s |  42 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-python-designate-tests-tempest-347fdbc 845 kB/s |  18 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-openstack-glance-1fd12c29b339f30fe823e 977 kB/s |  18 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.5 MB/s |  29 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-openstack-manila-3c01b7181572c95dac462 1.1 MB/s |  25 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-python-whitebox-neutron-tests-tempest- 7.4 MB/s | 154 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-openstack-octavia-ba397f07a7331190208c 1.1 MB/s |  26 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-openstack-watcher-c014f81a8647287f6dcc 693 kB/s |  16 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-ansible-config_template-5ccaa22121a7ff 328 kB/s | 7.4 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 5.8 MB/s | 144 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-openstack-swift-dc98a8463506ac520c469a 649 kB/s |  14 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-python-tempestconf-8515371b7cceebd4282 2.2 MB/s |  53 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: delorean-openstack-heat-ui-013accbfd179753bc3f0 4.5 MB/s |  96 kB     00:00
Jan 21 23:15:22 compute-0 dnf[30970]: CentOS Stream 9 - BaseOS                         66 kB/s | 6.7 kB     00:00
Jan 21 23:15:23 compute-0 dnf[30970]: CentOS Stream 9 - AppStream                      29 kB/s | 6.8 kB     00:00
Jan 21 23:15:23 compute-0 dnf[30970]: CentOS Stream 9 - CRB                            73 kB/s | 6.6 kB     00:00
Jan 21 23:15:23 compute-0 dnf[30970]: CentOS Stream 9 - Extras packages                32 kB/s | 7.3 kB     00:00
Jan 21 23:15:23 compute-0 dnf[30970]: dlrn-antelope-testing                            34 MB/s | 1.1 MB     00:00
Jan 21 23:15:23 compute-0 dnf[30970]: dlrn-antelope-build-deps                         18 MB/s | 461 kB     00:00
Jan 21 23:15:24 compute-0 dnf[30970]: centos9-rabbitmq                                9.6 MB/s | 123 kB     00:00
Jan 21 23:15:24 compute-0 dnf[30970]: centos9-storage                                  22 MB/s | 415 kB     00:00
Jan 21 23:15:24 compute-0 dnf[30970]: centos9-opstools                                3.6 MB/s |  51 kB     00:00
Jan 21 23:15:24 compute-0 dnf[30970]: NFV SIG OpenvSwitch                              23 MB/s | 461 kB     00:00
Jan 21 23:15:25 compute-0 dnf[30970]: repo-setup-centos-appstream                      87 MB/s |  26 MB     00:00
Jan 21 23:15:31 compute-0 dnf[30970]: repo-setup-centos-baseos                         76 MB/s | 8.9 MB     00:00
Jan 21 23:15:32 compute-0 dnf[30970]: repo-setup-centos-highavailability               38 MB/s | 744 kB     00:00
Jan 21 23:15:33 compute-0 dnf[30970]: repo-setup-centos-powertools                     54 MB/s | 7.6 MB     00:00
Jan 21 23:15:34 compute-0 sshd-session[31066]: Accepted publickey for zuul from 192.168.122.30 port 55092 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:15:34 compute-0 systemd-logind[784]: New session 8 of user zuul.
Jan 21 23:15:34 compute-0 systemd[1]: Started Session 8 of User zuul.
Jan 21 23:15:34 compute-0 sshd-session[31066]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:15:35 compute-0 python3.9[31220]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:15:36 compute-0 sudo[31404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mktrnzqvxhprszivofxaybrfrglxmxpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037335.8967738-56-4963708227287/AnsiballZ_command.py'
Jan 21 23:15:36 compute-0 sudo[31404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:15:36 compute-0 dnf[30970]: Extra Packages for Enterprise Linux 9 - x86_64   14 MB/s |  20 MB     00:01
Jan 21 23:15:36 compute-0 python3.9[31406]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:15:51 compute-0 sudo[31404]: pam_unix(sudo:session): session closed for user root
Jan 21 23:15:51 compute-0 sshd-session[31069]: Connection closed by 192.168.122.30 port 55092
Jan 21 23:15:51 compute-0 sshd-session[31066]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:15:51 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Jan 21 23:15:51 compute-0 systemd[1]: session-8.scope: Consumed 9.336s CPU time.
Jan 21 23:15:51 compute-0 systemd-logind[784]: Session 8 logged out. Waiting for processes to exit.
Jan 21 23:15:51 compute-0 systemd-logind[784]: Removed session 8.
Jan 21 23:15:54 compute-0 dnf[30970]: Metadata cache created.
Jan 21 23:15:54 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 21 23:15:54 compute-0 systemd[1]: Finished dnf makecache.
Jan 21 23:15:54 compute-0 systemd[1]: dnf-makecache.service: Consumed 30.124s CPU time.
Jan 21 23:16:04 compute-0 sshd-session[31464]: Invalid user mysql from 188.166.69.60 port 60552
Jan 21 23:16:04 compute-0 sshd-session[31464]: Connection closed by invalid user mysql 188.166.69.60 port 60552 [preauth]
Jan 21 23:16:07 compute-0 sshd-session[31466]: Accepted publickey for zuul from 192.168.122.30 port 33398 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:16:07 compute-0 systemd-logind[784]: New session 9 of user zuul.
Jan 21 23:16:07 compute-0 systemd[1]: Started Session 9 of User zuul.
Jan 21 23:16:07 compute-0 sshd-session[31466]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:16:07 compute-0 python3.9[31619]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 21 23:16:09 compute-0 python3.9[31793]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:16:10 compute-0 sudo[31943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkldoccqfuhfcaudoyzhemrsvpxezqle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037369.6722393-93-188575322412453/AnsiballZ_command.py'
Jan 21 23:16:10 compute-0 sudo[31943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:10 compute-0 python3.9[31945]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:16:10 compute-0 sudo[31943]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:11 compute-0 sudo[32096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aibbwdrbnpefulwxfpwjwkohqpcuzaoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037370.8353896-129-79399722515988/AnsiballZ_stat.py'
Jan 21 23:16:11 compute-0 sudo[32096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:11 compute-0 python3.9[32098]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:16:11 compute-0 sudo[32096]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:12 compute-0 sudo[32248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tykltrfpypsmjoillekxpbgpwydliodt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037371.794718-153-76874208383369/AnsiballZ_file.py'
Jan 21 23:16:12 compute-0 sudo[32248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:12 compute-0 python3.9[32250]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:16:12 compute-0 sudo[32248]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:13 compute-0 sudo[32400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hagvjeeqmqewxvibbkkgijkdswctmupl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037372.7619371-177-280998063525453/AnsiballZ_stat.py'
Jan 21 23:16:13 compute-0 sudo[32400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:13 compute-0 python3.9[32402]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:16:13 compute-0 sudo[32400]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:13 compute-0 sudo[32523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbmntwnauxnaflqmgptzmortaxubnrlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037372.7619371-177-280998063525453/AnsiballZ_copy.py'
Jan 21 23:16:13 compute-0 sudo[32523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:14 compute-0 python3.9[32525]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037372.7619371-177-280998063525453/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:16:14 compute-0 sudo[32523]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:14 compute-0 sudo[32675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgoovtgvulvbxlpgxxntyrnawfzyulzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037374.2879758-222-236349447855084/AnsiballZ_setup.py'
Jan 21 23:16:14 compute-0 sudo[32675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:14 compute-0 python3.9[32677]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:16:15 compute-0 sudo[32675]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:15 compute-0 sudo[32831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbygcfsxmvuefqtgnbnghjeclelzjrtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037375.4414074-246-16699552676862/AnsiballZ_file.py'
Jan 21 23:16:15 compute-0 sudo[32831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:15 compute-0 python3.9[32833]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:16:15 compute-0 sudo[32831]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:16 compute-0 sudo[32983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haeggageueabdnppudlqenstzivymwyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037376.268203-273-217497328952464/AnsiballZ_file.py'
Jan 21 23:16:16 compute-0 sudo[32983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:16 compute-0 python3.9[32985]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:16:16 compute-0 sudo[32983]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:17 compute-0 python3.9[33135]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:16:21 compute-0 python3.9[33388]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:16:22 compute-0 python3.9[33538]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:16:24 compute-0 python3.9[33692]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:16:25 compute-0 sudo[33848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbvabweubmoegyyupivpkcnoyvivsfrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037385.109317-417-51840692199747/AnsiballZ_setup.py'
Jan 21 23:16:25 compute-0 sudo[33848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:25 compute-0 python3.9[33850]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:16:25 compute-0 sudo[33848]: pam_unix(sudo:session): session closed for user root
Jan 21 23:16:26 compute-0 sudo[33932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ledwrofgekyqeylvgprpcejftlbnwnxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037385.109317-417-51840692199747/AnsiballZ_dnf.py'
Jan 21 23:16:26 compute-0 sudo[33932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:16:26 compute-0 python3.9[33934]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:16:47 compute-0 sshd-session[34072]: Invalid user mysql from 188.166.69.60 port 57166
Jan 21 23:16:47 compute-0 sshd-session[34072]: Connection closed by invalid user mysql 188.166.69.60 port 57166 [preauth]
Jan 21 23:16:49 compute-0 sshd-session[34075]: Received disconnect from 45.148.10.157 port 47252:11:  [preauth]
Jan 21 23:16:49 compute-0 sshd-session[34075]: Disconnected from authenticating user root 45.148.10.157 port 47252 [preauth]
Jan 21 23:17:00 compute-0 systemd[1]: Reloading.
Jan 21 23:17:00 compute-0 systemd-rc-local-generator[34132]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:17:00 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 21 23:17:01 compute-0 systemd[1]: Reloading.
Jan 21 23:17:01 compute-0 systemd-rc-local-generator[34175]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:17:01 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 21 23:17:01 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 21 23:17:01 compute-0 systemd[1]: Reloading.
Jan 21 23:17:01 compute-0 systemd-rc-local-generator[34212]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:17:01 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 21 23:17:01 compute-0 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Jan 21 23:17:01 compute-0 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Jan 21 23:17:02 compute-0 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Jan 21 23:17:31 compute-0 sshd-session[34347]: Invalid user mysql from 188.166.69.60 port 40760
Jan 21 23:17:31 compute-0 sshd-session[34347]: Connection closed by invalid user mysql 188.166.69.60 port 40760 [preauth]
Jan 21 23:18:07 compute-0 kernel: SELinux:  Converting 2725 SID table entries...
Jan 21 23:18:07 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:18:07 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:18:07 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:18:07 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:18:07 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:18:07 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:18:07 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:18:07 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 21 23:18:07 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:18:07 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:18:07 compute-0 systemd[1]: Reloading.
Jan 21 23:18:07 compute-0 systemd-rc-local-generator[34549]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:18:08 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:18:08 compute-0 sudo[33932]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:08 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:18:08 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:18:08 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.166s CPU time.
Jan 21 23:18:08 compute-0 systemd[1]: run-r5181a90380cd4f9dac9fe83d7daae0b5.service: Deactivated successfully.
Jan 21 23:18:15 compute-0 sshd-session[35340]: Invalid user mysql from 188.166.69.60 port 43794
Jan 21 23:18:15 compute-0 sshd-session[35340]: Connection closed by invalid user mysql 188.166.69.60 port 43794 [preauth]
Jan 21 23:18:45 compute-0 sudo[35467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yekhdsbozupmwavxcqzinkocahsqycgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037525.0352945-453-119948141649290/AnsiballZ_command.py'
Jan 21 23:18:45 compute-0 sudo[35467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:45 compute-0 python3.9[35469]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:18:46 compute-0 sudo[35467]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:47 compute-0 sudo[35748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbqhxtqbmzqhtbsmrjdcfagvxhoxxuka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037526.7299862-477-227493920520760/AnsiballZ_selinux.py'
Jan 21 23:18:47 compute-0 sudo[35748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:47 compute-0 python3.9[35750]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 21 23:18:47 compute-0 sudo[35748]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:48 compute-0 sudo[35900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdkbikdateptgjoyhiynabscfyzhgcgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037528.246824-510-168645137822266/AnsiballZ_command.py'
Jan 21 23:18:48 compute-0 sudo[35900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:48 compute-0 python3.9[35902]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 21 23:18:50 compute-0 sudo[35900]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:52 compute-0 sudo[36053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbbirnwewtntmhyimnsfjkuleiayfrpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037531.9806204-534-67395606365334/AnsiballZ_file.py'
Jan 21 23:18:52 compute-0 sudo[36053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:52 compute-0 python3.9[36055]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:18:52 compute-0 sudo[36053]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:53 compute-0 sudo[36205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nasadrmmyzukeeuxeuuxgizgrptoynhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037532.9536233-558-111483836613925/AnsiballZ_mount.py'
Jan 21 23:18:53 compute-0 sudo[36205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:53 compute-0 python3.9[36207]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 21 23:18:53 compute-0 sudo[36205]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:55 compute-0 sudo[36357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtcmwqkjnoodhwkfaschbjbiwysiurwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037535.4908762-642-116128572625337/AnsiballZ_file.py'
Jan 21 23:18:55 compute-0 sudo[36357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:18:58 compute-0 python3.9[36359]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:18:58 compute-0 sudo[36357]: pam_unix(sudo:session): session closed for user root
Jan 21 23:18:58 compute-0 sshd-session[36360]: Invalid user backup from 188.166.69.60 port 60114
Jan 21 23:18:58 compute-0 sshd-session[36360]: Connection closed by invalid user backup 188.166.69.60 port 60114 [preauth]
Jan 21 23:18:59 compute-0 sudo[36511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxgwfisduogmmrhtvcxuhkmjkhgfghdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037538.6960738-666-175766716721002/AnsiballZ_stat.py'
Jan 21 23:18:59 compute-0 sudo[36511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:00 compute-0 python3.9[36513]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:19:00 compute-0 sudo[36511]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:00 compute-0 sudo[36634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyumhhnxgkstrrgnodgqtufubwhevjvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037538.6960738-666-175766716721002/AnsiballZ_copy.py'
Jan 21 23:19:00 compute-0 sudo[36634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:01 compute-0 python3.9[36636]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037538.6960738-666-175766716721002/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:19:01 compute-0 sudo[36634]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:04 compute-0 sudo[36786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvzavpzqdrcweddhvienlyhgiadkjkul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037543.9346833-738-85555061213421/AnsiballZ_stat.py'
Jan 21 23:19:04 compute-0 sudo[36786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:04 compute-0 python3.9[36788]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:19:04 compute-0 sudo[36786]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:05 compute-0 sudo[36938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcyzneoyrmljkblykrogaxwvfgawnruf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037545.680471-762-265802668975468/AnsiballZ_command.py'
Jan 21 23:19:05 compute-0 sudo[36938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:06 compute-0 python3.9[36940]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:06 compute-0 sudo[36938]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:06 compute-0 sudo[37091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkvtiyafktvidjrcyrnmopkhmqgxoprb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037546.5147016-786-18344718624874/AnsiballZ_file.py'
Jan 21 23:19:06 compute-0 sudo[37091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:06 compute-0 python3.9[37093]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:19:06 compute-0 sudo[37091]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:08 compute-0 sudo[37243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxktumkbogaxgbqmyfytfatcvtclgpkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037547.6072826-819-241826409949176/AnsiballZ_getent.py'
Jan 21 23:19:08 compute-0 sudo[37243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:08 compute-0 python3.9[37245]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 21 23:19:08 compute-0 sudo[37243]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:08 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:19:09 compute-0 sudo[37397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oilpevyxojtxvpsubbwiqqxeiipuemmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037548.6231978-843-110170085629185/AnsiballZ_group.py'
Jan 21 23:19:09 compute-0 sudo[37397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:09 compute-0 python3.9[37399]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:19:09 compute-0 groupadd[37400]: group added to /etc/group: name=qemu, GID=107
Jan 21 23:19:09 compute-0 groupadd[37400]: group added to /etc/gshadow: name=qemu
Jan 21 23:19:09 compute-0 groupadd[37400]: new group: name=qemu, GID=107
Jan 21 23:19:09 compute-0 sudo[37397]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:10 compute-0 sudo[37555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvfdlnvavjqarrnsffnxulerffmvfrxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037550.1138942-867-138397181881434/AnsiballZ_user.py'
Jan 21 23:19:10 compute-0 sudo[37555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:10 compute-0 python3.9[37557]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 23:19:10 compute-0 useradd[37559]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 23:19:11 compute-0 sudo[37555]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:11 compute-0 sudo[37715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wprzhecatzvsxshewekkapqaeslmnmon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037551.6166275-891-90478783768122/AnsiballZ_getent.py'
Jan 21 23:19:11 compute-0 sudo[37715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:12 compute-0 python3.9[37717]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 21 23:19:12 compute-0 sudo[37715]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:12 compute-0 sudo[37868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfjowyhemfyxplgcnzumslhedbosketu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037552.429886-915-187159169690042/AnsiballZ_group.py'
Jan 21 23:19:12 compute-0 sudo[37868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:12 compute-0 python3.9[37870]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:19:12 compute-0 groupadd[37871]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 21 23:19:12 compute-0 groupadd[37871]: group added to /etc/gshadow: name=hugetlbfs
Jan 21 23:19:12 compute-0 groupadd[37871]: new group: name=hugetlbfs, GID=42477
Jan 21 23:19:12 compute-0 sudo[37868]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:13 compute-0 sudo[38026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-povaspjtquwjuiotwttaiewjsyhkpplg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037553.384196-942-132023729282959/AnsiballZ_file.py'
Jan 21 23:19:13 compute-0 sudo[38026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:13 compute-0 python3.9[38028]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 21 23:19:13 compute-0 sudo[38026]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:14 compute-0 sudo[38178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnwehevmoewvqnoisqirpzcbrwsnkvfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037554.53418-975-51882790713066/AnsiballZ_dnf.py'
Jan 21 23:19:14 compute-0 sudo[38178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:15 compute-0 python3.9[38180]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:19:16 compute-0 sudo[38178]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:17 compute-0 sudo[38331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quvgzjpzyhmiujmjuzjyvhqdybrmqdhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037557.6082993-999-136605808931159/AnsiballZ_file.py'
Jan 21 23:19:17 compute-0 sudo[38331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:18 compute-0 python3.9[38333]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:19:18 compute-0 sudo[38331]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:18 compute-0 sudo[38483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylzeydbhbclqvyldqocogwitbkxborum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037558.4751198-1023-171002303146906/AnsiballZ_stat.py'
Jan 21 23:19:18 compute-0 sudo[38483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:18 compute-0 python3.9[38485]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:19:18 compute-0 sudo[38483]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:19 compute-0 sudo[38606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkvzjpfqymmhhepunlnfbxagmabxuedw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037558.4751198-1023-171002303146906/AnsiballZ_copy.py'
Jan 21 23:19:19 compute-0 sudo[38606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:19 compute-0 python3.9[38608]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037558.4751198-1023-171002303146906/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:19:19 compute-0 sudo[38606]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:20 compute-0 sudo[38758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odyvjqqqraidhcdprwqpmalhyusacpim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037559.941334-1068-29117610202068/AnsiballZ_systemd.py'
Jan 21 23:19:20 compute-0 sudo[38758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:20 compute-0 python3.9[38760]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:19:22 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 21 23:19:22 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 21 23:19:22 compute-0 kernel: Bridge firewalling registered
Jan 21 23:19:22 compute-0 systemd-modules-load[38764]: Inserted module 'br_netfilter'
Jan 21 23:19:22 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 21 23:19:22 compute-0 sudo[38758]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:22 compute-0 sudo[38917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juvwvpxdmdvyfjdjwwyvuiineixarpat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037562.3272328-1092-183879948638974/AnsiballZ_stat.py'
Jan 21 23:19:22 compute-0 sudo[38917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:22 compute-0 python3.9[38919]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:19:22 compute-0 sudo[38917]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:23 compute-0 sudo[39040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njxbdovverkbclyjjmbfpvpddxukjxig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037562.3272328-1092-183879948638974/AnsiballZ_copy.py'
Jan 21 23:19:23 compute-0 sudo[39040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:23 compute-0 python3.9[39042]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037562.3272328-1092-183879948638974/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:19:23 compute-0 sudo[39040]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:24 compute-0 sudo[39192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmhkobvkppkzmrksyspxsethovhqahac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037564.0544868-1146-30167630612731/AnsiballZ_dnf.py'
Jan 21 23:19:24 compute-0 sudo[39192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:24 compute-0 python3.9[39194]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:19:28 compute-0 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Jan 21 23:19:28 compute-0 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Jan 21 23:19:28 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:19:28 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:19:28 compute-0 systemd[1]: Reloading.
Jan 21 23:19:28 compute-0 systemd-rc-local-generator[39259]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:19:29 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:19:29 compute-0 sudo[39192]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:30 compute-0 python3.9[40580]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:19:31 compute-0 python3.9[41452]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 21 23:19:31 compute-0 python3.9[42117]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:19:32 compute-0 sudo[43008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eodhpjvxkemmayibubmhaasyyukjtsso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037572.319819-1263-44562459961919/AnsiballZ_command.py'
Jan 21 23:19:32 compute-0 sudo[43008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:32 compute-0 python3.9[43022]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:32 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 21 23:19:33 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:19:33 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:19:33 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.569s CPU time.
Jan 21 23:19:33 compute-0 systemd[1]: run-r749c8a8353de4b4c952fb77ea0a0fcfd.service: Deactivated successfully.
Jan 21 23:19:33 compute-0 systemd[1]: Starting Authorization Manager...
Jan 21 23:19:33 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 21 23:19:33 compute-0 polkitd[43580]: Started polkitd version 0.117
Jan 21 23:19:33 compute-0 polkitd[43580]: Loading rules from directory /etc/polkit-1/rules.d
Jan 21 23:19:33 compute-0 polkitd[43580]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 21 23:19:33 compute-0 polkitd[43580]: Finished loading, compiling and executing 2 rules
Jan 21 23:19:33 compute-0 systemd[1]: Started Authorization Manager.
Jan 21 23:19:33 compute-0 polkitd[43580]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 21 23:19:33 compute-0 sudo[43008]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:34 compute-0 sudo[43748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udvooegqqumaebzkkyspnpfcjwmeguwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037573.8037248-1290-34842196577720/AnsiballZ_systemd.py'
Jan 21 23:19:34 compute-0 sudo[43748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:34 compute-0 python3.9[43750]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:19:34 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 21 23:19:34 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Jan 21 23:19:34 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 21 23:19:34 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 21 23:19:34 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 21 23:19:34 compute-0 sudo[43748]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:35 compute-0 python3.9[43911]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 21 23:19:39 compute-0 sudo[44061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aumsoqlghaiyascxzdqpdeilxgsehciu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037578.9459858-1461-240561049634419/AnsiballZ_systemd.py'
Jan 21 23:19:39 compute-0 sudo[44061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:39 compute-0 python3.9[44063]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:19:39 compute-0 systemd[1]: Reloading.
Jan 21 23:19:39 compute-0 systemd-rc-local-generator[44087]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:19:39 compute-0 sudo[44061]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:40 compute-0 sudo[44250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yudnlrcpvtolgwmravsmticuwekogaym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037579.9574559-1461-146876658068070/AnsiballZ_systemd.py'
Jan 21 23:19:40 compute-0 sudo[44250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:40 compute-0 python3.9[44252]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:19:40 compute-0 systemd[1]: Reloading.
Jan 21 23:19:40 compute-0 systemd-rc-local-generator[44285]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:19:40 compute-0 sudo[44250]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:41 compute-0 sudo[44440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nycgwvorwiwjbuaqwciawjdsgorrhlij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037581.3963969-1509-180867870239048/AnsiballZ_command.py'
Jan 21 23:19:41 compute-0 sudo[44440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:41 compute-0 python3.9[44442]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:41 compute-0 sudo[44440]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:42 compute-0 sudo[44593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzbcsxdgciyhokpvpesmympkbqtkgggq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037582.155524-1533-91747789565574/AnsiballZ_command.py'
Jan 21 23:19:42 compute-0 sudo[44593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:42 compute-0 python3.9[44595]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:42 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 21 23:19:42 compute-0 sudo[44593]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:43 compute-0 sudo[44748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpensxzunvsxcuqumjnqhpdnooqohlqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037582.9555118-1557-245474507314603/AnsiballZ_command.py'
Jan 21 23:19:43 compute-0 sudo[44748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:43 compute-0 sshd-session[44634]: Invalid user backup from 188.166.69.60 port 42032
Jan 21 23:19:43 compute-0 sshd-session[44634]: Connection closed by invalid user backup 188.166.69.60 port 42032 [preauth]
Jan 21 23:19:43 compute-0 python3.9[44750]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:44 compute-0 sudo[44748]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:45 compute-0 sudo[44910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnhyhnbutipvwabuqthlvjyiufhozqsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037585.1460712-1581-116388688154025/AnsiballZ_command.py'
Jan 21 23:19:45 compute-0 sudo[44910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:45 compute-0 python3.9[44912]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:45 compute-0 sudo[44910]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:46 compute-0 sudo[45063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfygrgyysiedyazzjtvihyyaewzavypp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037585.8865333-1605-169032333147684/AnsiballZ_systemd.py'
Jan 21 23:19:46 compute-0 sudo[45063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:46 compute-0 python3.9[45065]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:19:46 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 21 23:19:46 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Jan 21 23:19:46 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Jan 21 23:19:46 compute-0 systemd[1]: Starting Apply Kernel Variables...
Jan 21 23:19:46 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 21 23:19:46 compute-0 systemd[1]: Finished Apply Kernel Variables.
Jan 21 23:19:46 compute-0 sudo[45063]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:47 compute-0 sshd-session[31469]: Connection closed by 192.168.122.30 port 33398
Jan 21 23:19:47 compute-0 sshd-session[31466]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:19:47 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Jan 21 23:19:47 compute-0 systemd[1]: session-9.scope: Consumed 2min 6.635s CPU time.
Jan 21 23:19:47 compute-0 systemd-logind[784]: Session 9 logged out. Waiting for processes to exit.
Jan 21 23:19:47 compute-0 systemd-logind[784]: Removed session 9.
Jan 21 23:19:52 compute-0 sshd-session[45095]: Accepted publickey for zuul from 192.168.122.30 port 36734 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:19:52 compute-0 systemd-logind[784]: New session 10 of user zuul.
Jan 21 23:19:52 compute-0 systemd[1]: Started Session 10 of User zuul.
Jan 21 23:19:52 compute-0 sshd-session[45095]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:19:53 compute-0 python3.9[45248]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:19:55 compute-0 python3.9[45402]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:19:56 compute-0 sudo[45556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwquausfqlzfkipgrdpumhflghckdrgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037596.1798992-110-44295037222068/AnsiballZ_command.py'
Jan 21 23:19:56 compute-0 sudo[45556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:56 compute-0 python3.9[45558]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:19:56 compute-0 sudo[45556]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:57 compute-0 python3.9[45709]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:19:58 compute-0 sudo[45863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mosnhxtalbbzzvgfbxanwfxcfmmlrkrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037598.411157-170-43804887627320/AnsiballZ_setup.py'
Jan 21 23:19:58 compute-0 sudo[45863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:59 compute-0 python3.9[45865]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:19:59 compute-0 sudo[45863]: pam_unix(sudo:session): session closed for user root
Jan 21 23:19:59 compute-0 sudo[45947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjlzuybsbncihtzrfqrajwbpxafhfxqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037598.411157-170-43804887627320/AnsiballZ_dnf.py'
Jan 21 23:19:59 compute-0 sudo[45947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:19:59 compute-0 python3.9[45949]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:20:01 compute-0 sudo[45947]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:01 compute-0 sudo[46100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdfzwxcaswkqfoifbinxlpvlhdlzgnsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037601.6540976-206-155396430475207/AnsiballZ_setup.py'
Jan 21 23:20:01 compute-0 sudo[46100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:02 compute-0 python3.9[46102]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:20:02 compute-0 sudo[46100]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:03 compute-0 sudo[46271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsrkxrjrswdzevsnrsmcmbcrrlqgudbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037602.8042374-239-173510984699554/AnsiballZ_file.py'
Jan 21 23:20:03 compute-0 sudo[46271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:03 compute-0 python3.9[46273]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:20:03 compute-0 sudo[46271]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:03 compute-0 sudo[46423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axpsgufbigwztmzdhieksoyosesijwfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037603.7059567-263-103110794888238/AnsiballZ_command.py'
Jan 21 23:20:03 compute-0 sudo[46423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:04 compute-0 python3.9[46425]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:20:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1086432503-merged.mount: Deactivated successfully.
Jan 21 23:20:04 compute-0 podman[46426]: 2026-01-21 23:20:04.207264349 +0000 UTC m=+0.050516434 system refresh
Jan 21 23:20:04 compute-0 sudo[46423]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:04 compute-0 sudo[46586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqbzsloezwsjgymiumffdsrojfcgsgvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037604.482999-287-136132510979205/AnsiballZ_stat.py'
Jan 21 23:20:04 compute-0 sudo[46586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:05 compute-0 python3.9[46588]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:20:05 compute-0 sudo[46586]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:20:05 compute-0 sudo[46709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmapovommbmmbfaarfvbwvlueqzuckpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037604.482999-287-136132510979205/AnsiballZ_copy.py'
Jan 21 23:20:05 compute-0 sudo[46709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:05 compute-0 python3.9[46711]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037604.482999-287-136132510979205/.source.json follow=False _original_basename=podman_network_config.j2 checksum=66903f4c6c0d83e6e550bf4794dc618e2885e15c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:20:05 compute-0 sudo[46709]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:06 compute-0 sudo[46861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljcvurqwldukndrzvuomqlukhnxltmgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037606.0544028-332-44976429106446/AnsiballZ_stat.py'
Jan 21 23:20:06 compute-0 sudo[46861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:06 compute-0 python3.9[46863]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:20:06 compute-0 sudo[46861]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:06 compute-0 sudo[46984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdiwmuiuerpyxzvdxggazksraiaypzik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037606.0544028-332-44976429106446/AnsiballZ_copy.py'
Jan 21 23:20:06 compute-0 sudo[46984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:07 compute-0 python3.9[46986]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037606.0544028-332-44976429106446/.source.conf follow=False _original_basename=registries.conf.j2 checksum=51f7dfe021bf6a784cb4010cf142a3df219fb1a0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:20:07 compute-0 sudo[46984]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:07 compute-0 sudo[47136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzbupzshirglxhawxklcvejdrhkculfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037607.5546389-380-126555206157237/AnsiballZ_ini_file.py'
Jan 21 23:20:07 compute-0 sudo[47136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:08 compute-0 python3.9[47138]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:20:08 compute-0 sudo[47136]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:08 compute-0 sudo[47288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuwxricktmikdonxjhobxbahrwfydvgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037608.346966-380-196400942221519/AnsiballZ_ini_file.py'
Jan 21 23:20:08 compute-0 sudo[47288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:08 compute-0 python3.9[47290]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:20:08 compute-0 sudo[47288]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:09 compute-0 sudo[47440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjorfqhadvhxwlptbbaukmtmfeztbgni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037608.9931655-380-255888134157985/AnsiballZ_ini_file.py'
Jan 21 23:20:09 compute-0 sudo[47440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:09 compute-0 python3.9[47442]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:20:09 compute-0 sudo[47440]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:09 compute-0 sudo[47592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbqoqnregplwkjepsuybtbgtwklxewpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037609.567816-380-44236961416750/AnsiballZ_ini_file.py'
Jan 21 23:20:09 compute-0 sudo[47592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:10 compute-0 python3.9[47594]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:20:10 compute-0 sudo[47592]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:11 compute-0 python3.9[47744]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:20:11 compute-0 sudo[47896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifcizrutvnipdftwpueolcayekkedsyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037611.6894686-500-42864815080405/AnsiballZ_dnf.py'
Jan 21 23:20:11 compute-0 sudo[47896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:12 compute-0 python3.9[47898]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:13 compute-0 sudo[47896]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:13 compute-0 sudo[48049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqoekofhyqezddbqpwlbpzmyqsvstvke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037613.648329-524-240744656022724/AnsiballZ_dnf.py'
Jan 21 23:20:13 compute-0 sudo[48049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:14 compute-0 python3.9[48051]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:15 compute-0 sudo[48049]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:17 compute-0 sudo[48209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubjmwbdyoplaycoocjffunxjzpymojjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037616.8454194-554-122277045776907/AnsiballZ_dnf.py'
Jan 21 23:20:17 compute-0 sudo[48209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:17 compute-0 python3.9[48211]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:19 compute-0 sudo[48209]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:19 compute-0 sudo[48362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsppjewzngmrjokuiblinnfxvblepwxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037619.6655846-581-149290615811077/AnsiballZ_dnf.py'
Jan 21 23:20:19 compute-0 sudo[48362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:20 compute-0 python3.9[48364]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:21 compute-0 sudo[48362]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:22 compute-0 sudo[48515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysgakwudpcsfqxwtmjhnxmibmvvrlvvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037622.0081701-614-273390773485519/AnsiballZ_dnf.py'
Jan 21 23:20:22 compute-0 sudo[48515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:22 compute-0 python3.9[48517]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:23 compute-0 sudo[48515]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:24 compute-0 sudo[48671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mchvrzmxsdxtvflvmijotidqouaivmkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037624.2818563-638-51563613005651/AnsiballZ_dnf.py'
Jan 21 23:20:24 compute-0 sudo[48671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:24 compute-0 python3.9[48673]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:27 compute-0 sudo[48671]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:28 compute-0 sshd-session[48715]: Invalid user backup from 188.166.69.60 port 55778
Jan 21 23:20:28 compute-0 sshd-session[48715]: Connection closed by invalid user backup 188.166.69.60 port 55778 [preauth]
Jan 21 23:20:28 compute-0 sudo[48842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbmyvdmaaaqvjnmmogybzcuxjkjusehn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037628.2235472-665-65968262920967/AnsiballZ_dnf.py'
Jan 21 23:20:28 compute-0 sudo[48842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:28 compute-0 python3.9[48844]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:29 compute-0 sudo[48842]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:30 compute-0 sudo[48995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtiiztkntckkgpanzouvazpjcgbdzztw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037630.2790575-692-175929615967290/AnsiballZ_dnf.py'
Jan 21 23:20:30 compute-0 sudo[48995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:30 compute-0 python3.9[48997]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:42 compute-0 sudo[48995]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:50 compute-0 sudo[49330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmuljqslcspeljscgewltwdqtcblhpyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037650.3613107-719-40443725635287/AnsiballZ_dnf.py'
Jan 21 23:20:50 compute-0 sudo[49330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:50 compute-0 python3.9[49332]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:52 compute-0 sudo[49330]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:53 compute-0 sudo[49486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeqzrtfzkuhushlgsqdqbaufppluened ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037653.1395702-749-214911071868077/AnsiballZ_dnf.py'
Jan 21 23:20:53 compute-0 sudo[49486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:53 compute-0 python3.9[49488]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:20:55 compute-0 sudo[49486]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:56 compute-0 sudo[49643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umpdwzhaqpqgzksmykxtlwihmbgiphih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037656.0857358-782-275699663363610/AnsiballZ_file.py'
Jan 21 23:20:56 compute-0 sudo[49643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:56 compute-0 python3.9[49645]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:20:56 compute-0 sudo[49643]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:57 compute-0 sudo[49818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbsdgwaywvrzpecujrtdtddoaqdktyyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037656.8422005-806-118534456571717/AnsiballZ_stat.py'
Jan 21 23:20:57 compute-0 sudo[49818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:57 compute-0 python3.9[49820]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:20:57 compute-0 sudo[49818]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:57 compute-0 sudo[49941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpzlohaqnkkwfgyngxmobxkozeiesril ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037656.8422005-806-118534456571717/AnsiballZ_copy.py'
Jan 21 23:20:57 compute-0 sudo[49941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:57 compute-0 python3.9[49943]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769037656.8422005-806-118534456571717/.source.json _original_basename=.hqko5opv follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:20:57 compute-0 sudo[49941]: pam_unix(sudo:session): session closed for user root
Jan 21 23:20:58 compute-0 sudo[50093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thithctxaqujsfuxldsxuogifjjgisxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037658.383719-860-213239223816724/AnsiballZ_podman_image.py'
Jan 21 23:20:58 compute-0 sudo[50093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:20:59 compute-0 python3.9[50095]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 23:20:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2812883745-lower\x2dmapped.mount: Deactivated successfully.
Jan 21 23:21:08 compute-0 podman[50108]: 2026-01-21 23:21:08.305226053 +0000 UTC m=+9.131882863 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 23:21:08 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:08 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:08 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:08 compute-0 sudo[50093]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:09 compute-0 sudo[50403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owxucliszlxsvrjcypbvozqglmvhzbny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037669.0163152-899-78314440877613/AnsiballZ_podman_image.py'
Jan 21 23:21:09 compute-0 sudo[50403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:09 compute-0 python3.9[50405]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 23:21:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:12 compute-0 sshd-session[50462]: Invalid user backup from 188.166.69.60 port 42086
Jan 21 23:21:12 compute-0 sshd-session[50462]: Connection closed by invalid user backup 188.166.69.60 port 42086 [preauth]
Jan 21 23:21:27 compute-0 podman[50417]: 2026-01-21 23:21:27.368150678 +0000 UTC m=+17.827633308 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 23:21:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:27 compute-0 sudo[50403]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:28 compute-0 sudo[50696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbpfjlipvthwzssbprtkrddivclnwmfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037688.2113576-932-216614283465736/AnsiballZ_podman_image.py'
Jan 21 23:21:28 compute-0 sudo[50696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:28 compute-0 python3.9[50698]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 23:21:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:33 compute-0 podman[50709]: 2026-01-21 23:21:33.357216131 +0000 UTC m=+4.509028105 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 21 23:21:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:33 compute-0 sudo[50696]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:34 compute-0 sudo[50966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myxdvnqavanbfppzhxzdqzqfratdmani ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037693.7073271-932-150481027925696/AnsiballZ_podman_image.py'
Jan 21 23:21:34 compute-0 sudo[50966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:34 compute-0 python3.9[50968]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 21 23:21:35 compute-0 podman[50980]: 2026-01-21 23:21:35.304910115 +0000 UTC m=+1.062844918 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 21 23:21:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:21:35 compute-0 sudo[50966]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:36 compute-0 sshd-session[45098]: Connection closed by 192.168.122.30 port 36734
Jan 21 23:21:36 compute-0 sshd-session[45095]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:21:36 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Jan 21 23:21:36 compute-0 systemd[1]: session-10.scope: Consumed 1min 47.189s CPU time.
Jan 21 23:21:36 compute-0 systemd-logind[784]: Session 10 logged out. Waiting for processes to exit.
Jan 21 23:21:36 compute-0 systemd-logind[784]: Removed session 10.
Jan 21 23:21:46 compute-0 sshd-session[51130]: Accepted publickey for zuul from 192.168.122.30 port 37692 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:21:46 compute-0 systemd-logind[784]: New session 11 of user zuul.
Jan 21 23:21:46 compute-0 systemd[1]: Started Session 11 of User zuul.
Jan 21 23:21:46 compute-0 sshd-session[51130]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:21:48 compute-0 python3.9[51283]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:21:49 compute-0 sudo[51437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txkpmofivubdnnbaaayvkhdxsyopddbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037708.7915528-68-91185810658028/AnsiballZ_getent.py'
Jan 21 23:21:49 compute-0 sudo[51437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:49 compute-0 python3.9[51439]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 21 23:21:49 compute-0 sudo[51437]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:50 compute-0 sudo[51590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trcqugwuyzxkksarftuwostcwcmictyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037709.8016334-92-248987874312729/AnsiballZ_group.py'
Jan 21 23:21:50 compute-0 sudo[51590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:50 compute-0 python3.9[51592]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:21:50 compute-0 groupadd[51593]: group added to /etc/group: name=openvswitch, GID=42476
Jan 21 23:21:50 compute-0 groupadd[51593]: group added to /etc/gshadow: name=openvswitch
Jan 21 23:21:50 compute-0 groupadd[51593]: new group: name=openvswitch, GID=42476
Jan 21 23:21:50 compute-0 sudo[51590]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:51 compute-0 sudo[51748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsvtvdbxjultmjxxxzsdlhlvspdaphoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037710.769467-116-149276453671178/AnsiballZ_user.py'
Jan 21 23:21:51 compute-0 sudo[51748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:52 compute-0 python3.9[51750]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 23:21:52 compute-0 useradd[51752]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 23:21:52 compute-0 useradd[51752]: add 'openvswitch' to group 'hugetlbfs'
Jan 21 23:21:52 compute-0 useradd[51752]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 21 23:21:52 compute-0 sudo[51748]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:52 compute-0 sudo[51908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcztxgfxtmhptruyprmnqrthjsvhoskb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037712.6420314-146-107049726017928/AnsiballZ_setup.py'
Jan 21 23:21:52 compute-0 sudo[51908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:53 compute-0 python3.9[51910]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:21:53 compute-0 sudo[51908]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:53 compute-0 sudo[51992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahtxpqnsiwjuuetzhqwacexxoxecatzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037712.6420314-146-107049726017928/AnsiballZ_dnf.py'
Jan 21 23:21:53 compute-0 sudo[51992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:54 compute-0 python3.9[51994]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:21:55 compute-0 sudo[51992]: pam_unix(sudo:session): session closed for user root
Jan 21 23:21:57 compute-0 sshd-session[52029]: Invalid user backup from 188.166.69.60 port 51214
Jan 21 23:21:57 compute-0 sshd-session[52029]: Connection closed by invalid user backup 188.166.69.60 port 51214 [preauth]
Jan 21 23:21:57 compute-0 sudo[52156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvwpmixxorjfpndmqboqwarfrfqbeexs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037717.3963082-188-147605050566053/AnsiballZ_dnf.py'
Jan 21 23:21:57 compute-0 sudo[52156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:21:58 compute-0 python3.9[52158]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:22:11 compute-0 kernel: SELinux:  Converting 2738 SID table entries...
Jan 21 23:22:11 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:22:11 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:22:11 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:22:11 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:22:11 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:22:11 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:22:11 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:22:11 compute-0 groupadd[52181]: group added to /etc/group: name=unbound, GID=994
Jan 21 23:22:11 compute-0 groupadd[52181]: group added to /etc/gshadow: name=unbound
Jan 21 23:22:11 compute-0 groupadd[52181]: new group: name=unbound, GID=994
Jan 21 23:22:11 compute-0 useradd[52188]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 21 23:22:11 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 21 23:22:11 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 21 23:22:12 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:22:12 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:22:12 compute-0 systemd[1]: Reloading.
Jan 21 23:22:12 compute-0 systemd-rc-local-generator[52686]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:22:12 compute-0 systemd-sysv-generator[52690]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:22:13 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:22:13 compute-0 sudo[52156]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:13 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:22:13 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:22:13 compute-0 systemd[1]: run-r0ee1c2ca6d6149e48019b0906978b1d0.service: Deactivated successfully.
Jan 21 23:22:15 compute-0 sudo[53257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnaiqbcppmgzasmfttkqcpwrfzhimvwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037734.7648742-212-265901193818050/AnsiballZ_systemd.py'
Jan 21 23:22:15 compute-0 sudo[53257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:15 compute-0 python3.9[53259]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:22:15 compute-0 systemd[1]: Reloading.
Jan 21 23:22:15 compute-0 systemd-rc-local-generator[53289]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:22:15 compute-0 systemd-sysv-generator[53292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:22:16 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Jan 21 23:22:16 compute-0 chown[53302]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 21 23:22:16 compute-0 ovs-ctl[53307]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 21 23:22:16 compute-0 ovs-ctl[53307]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 21 23:22:16 compute-0 ovs-ctl[53307]: Starting ovsdb-server [  OK  ]
Jan 21 23:22:16 compute-0 ovs-vsctl[53356]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 21 23:22:16 compute-0 ovs-vsctl[53376]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"7f404a2f-20ba-4b9b-88d6-fa3588630efa\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 21 23:22:16 compute-0 ovs-ctl[53307]: Configuring Open vSwitch system IDs [  OK  ]
Jan 21 23:22:16 compute-0 ovs-ctl[53307]: Enabling remote OVSDB managers [  OK  ]
Jan 21 23:22:16 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Jan 21 23:22:16 compute-0 ovs-vsctl[53382]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 21 23:22:16 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 21 23:22:16 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 21 23:22:16 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 21 23:22:16 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Jan 21 23:22:16 compute-0 ovs-ctl[53427]: Inserting openvswitch module [  OK  ]
Jan 21 23:22:16 compute-0 ovs-ctl[53396]: Starting ovs-vswitchd [  OK  ]
Jan 21 23:22:16 compute-0 ovs-vsctl[53445]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 21 23:22:16 compute-0 ovs-ctl[53396]: Enabling remote OVSDB managers [  OK  ]
Jan 21 23:22:16 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 21 23:22:16 compute-0 systemd[1]: Starting Open vSwitch...
Jan 21 23:22:16 compute-0 systemd[1]: Finished Open vSwitch.
Jan 21 23:22:16 compute-0 sudo[53257]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:17 compute-0 python3.9[53597]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:22:18 compute-0 sudo[53747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwmcaassfuxdckyebaqrhkmlcdvnxezq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037738.2214372-266-13129306328270/AnsiballZ_sefcontext.py'
Jan 21 23:22:18 compute-0 sudo[53747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:18 compute-0 python3.9[53749]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 21 23:22:20 compute-0 kernel: SELinux:  Converting 2752 SID table entries...
Jan 21 23:22:20 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:22:20 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:22:20 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:22:20 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:22:20 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:22:20 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:22:20 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:22:20 compute-0 sudo[53747]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:21 compute-0 python3.9[53905]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:22:22 compute-0 sudo[54062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpwlpucnzldahktjczhezifgtahrimmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037742.1234167-320-169515086884023/AnsiballZ_dnf.py'
Jan 21 23:22:22 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 21 23:22:22 compute-0 sudo[54062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:22 compute-0 python3.9[54064]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:22:24 compute-0 sudo[54062]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:24 compute-0 sudo[54215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxajokwjcbtvinjhlbdiqwuxktzcryau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037744.4559796-344-49472726649398/AnsiballZ_command.py'
Jan 21 23:22:24 compute-0 sudo[54215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:25 compute-0 python3.9[54217]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:22:26 compute-0 sudo[54215]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:26 compute-0 sshd-session[52205]: Connection closed by 167.94.138.179 port 16936 [preauth]
Jan 21 23:22:26 compute-0 sudo[54502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkeemwzoltqaxzrgarcuwzqvykecpoww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037746.1943922-368-117094263888518/AnsiballZ_file.py'
Jan 21 23:22:26 compute-0 sudo[54502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:26 compute-0 python3.9[54504]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 21 23:22:26 compute-0 sudo[54502]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:28 compute-0 python3.9[54654]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:22:28 compute-0 sudo[54806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdivgtgigomxwgmfavyhlpyiwfzbikps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037748.353311-416-219872863126574/AnsiballZ_dnf.py'
Jan 21 23:22:28 compute-0 sudo[54806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:28 compute-0 python3.9[54808]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:22:30 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:22:30 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:22:30 compute-0 systemd[1]: Reloading.
Jan 21 23:22:30 compute-0 systemd-sysv-generator[54852]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:22:30 compute-0 systemd-rc-local-generator[54848]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:22:30 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:22:31 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:22:31 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:22:31 compute-0 systemd[1]: run-r7a0db5cb4ee440ed8a72a91411109b6c.service: Deactivated successfully.
Jan 21 23:22:31 compute-0 sudo[54806]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:32 compute-0 sudo[55125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tonigrpgjusfedomqcaxeavulvyoeeis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037751.698863-440-20602844054801/AnsiballZ_systemd.py'
Jan 21 23:22:32 compute-0 sudo[55125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:32 compute-0 python3.9[55127]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:22:32 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 21 23:22:32 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Jan 21 23:22:32 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Jan 21 23:22:32 compute-0 systemd[1]: Stopping Network Manager...
Jan 21 23:22:32 compute-0 NetworkManager[7199]: <info>  [1769037752.3871] caught SIGTERM, shutting down normally.
Jan 21 23:22:32 compute-0 NetworkManager[7199]: <info>  [1769037752.3893] dhcp4 (eth0): canceled DHCP transaction
Jan 21 23:22:32 compute-0 NetworkManager[7199]: <info>  [1769037752.3893] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 23:22:32 compute-0 NetworkManager[7199]: <info>  [1769037752.3894] dhcp4 (eth0): state changed no lease
Jan 21 23:22:32 compute-0 NetworkManager[7199]: <info>  [1769037752.3897] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 23:22:32 compute-0 NetworkManager[7199]: <info>  [1769037752.3970] exiting (success)
Jan 21 23:22:32 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 23:22:32 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 23:22:32 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 21 23:22:32 compute-0 systemd[1]: Stopped Network Manager.
Jan 21 23:22:32 compute-0 systemd[1]: NetworkManager.service: Consumed 12.546s CPU time, 4.4M memory peak, read 0B from disk, written 14.0K to disk.
Jan 21 23:22:32 compute-0 systemd[1]: Starting Network Manager...
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.4669] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:a457267e-aaec-4d55-ae32-e78b7b5bf63f)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.4670] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.4734] manager[0x5598a0818000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 21 23:22:32 compute-0 systemd[1]: Starting Hostname Service...
Jan 21 23:22:32 compute-0 systemd[1]: Started Hostname Service.
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5927] hostname: hostname: using hostnamed
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5928] hostname: static hostname changed from (none) to "compute-0"
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5935] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5940] manager[0x5598a0818000]: rfkill: Wi-Fi hardware radio set enabled
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5940] manager[0x5598a0818000]: rfkill: WWAN hardware radio set enabled
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5962] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5970] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5971] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5972] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5972] manager: Networking is enabled by state file
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5974] settings: Loaded settings plugin: keyfile (internal)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.5977] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6005] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6013] dhcp: init: Using DHCP client 'internal'
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6015] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6021] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6027] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6035] device (lo): Activation: starting connection 'lo' (2e99d35a-31da-47ef-9f44-21c4df97b7a3)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6041] device (eth0): carrier: link connected
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6044] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6050] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6050] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6057] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6063] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6068] device (eth1): carrier: link connected
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6073] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6078] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (ca895a5d-b6dc-5a65-bf50-75a15530a096) (indicated)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6079] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6083] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6088] device (eth1): Activation: starting connection 'ci-private-network' (ca895a5d-b6dc-5a65-bf50-75a15530a096)
Jan 21 23:22:32 compute-0 systemd[1]: Started Network Manager.
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6093] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6115] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6119] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6138] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6140] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6143] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6144] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6145] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6148] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6152] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6154] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6161] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6175] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6185] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6186] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6191] device (lo): Activation: successful, device activated.
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6201] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Jan 21 23:22:32 compute-0 systemd[1]: Starting Network Manager Wait Online...
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6219] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6289] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6297] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6305] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6310] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6313] device (eth1): Activation: successful, device activated.
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6342] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6344] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6355] manager: NetworkManager state is now CONNECTED_SITE
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6362] device (eth0): Activation: successful, device activated.
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6367] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 21 23:22:32 compute-0 NetworkManager[55139]: <info>  [1769037752.6371] manager: startup complete
Jan 21 23:22:32 compute-0 sudo[55125]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:32 compute-0 systemd[1]: Finished Network Manager Wait Online.
Jan 21 23:22:33 compute-0 sudo[55351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ertjnizfiujduamohuevhiuvllqfvreh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037752.8384652-464-505478177485/AnsiballZ_dnf.py'
Jan 21 23:22:33 compute-0 sudo[55351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:33 compute-0 python3.9[55353]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:22:38 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:22:38 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:22:38 compute-0 systemd[1]: Reloading.
Jan 21 23:22:38 compute-0 systemd-rc-local-generator[55410]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:22:38 compute-0 systemd-sysv-generator[55414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:22:38 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:22:39 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:22:39 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:22:39 compute-0 systemd[1]: run-r14ce2bf3d9c14bb78a9489effd93bdc9.service: Deactivated successfully.
Jan 21 23:22:39 compute-0 sudo[55351]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:40 compute-0 sshd-session[55688]: Invalid user backup from 188.166.69.60 port 40382
Jan 21 23:22:40 compute-0 sshd-session[55688]: Connection closed by invalid user backup 188.166.69.60 port 40382 [preauth]
Jan 21 23:22:40 compute-0 sudo[55815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfevbwpiqqvwvaknbkirweeefzfqluug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037760.3481607-500-102245933056098/AnsiballZ_stat.py'
Jan 21 23:22:40 compute-0 sudo[55815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:40 compute-0 python3.9[55817]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:22:40 compute-0 sudo[55815]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:41 compute-0 sudo[55967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdgknupbvwcnhrphgutkelpfjddtebju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037761.1518242-527-227399054678725/AnsiballZ_ini_file.py'
Jan 21 23:22:41 compute-0 sudo[55967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:41 compute-0 python3.9[55969]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:41 compute-0 sudo[55967]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:42 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 23:22:42 compute-0 sudo[56121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yixtzvttmdpkwlaelprsvjskrvigzpat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037762.6160588-557-223631917866220/AnsiballZ_ini_file.py'
Jan 21 23:22:42 compute-0 sudo[56121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:43 compute-0 python3.9[56123]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:43 compute-0 sudo[56121]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:43 compute-0 sudo[56273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nobpbeoxvayhieehvpuusyzmcksyecac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037763.3533359-557-270663210293216/AnsiballZ_ini_file.py'
Jan 21 23:22:43 compute-0 sudo[56273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:43 compute-0 python3.9[56275]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:43 compute-0 sudo[56273]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:44 compute-0 sudo[56425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnwekprgmrjtjttwcenazpsfebvtlpvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037764.0733316-602-224860414802813/AnsiballZ_ini_file.py'
Jan 21 23:22:44 compute-0 sudo[56425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:44 compute-0 python3.9[56427]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:44 compute-0 sudo[56425]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:45 compute-0 sudo[56577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wknvsfysmzqdlgttmeanbazfdwzvntnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037765.105339-602-36749354279030/AnsiballZ_ini_file.py'
Jan 21 23:22:45 compute-0 sudo[56577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:45 compute-0 python3.9[56579]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:45 compute-0 sudo[56577]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:46 compute-0 sudo[56729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dunomwqqoldlxtqkczpvmkarykaixnwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037765.9161148-647-203849217349658/AnsiballZ_stat.py'
Jan 21 23:22:46 compute-0 sudo[56729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:46 compute-0 python3.9[56731]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:22:46 compute-0 sudo[56729]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:46 compute-0 sudo[56852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxdnsankdojifarmcqrlvnlktqbfwjkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037765.9161148-647-203849217349658/AnsiballZ_copy.py'
Jan 21 23:22:46 compute-0 sudo[56852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:47 compute-0 python3.9[56854]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037765.9161148-647-203849217349658/.source _original_basename=.pw2kj_2r follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:47 compute-0 sudo[56852]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:47 compute-0 sudo[57004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzxtzvilunegkacfhrsfniskrxakfegd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037767.4181335-692-198432325120157/AnsiballZ_file.py'
Jan 21 23:22:47 compute-0 sudo[57004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:47 compute-0 python3.9[57006]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:48 compute-0 sudo[57004]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:48 compute-0 sudo[57156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlzmhjtvhcwullhjjfthvrlxvmxmanop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037768.217297-716-110033595195514/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 21 23:22:48 compute-0 sudo[57156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:48 compute-0 python3.9[57158]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 21 23:22:48 compute-0 sudo[57156]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:49 compute-0 sudo[57308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juelzhavvjvlgkjpovmnkrpxgzimicna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037769.2732098-743-147767135520526/AnsiballZ_file.py'
Jan 21 23:22:49 compute-0 sudo[57308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:49 compute-0 python3.9[57310]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:22:49 compute-0 sudo[57308]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:50 compute-0 sudo[57460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljkdyftfrhwfjeiiaxrsdcqtnmiftzbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037770.6339-773-23972513156982/AnsiballZ_stat.py'
Jan 21 23:22:50 compute-0 sudo[57460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:51 compute-0 sudo[57460]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:51 compute-0 sudo[57583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxzkpdmzdredssrpjjddzncnzjhvkqdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037770.6339-773-23972513156982/AnsiballZ_copy.py'
Jan 21 23:22:51 compute-0 sudo[57583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:51 compute-0 sudo[57583]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:52 compute-0 sudo[57735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgaltihzjhwngipltcnbosnjhqanarpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037771.980585-818-58412934993534/AnsiballZ_slurp.py'
Jan 21 23:22:52 compute-0 sudo[57735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:52 compute-0 python3.9[57737]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 21 23:22:52 compute-0 sudo[57735]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:54 compute-0 sudo[57910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvmzhirizlgusxvzxhjxlvgthkjnrirn ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037772.9878826-845-224365377817278/async_wrapper.py j368111184240 300 /home/zuul/.ansible/tmp/ansible-tmp-1769037772.9878826-845-224365377817278/AnsiballZ_edpm_os_net_config.py _'
Jan 21 23:22:54 compute-0 sudo[57910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:54 compute-0 ansible-async_wrapper.py[57912]: Invoked with j368111184240 300 /home/zuul/.ansible/tmp/ansible-tmp-1769037772.9878826-845-224365377817278/AnsiballZ_edpm_os_net_config.py _
Jan 21 23:22:54 compute-0 ansible-async_wrapper.py[57915]: Starting module and watcher
Jan 21 23:22:54 compute-0 ansible-async_wrapper.py[57915]: Start watching 57916 (300)
Jan 21 23:22:54 compute-0 ansible-async_wrapper.py[57916]: Start module (57916)
Jan 21 23:22:54 compute-0 ansible-async_wrapper.py[57912]: Return async_wrapper task started.
Jan 21 23:22:54 compute-0 sudo[57910]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:55 compute-0 python3.9[57917]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 21 23:22:55 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 21 23:22:55 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 21 23:22:55 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 21 23:22:55 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 21 23:22:55 compute-0 kernel: cfg80211: failed to load regulatory.db
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2042] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2060] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2721] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2724] audit: op="connection-add" uuid="b645e16a-1fcb-4910-a388-a74a20879489" name="br-ex-br" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2742] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2743] audit: op="connection-add" uuid="f39ae055-38ae-4f5d-908a-8b9c15faf245" name="br-ex-port" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2760] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2762] audit: op="connection-add" uuid="da7a930c-af42-4de2-8c2f-e5e766b5adef" name="eth1-port" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2778] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2780] audit: op="connection-add" uuid="b58ef4a5-efc2-4c6c-8d30-c4cabee701a1" name="vlan20-port" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2794] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2796] audit: op="connection-add" uuid="57170860-8bf5-4a44-8da0-4ba360266c08" name="vlan21-port" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2811] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2813] audit: op="connection-add" uuid="ff7935c3-12a9-4c8a-a237-9222c1ac941e" name="vlan22-port" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2837] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2855] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2857] audit: op="connection-add" uuid="11f5d308-0a4f-4167-9594-70bcdfe7244e" name="br-ex-if" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2914] audit: op="connection-update" uuid="ca895a5d-b6dc-5a65-bf50-75a15530a096" name="ci-private-network" args="ovs-external-ids.data,ipv4.addresses,ipv4.routes,ipv4.never-default,ipv4.dns,ipv4.routing-rules,ipv4.method,connection.controller,connection.slave-type,connection.timestamp,connection.master,connection.port-type,ipv6.addresses,ipv6.routes,ipv6.routing-rules,ipv6.dns,ipv6.method,ipv6.addr-gen-mode,ovs-interface.type" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2934] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2937] audit: op="connection-add" uuid="3cddda8b-fe09-4f41-afa2-24a45c4d8898" name="vlan20-if" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2955] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2957] audit: op="connection-add" uuid="d10848e4-95de-421c-be15-3b2cf3764136" name="vlan21-if" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2974] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2976] audit: op="connection-add" uuid="c0c9e445-3302-4f65-a409-07711bea47b2" name="vlan22-if" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.2991] audit: op="connection-delete" uuid="ec059f52-b25e-31f6-9b9f-6e854f0ee9a8" name="Wired connection 1" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3005] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <warn>  [1769037777.3009] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3017] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3023] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (b645e16a-1fcb-4910-a388-a74a20879489)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3025] audit: op="connection-activate" uuid="b645e16a-1fcb-4910-a388-a74a20879489" name="br-ex-br" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3027] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <warn>  [1769037777.3030] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3037] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3043] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (f39ae055-38ae-4f5d-908a-8b9c15faf245)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3046] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <warn>  [1769037777.3049] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3055] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3061] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (da7a930c-af42-4de2-8c2f-e5e766b5adef)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3064] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <warn>  [1769037777.3066] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3073] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3079] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (b58ef4a5-efc2-4c6c-8d30-c4cabee701a1)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3083] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <warn>  [1769037777.3085] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3092] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3099] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (57170860-8bf5-4a44-8da0-4ba360266c08)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3102] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <warn>  [1769037777.3104] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3113] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3120] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (ff7935c3-12a9-4c8a-a237-9222c1ac941e)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3123] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3127] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3131] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3141] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <warn>  [1769037777.3143] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3151] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3160] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (11f5d308-0a4f-4167-9594-70bcdfe7244e)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3161] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3168] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3172] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3174] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3177] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3196] device (eth1): disconnecting for new activation request.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3197] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3201] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3203] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3204] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3207] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <warn>  [1769037777.3208] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3211] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3217] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (3cddda8b-fe09-4f41-afa2-24a45c4d8898)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3218] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3221] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3223] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3225] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3228] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <warn>  [1769037777.3229] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3233] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3237] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (d10848e4-95de-421c-be15-3b2cf3764136)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3238] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3242] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3244] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3245] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3249] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <warn>  [1769037777.3250] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3253] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3259] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (c0c9e445-3302-4f65-a409-07711bea47b2)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3259] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3263] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3265] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3266] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3268] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3283] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3286] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3289] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3292] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3298] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3302] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3307] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3310] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3312] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3317] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3321] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3325] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3326] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3330] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3336] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 kernel: ovs-system: entered promiscuous mode
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3340] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3342] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3347] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3353] dhcp4 (eth0): canceled DHCP transaction
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3353] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3354] dhcp4 (eth0): state changed no lease
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3356] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 21 23:22:57 compute-0 systemd-udevd[57921]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:22:57 compute-0 kernel: Timeout policy base is empty
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3368] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3371] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57918 uid=0 result="fail" reason="Device is not activated"
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3403] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3410] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3415] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3418] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3422] device (eth1): disconnecting for new activation request.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3423] audit: op="connection-activate" uuid="ca895a5d-b6dc-5a65-bf50-75a15530a096" name="ci-private-network" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3469] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57918 uid=0 result="success"
Jan 21 23:22:57 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3572] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3674] device (eth1): Activation: starting connection 'ci-private-network' (ca895a5d-b6dc-5a65-bf50-75a15530a096)
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3699] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3703] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3709] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3710] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3711] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3713] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3714] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3715] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3722] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3730] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3733] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3738] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3742] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3746] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3749] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3753] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3756] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3760] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3764] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3769] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3773] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3779] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3785] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3826] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3827] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.3831] device (eth1): Activation: successful, device activated.
Jan 21 23:22:57 compute-0 kernel: br-ex: entered promiscuous mode
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4232] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4250] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4274] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4276] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4282] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 23:22:57 compute-0 kernel: vlan22: entered promiscuous mode
Jan 21 23:22:57 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 21 23:22:57 compute-0 kernel: vlan20: entered promiscuous mode
Jan 21 23:22:57 compute-0 systemd-udevd[57923]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4430] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4450] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 kernel: vlan21: entered promiscuous mode
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4485] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4486] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4492] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4554] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4568] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4591] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4604] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4613] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4616] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4623] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4632] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4633] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 21 23:22:57 compute-0 NetworkManager[55139]: <info>  [1769037777.4637] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 21 23:22:58 compute-0 sudo[58248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsbluvqxyrqgpgxcqiheseegihwlteiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037777.989624-845-81865447147317/AnsiballZ_async_status.py'
Jan 21 23:22:58 compute-0 sudo[58248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:22:58 compute-0 NetworkManager[55139]: <info>  [1769037778.5384] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57918 uid=0 result="success"
Jan 21 23:22:58 compute-0 NetworkManager[55139]: <info>  [1769037778.6802] checkpoint[0x5598a07ee950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 21 23:22:58 compute-0 NetworkManager[55139]: <info>  [1769037778.6804] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57918 uid=0 result="success"
Jan 21 23:22:58 compute-0 python3.9[58250]: ansible-ansible.legacy.async_status Invoked with jid=j368111184240.57912 mode=status _async_dir=/root/.ansible_async
Jan 21 23:22:58 compute-0 sudo[58248]: pam_unix(sudo:session): session closed for user root
Jan 21 23:22:58 compute-0 NetworkManager[55139]: <info>  [1769037778.9792] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57918 uid=0 result="success"
Jan 21 23:22:58 compute-0 NetworkManager[55139]: <info>  [1769037778.9805] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57918 uid=0 result="success"
Jan 21 23:22:59 compute-0 NetworkManager[55139]: <info>  [1769037779.1779] audit: op="networking-control" arg="global-dns-configuration" pid=57918 uid=0 result="success"
Jan 21 23:22:59 compute-0 NetworkManager[55139]: <info>  [1769037779.1810] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 21 23:22:59 compute-0 NetworkManager[55139]: <info>  [1769037779.1836] audit: op="networking-control" arg="global-dns-configuration" pid=57918 uid=0 result="success"
Jan 21 23:22:59 compute-0 NetworkManager[55139]: <info>  [1769037779.1857] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57918 uid=0 result="success"
Jan 21 23:22:59 compute-0 NetworkManager[55139]: <info>  [1769037779.3269] checkpoint[0x5598a07eea20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 21 23:22:59 compute-0 NetworkManager[55139]: <info>  [1769037779.3274] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57918 uid=0 result="success"
Jan 21 23:22:59 compute-0 ansible-async_wrapper.py[57916]: Module complete (57916)
Jan 21 23:22:59 compute-0 ansible-async_wrapper.py[57915]: Done in kid B.
Jan 21 23:23:02 compute-0 sudo[58354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnxkfkiwymtxhgiuzjtdnnpihyrjxezl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037777.989624-845-81865447147317/AnsiballZ_async_status.py'
Jan 21 23:23:02 compute-0 sudo[58354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:02 compute-0 python3.9[58356]: ansible-ansible.legacy.async_status Invoked with jid=j368111184240.57912 mode=status _async_dir=/root/.ansible_async
Jan 21 23:23:02 compute-0 sudo[58354]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:02 compute-0 sudo[58454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjzfpozygwhhwhlqfdhnnsotzdevqmyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037777.989624-845-81865447147317/AnsiballZ_async_status.py'
Jan 21 23:23:02 compute-0 sudo[58454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:02 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 21 23:23:02 compute-0 python3.9[58456]: ansible-ansible.legacy.async_status Invoked with jid=j368111184240.57912 mode=cleanup _async_dir=/root/.ansible_async
Jan 21 23:23:02 compute-0 sudo[58454]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:03 compute-0 sudo[58609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbdwfxwkxjmccwbyqjprzqayzxadqkzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037783.0342956-926-237939707926517/AnsiballZ_stat.py'
Jan 21 23:23:03 compute-0 sudo[58609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:03 compute-0 python3.9[58611]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:03 compute-0 sudo[58609]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:03 compute-0 sudo[58732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhgxdkyktkdglousejrgiksmzslmidwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037783.0342956-926-237939707926517/AnsiballZ_copy.py'
Jan 21 23:23:03 compute-0 sudo[58732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:03 compute-0 python3.9[58734]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037783.0342956-926-237939707926517/.source.returncode _original_basename=.hloi4pbn follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:04 compute-0 sudo[58732]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:04 compute-0 sudo[58884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmbupnffawwrhtwddzorvnbrfynunwbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037784.4044423-974-122197253750385/AnsiballZ_stat.py'
Jan 21 23:23:04 compute-0 sudo[58884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:04 compute-0 python3.9[58886]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:04 compute-0 sudo[58884]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:05 compute-0 sudo[59007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peixltjaonigjhjtkgmzmrvchjyyclgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037784.4044423-974-122197253750385/AnsiballZ_copy.py'
Jan 21 23:23:05 compute-0 sudo[59007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:05 compute-0 python3.9[59009]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037784.4044423-974-122197253750385/.source.cfg _original_basename=.g4gaf8ti follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:05 compute-0 sudo[59007]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:06 compute-0 sudo[59160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nabtzxgxbrurvnmyxpeqqvisktnglsvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037785.715241-1019-230740689736333/AnsiballZ_systemd.py'
Jan 21 23:23:06 compute-0 sudo[59160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:06 compute-0 python3.9[59162]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:23:06 compute-0 systemd[1]: Reloading Network Manager...
Jan 21 23:23:06 compute-0 NetworkManager[55139]: <info>  [1769037786.4823] audit: op="reload" arg="0" pid=59166 uid=0 result="success"
Jan 21 23:23:06 compute-0 NetworkManager[55139]: <info>  [1769037786.4831] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 21 23:23:06 compute-0 systemd[1]: Reloaded Network Manager.
Jan 21 23:23:06 compute-0 sudo[59160]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:06 compute-0 sshd-session[51133]: Connection closed by 192.168.122.30 port 37692
Jan 21 23:23:06 compute-0 sshd-session[51130]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:23:06 compute-0 systemd-logind[784]: Session 11 logged out. Waiting for processes to exit.
Jan 21 23:23:06 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Jan 21 23:23:06 compute-0 systemd[1]: session-11.scope: Consumed 53.718s CPU time.
Jan 21 23:23:06 compute-0 systemd-logind[784]: Removed session 11.
Jan 21 23:23:13 compute-0 sshd-session[59197]: Accepted publickey for zuul from 192.168.122.30 port 49972 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:23:13 compute-0 systemd-logind[784]: New session 12 of user zuul.
Jan 21 23:23:13 compute-0 systemd[1]: Started Session 12 of User zuul.
Jan 21 23:23:13 compute-0 sshd-session[59197]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:23:14 compute-0 python3.9[59350]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:23:15 compute-0 python3.9[59505]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:23:16 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 21 23:23:17 compute-0 python3.9[59695]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:23:17 compute-0 sshd-session[59200]: Connection closed by 192.168.122.30 port 49972
Jan 21 23:23:17 compute-0 sshd-session[59197]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:23:17 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Jan 21 23:23:17 compute-0 systemd[1]: session-12.scope: Consumed 2.637s CPU time.
Jan 21 23:23:17 compute-0 systemd-logind[784]: Session 12 logged out. Waiting for processes to exit.
Jan 21 23:23:17 compute-0 systemd-logind[784]: Removed session 12.
Jan 21 23:23:22 compute-0 sshd-session[59724]: Invalid user backup from 188.166.69.60 port 51946
Jan 21 23:23:22 compute-0 sshd-session[59724]: Connection closed by invalid user backup 188.166.69.60 port 51946 [preauth]
Jan 21 23:23:23 compute-0 sshd-session[59726]: Accepted publickey for zuul from 192.168.122.30 port 38536 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:23:23 compute-0 systemd-logind[784]: New session 13 of user zuul.
Jan 21 23:23:23 compute-0 systemd[1]: Started Session 13 of User zuul.
Jan 21 23:23:23 compute-0 sshd-session[59726]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:23:24 compute-0 python3.9[59879]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:23:25 compute-0 python3.9[60034]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:23:26 compute-0 sudo[60188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbeputbmymsmejmihwpljhzixvudqprq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037806.2244587-80-241259780638021/AnsiballZ_setup.py'
Jan 21 23:23:26 compute-0 sudo[60188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:26 compute-0 python3.9[60190]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:23:27 compute-0 sudo[60188]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:27 compute-0 sudo[60272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ascxbyxaajyqqybpglojyliemaliawqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037806.2244587-80-241259780638021/AnsiballZ_dnf.py'
Jan 21 23:23:27 compute-0 sudo[60272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:27 compute-0 python3.9[60274]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:23:29 compute-0 sudo[60272]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:29 compute-0 sudo[60426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mftzcdifywzaygqstwjmgfrcrvwrizjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037809.3451657-116-101471309374540/AnsiballZ_setup.py'
Jan 21 23:23:29 compute-0 sudo[60426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:30 compute-0 python3.9[60428]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:23:30 compute-0 sudo[60426]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:31 compute-0 sudo[60617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzgoaekzgjnhteoyrvhmpjpfyjzmyfch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037810.8362846-149-113961982955653/AnsiballZ_file.py'
Jan 21 23:23:31 compute-0 sudo[60617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:31 compute-0 python3.9[60619]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:31 compute-0 sudo[60617]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:32 compute-0 sudo[60769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odpuyzonwlyrixntxmgiekdmirgncqkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037811.7627048-173-94137197250647/AnsiballZ_command.py'
Jan 21 23:23:32 compute-0 sudo[60769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:32 compute-0 python3.9[60771]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:23:32 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:23:32 compute-0 sudo[60769]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:33 compute-0 sudo[60933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kczxpyfbylqhzxcfdnjyvwixmtrsxmvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037812.7250412-197-243550925024108/AnsiballZ_stat.py'
Jan 21 23:23:33 compute-0 sudo[60933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:33 compute-0 python3.9[60935]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:33 compute-0 sudo[60933]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:33 compute-0 sudo[61011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuglekttajffmahtmxddidvqsjxnsgpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037812.7250412-197-243550925024108/AnsiballZ_file.py'
Jan 21 23:23:33 compute-0 sudo[61011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:33 compute-0 python3.9[61013]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:33 compute-0 sudo[61011]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:34 compute-0 sudo[61163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxleczwgqbvlmpshyraksavujxdiecrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037814.1035364-233-575082190853/AnsiballZ_stat.py'
Jan 21 23:23:34 compute-0 sudo[61163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:34 compute-0 python3.9[61165]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:34 compute-0 sudo[61163]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:34 compute-0 sudo[61241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imnmkwwglyiltskzikimqkrcaqhfgqxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037814.1035364-233-575082190853/AnsiballZ_file.py'
Jan 21 23:23:34 compute-0 sudo[61241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:35 compute-0 python3.9[61243]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:23:35 compute-0 sudo[61241]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:36 compute-0 sudo[61393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thyboxfmjbopejbkffsxzprevbakjxfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037815.5761087-272-148505893434443/AnsiballZ_ini_file.py'
Jan 21 23:23:36 compute-0 sudo[61393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:36 compute-0 python3.9[61395]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:23:36 compute-0 sudo[61393]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:36 compute-0 sudo[61545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiyicbroqijvnqjatglfxclyzgviicdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037816.486375-272-81880914912385/AnsiballZ_ini_file.py'
Jan 21 23:23:36 compute-0 sudo[61545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:37 compute-0 python3.9[61547]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:23:37 compute-0 sudo[61545]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:37 compute-0 sudo[61697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crbmowpksieavjvjjorcvpvnbbkzmzys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037817.170369-272-258933934072954/AnsiballZ_ini_file.py'
Jan 21 23:23:37 compute-0 sudo[61697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:37 compute-0 python3.9[61699]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:23:37 compute-0 sudo[61697]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:38 compute-0 sudo[61849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afxnkizqagjmfuygfbvwmijeuuimxhsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037817.7556627-272-19080065982358/AnsiballZ_ini_file.py'
Jan 21 23:23:38 compute-0 sudo[61849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:38 compute-0 python3.9[61851]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:23:38 compute-0 sudo[61849]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:39 compute-0 sudo[62001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcvegccaybhgflnmbugkmtugqygtponn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037818.785977-365-36530213027593/AnsiballZ_dnf.py'
Jan 21 23:23:39 compute-0 sudo[62001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:39 compute-0 python3.9[62003]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:23:40 compute-0 sudo[62001]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:41 compute-0 sudo[62154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aguhygtnjxrpsbsdqrxdjdjsamnbbojx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037821.4858813-398-102271917205609/AnsiballZ_setup.py'
Jan 21 23:23:41 compute-0 sudo[62154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:42 compute-0 python3.9[62156]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:23:42 compute-0 sudo[62154]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:42 compute-0 sudo[62308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-besjogiuueilekmwdmwegofrsosktcvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037822.3907442-422-133580141348738/AnsiballZ_stat.py'
Jan 21 23:23:42 compute-0 sudo[62308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:42 compute-0 python3.9[62310]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:23:42 compute-0 sudo[62308]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:43 compute-0 sudo[62460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzgjzhcxlahyhxzwravmwhjgeusjdoqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037823.1845925-449-20357796213884/AnsiballZ_stat.py'
Jan 21 23:23:43 compute-0 sudo[62460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:43 compute-0 python3.9[62462]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:23:43 compute-0 sudo[62460]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:44 compute-0 sudo[62612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdztkfnwulygotdkejoplaawkespsmyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037824.0594058-479-45742343088977/AnsiballZ_command.py'
Jan 21 23:23:44 compute-0 sudo[62612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:44 compute-0 python3.9[62614]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:23:44 compute-0 sudo[62612]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:45 compute-0 sudo[62765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsgxfrlgrtlchqvjptstqihezwfhxoos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037824.9793932-509-14944427621400/AnsiballZ_service_facts.py'
Jan 21 23:23:45 compute-0 sudo[62765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:45 compute-0 python3.9[62767]: ansible-service_facts Invoked
Jan 21 23:23:45 compute-0 network[62784]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:23:45 compute-0 network[62785]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:23:45 compute-0 network[62786]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:23:49 compute-0 sudo[62765]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:49 compute-0 sshd-session[62882]: Received disconnect from 91.224.92.54 port 54800:11:  [preauth]
Jan 21 23:23:49 compute-0 sshd-session[62882]: Disconnected from authenticating user root 91.224.92.54 port 54800 [preauth]
Jan 21 23:23:50 compute-0 sudo[63071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdvpbnbdswqvxeaboitbgygrgpwfsqre ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769037830.4999638-554-155072546436659/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769037830.4999638-554-155072546436659/args'
Jan 21 23:23:50 compute-0 sudo[63071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:50 compute-0 sudo[63071]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:51 compute-0 sudo[63238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvbyhssqcqeczbpvaqyddfdguuwnbwak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037831.3641646-587-261906172903774/AnsiballZ_dnf.py'
Jan 21 23:23:51 compute-0 sudo[63238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:52 compute-0 python3.9[63240]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:23:53 compute-0 sudo[63238]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:55 compute-0 sudo[63391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ettbzcepqjpbyublrnonfnikibfkgdju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037834.2344675-626-134855175789845/AnsiballZ_package_facts.py'
Jan 21 23:23:55 compute-0 sudo[63391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:55 compute-0 python3.9[63393]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 21 23:23:55 compute-0 sudo[63391]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:57 compute-0 sudo[63543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmsocurvhzurxddnxvmvgfkffexmxkae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037836.613994-656-137037545922037/AnsiballZ_stat.py'
Jan 21 23:23:57 compute-0 sudo[63543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:57 compute-0 python3.9[63545]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:57 compute-0 sudo[63543]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:58 compute-0 sudo[63668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqhfqdsifsklvouujzuwgwgnynqavhsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037836.613994-656-137037545922037/AnsiballZ_copy.py'
Jan 21 23:23:58 compute-0 sudo[63668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:58 compute-0 python3.9[63670]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037836.613994-656-137037545922037/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:58 compute-0 sudo[63668]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:58 compute-0 sudo[63822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvkehxyxakbdijerxikbtseqawkrxoil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037838.5290642-701-31332304672425/AnsiballZ_stat.py'
Jan 21 23:23:58 compute-0 sudo[63822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:58 compute-0 python3.9[63824]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:23:59 compute-0 sudo[63822]: pam_unix(sudo:session): session closed for user root
Jan 21 23:23:59 compute-0 sudo[63947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjvceolauipsojnmsvoapvzftsjxmlce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037838.5290642-701-31332304672425/AnsiballZ_copy.py'
Jan 21 23:23:59 compute-0 sudo[63947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:23:59 compute-0 python3.9[63949]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037838.5290642-701-31332304672425/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:23:59 compute-0 sudo[63947]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:01 compute-0 sudo[64101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgnbyjjbwtgkccqmbefmsligfhyryixq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037841.0057545-764-30029777626264/AnsiballZ_lineinfile.py'
Jan 21 23:24:01 compute-0 sudo[64101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:01 compute-0 python3.9[64103]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:01 compute-0 sudo[64101]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:03 compute-0 sudo[64255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwwkurjriqqptrcsgxipfmvwnamhyxrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037842.8918686-809-100056754619667/AnsiballZ_setup.py'
Jan 21 23:24:03 compute-0 sudo[64255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:03 compute-0 python3.9[64257]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:24:03 compute-0 sudo[64255]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:04 compute-0 sudo[64341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axdirxcyscgzxxxqknrcdpczwtoetgnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037842.8918686-809-100056754619667/AnsiballZ_systemd.py'
Jan 21 23:24:04 compute-0 sudo[64341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:04 compute-0 python3.9[64343]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:04 compute-0 sudo[64341]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:05 compute-0 sshd-session[64266]: Invalid user backup from 188.166.69.60 port 44396
Jan 21 23:24:05 compute-0 sshd-session[64266]: Connection closed by invalid user backup 188.166.69.60 port 44396 [preauth]
Jan 21 23:24:08 compute-0 sudo[64495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcihqmztnmfnongaimwlmizjrqzibsrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037847.7189658-857-240439523444103/AnsiballZ_setup.py'
Jan 21 23:24:08 compute-0 sudo[64495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:08 compute-0 python3.9[64497]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:24:08 compute-0 sudo[64495]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:08 compute-0 sudo[64579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpverewgrvhmrhevdzhytdwyhmlqejyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037847.7189658-857-240439523444103/AnsiballZ_systemd.py'
Jan 21 23:24:08 compute-0 sudo[64579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:09 compute-0 python3.9[64581]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:24:09 compute-0 chronyd[786]: chronyd exiting
Jan 21 23:24:09 compute-0 systemd[1]: Stopping NTP client/server...
Jan 21 23:24:09 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Jan 21 23:24:09 compute-0 systemd[1]: Stopped NTP client/server.
Jan 21 23:24:09 compute-0 systemd[1]: Starting NTP client/server...
Jan 21 23:24:09 compute-0 chronyd[64591]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 21 23:24:09 compute-0 chronyd[64591]: Frequency -23.216 +/- 0.351 ppm read from /var/lib/chrony/drift
Jan 21 23:24:09 compute-0 chronyd[64591]: Loaded seccomp filter (level 2)
Jan 21 23:24:09 compute-0 systemd[1]: Started NTP client/server.
Jan 21 23:24:09 compute-0 sudo[64579]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:10 compute-0 sshd-session[59729]: Connection closed by 192.168.122.30 port 38536
Jan 21 23:24:10 compute-0 sshd-session[59726]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:24:10 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Jan 21 23:24:10 compute-0 systemd[1]: session-13.scope: Consumed 29.049s CPU time.
Jan 21 23:24:10 compute-0 systemd-logind[784]: Session 13 logged out. Waiting for processes to exit.
Jan 21 23:24:10 compute-0 systemd-logind[784]: Removed session 13.
Jan 21 23:24:15 compute-0 sshd-session[64617]: Accepted publickey for zuul from 192.168.122.30 port 45048 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:24:15 compute-0 systemd-logind[784]: New session 14 of user zuul.
Jan 21 23:24:15 compute-0 systemd[1]: Started Session 14 of User zuul.
Jan 21 23:24:15 compute-0 sshd-session[64617]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:24:16 compute-0 python3.9[64770]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:24:17 compute-0 sudo[64924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ornigbimelgjwijjbbftoetggdeuerkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037857.0154128-59-52219602406351/AnsiballZ_file.py'
Jan 21 23:24:17 compute-0 sudo[64924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:17 compute-0 python3.9[64926]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:17 compute-0 sudo[64924]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:18 compute-0 sudo[65099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enalduzueevinzmrokuimlrlntbkpfrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037858.2071457-83-38311792312236/AnsiballZ_stat.py'
Jan 21 23:24:18 compute-0 sudo[65099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:18 compute-0 python3.9[65101]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:18 compute-0 sudo[65099]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:19 compute-0 sudo[65177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbgnhjtgzofmyonmgumkfmvbiwlvdswm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037858.2071457-83-38311792312236/AnsiballZ_file.py'
Jan 21 23:24:19 compute-0 sudo[65177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:19 compute-0 python3.9[65179]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.tjulu0se recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:19 compute-0 sudo[65177]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:20 compute-0 sudo[65329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxvgjgxyqibyjvgoelrpkltnjnswqhci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037860.0155563-143-144380108169868/AnsiballZ_stat.py'
Jan 21 23:24:20 compute-0 sudo[65329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:20 compute-0 python3.9[65331]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:20 compute-0 sudo[65329]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:21 compute-0 sudo[65452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryzsndoxiupxqimvsujetreokkensekk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037860.0155563-143-144380108169868/AnsiballZ_copy.py'
Jan 21 23:24:21 compute-0 sudo[65452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:21 compute-0 python3.9[65454]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037860.0155563-143-144380108169868/.source _original_basename=.gbe0zvgs follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:21 compute-0 sudo[65452]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:21 compute-0 sudo[65604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jodpziikhnvousraisaefimoutfoqkoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037861.497025-191-179159735320557/AnsiballZ_file.py'
Jan 21 23:24:21 compute-0 sudo[65604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:21 compute-0 python3.9[65606]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:24:22 compute-0 sudo[65604]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:22 compute-0 sudo[65756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkdovflmmyybunltprlbtioryirxhbux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037862.2658653-215-120317240851883/AnsiballZ_stat.py'
Jan 21 23:24:22 compute-0 sudo[65756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:22 compute-0 python3.9[65758]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:22 compute-0 sudo[65756]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:23 compute-0 sudo[65879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rridkwdayredytakqsyoqpirscdpxvbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037862.2658653-215-120317240851883/AnsiballZ_copy.py'
Jan 21 23:24:23 compute-0 sudo[65879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:23 compute-0 python3.9[65881]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037862.2658653-215-120317240851883/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:24:23 compute-0 sudo[65879]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:23 compute-0 sudo[66031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sthaskivndqxrxepwrrmyjwhsmevrymf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037863.4447331-215-274240330219425/AnsiballZ_stat.py'
Jan 21 23:24:23 compute-0 sudo[66031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:23 compute-0 python3.9[66033]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:23 compute-0 sudo[66031]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:24 compute-0 sudo[66154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jndqldwywdenuaqtclhnlxjkqlfimyuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037863.4447331-215-274240330219425/AnsiballZ_copy.py'
Jan 21 23:24:24 compute-0 sudo[66154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:24 compute-0 python3.9[66156]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769037863.4447331-215-274240330219425/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:24:24 compute-0 sudo[66154]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:25 compute-0 sudo[66306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjgrhjdpfkrzygpnhkxrojuqvfodkxuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037865.0485113-302-4982492235696/AnsiballZ_file.py'
Jan 21 23:24:25 compute-0 sudo[66306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:25 compute-0 python3.9[66308]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:25 compute-0 sudo[66306]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:26 compute-0 sudo[66458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhrussjsuayfxddnjjzcpphyztwvbhou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037865.8414032-326-157875081360696/AnsiballZ_stat.py'
Jan 21 23:24:26 compute-0 sudo[66458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:26 compute-0 python3.9[66460]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:26 compute-0 sudo[66458]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:26 compute-0 sudo[66581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjxkqcnlnwuwcepvqjvuinpvmqypyhkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037865.8414032-326-157875081360696/AnsiballZ_copy.py'
Jan 21 23:24:26 compute-0 sudo[66581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:26 compute-0 python3.9[66583]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037865.8414032-326-157875081360696/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:26 compute-0 sudo[66581]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:27 compute-0 sudo[66733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpxlqphhkdhasffqbraicmmmbodycpse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037867.2687578-371-143618014602679/AnsiballZ_stat.py'
Jan 21 23:24:27 compute-0 sudo[66733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:27 compute-0 python3.9[66735]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:27 compute-0 sudo[66733]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:28 compute-0 sudo[66856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsvxdjyxhauixojixokajnhtgcqbwgej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037867.2687578-371-143618014602679/AnsiballZ_copy.py'
Jan 21 23:24:28 compute-0 sudo[66856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:28 compute-0 python3.9[66858]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037867.2687578-371-143618014602679/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:28 compute-0 sudo[66856]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:29 compute-0 sudo[67008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwguqayrvrljwuemozikjsbjkpvircfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037868.6514425-416-271212521093664/AnsiballZ_systemd.py'
Jan 21 23:24:29 compute-0 sudo[67008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:29 compute-0 python3.9[67010]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:29 compute-0 systemd[1]: Reloading.
Jan 21 23:24:29 compute-0 systemd-sysv-generator[67040]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:29 compute-0 systemd-rc-local-generator[67037]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:29 compute-0 systemd[1]: Reloading.
Jan 21 23:24:29 compute-0 systemd-rc-local-generator[67073]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:29 compute-0 systemd-sysv-generator[67077]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:30 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Jan 21 23:24:30 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Jan 21 23:24:30 compute-0 sudo[67008]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:30 compute-0 sudo[67233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unlqmepgdwmnhkspjjjwrojriqojvdyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037870.6794107-440-66848933906612/AnsiballZ_stat.py'
Jan 21 23:24:30 compute-0 sudo[67233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:31 compute-0 python3.9[67235]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:31 compute-0 sudo[67233]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:31 compute-0 sudo[67356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybhtucbtddflvwslfyzfwlulocqfbdrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037870.6794107-440-66848933906612/AnsiballZ_copy.py'
Jan 21 23:24:31 compute-0 sudo[67356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:31 compute-0 python3.9[67358]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037870.6794107-440-66848933906612/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:31 compute-0 sudo[67356]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:32 compute-0 sudo[67508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzelsjxqyqmtjozefjuxsbiicqdjkaww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037872.1618817-485-191243094104667/AnsiballZ_stat.py'
Jan 21 23:24:32 compute-0 sudo[67508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:32 compute-0 python3.9[67510]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:32 compute-0 sudo[67508]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:32 compute-0 sudo[67631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsupiyrymvynlwnmmvhstntfkllnmcea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037872.1618817-485-191243094104667/AnsiballZ_copy.py'
Jan 21 23:24:32 compute-0 sudo[67631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:33 compute-0 python3.9[67633]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037872.1618817-485-191243094104667/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:33 compute-0 sudo[67631]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:33 compute-0 sudo[67783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njsluhkoqndrhmobvwyuwkcelqjdilqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037873.536118-530-245512433012154/AnsiballZ_systemd.py'
Jan 21 23:24:33 compute-0 sudo[67783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:34 compute-0 python3.9[67785]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:34 compute-0 systemd[1]: Reloading.
Jan 21 23:24:34 compute-0 systemd-rc-local-generator[67816]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:34 compute-0 systemd-sysv-generator[67820]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:34 compute-0 systemd[1]: Reloading.
Jan 21 23:24:34 compute-0 systemd-rc-local-generator[67847]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:34 compute-0 systemd-sysv-generator[67853]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:34 compute-0 systemd[1]: Starting Create netns directory...
Jan 21 23:24:34 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 23:24:34 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 23:24:34 compute-0 systemd[1]: Finished Create netns directory.
Jan 21 23:24:34 compute-0 sudo[67783]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:35 compute-0 python3.9[68011]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:24:35 compute-0 network[68028]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:24:35 compute-0 network[68029]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:24:35 compute-0 network[68030]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:24:41 compute-0 sudo[68290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmkomvddyoakhuddpqsyvwgtbtxqxqok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037880.8335166-578-127335756872209/AnsiballZ_systemd.py'
Jan 21 23:24:41 compute-0 sudo[68290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:41 compute-0 python3.9[68292]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:41 compute-0 systemd[1]: Reloading.
Jan 21 23:24:41 compute-0 systemd-rc-local-generator[68323]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:41 compute-0 systemd-sysv-generator[68327]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:41 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 21 23:24:41 compute-0 iptables.init[68333]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 21 23:24:42 compute-0 iptables.init[68333]: iptables: Flushing firewall rules: [  OK  ]
Jan 21 23:24:42 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Jan 21 23:24:42 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 21 23:24:42 compute-0 sudo[68290]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:42 compute-0 sudo[68527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkbvydviyjmgbfndwktescaeundhvqti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037882.290048-578-32788481108100/AnsiballZ_systemd.py'
Jan 21 23:24:42 compute-0 sudo[68527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:42 compute-0 python3.9[68529]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:42 compute-0 sudo[68527]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:43 compute-0 sudo[68681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfdaeyzrciwifykivqqryufihrthzfad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037883.361666-626-234108804373152/AnsiballZ_systemd.py'
Jan 21 23:24:43 compute-0 sudo[68681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:43 compute-0 python3.9[68683]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:24:43 compute-0 systemd[1]: Reloading.
Jan 21 23:24:44 compute-0 systemd-rc-local-generator[68710]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:24:44 compute-0 systemd-sysv-generator[68714]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:24:44 compute-0 systemd[1]: Starting Netfilter Tables...
Jan 21 23:24:44 compute-0 systemd[1]: Finished Netfilter Tables.
Jan 21 23:24:44 compute-0 sudo[68681]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:45 compute-0 sudo[68872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewoxkwfkwhlumcnvqsiscpbasyhaflaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037884.9084713-650-154996082475972/AnsiballZ_command.py'
Jan 21 23:24:45 compute-0 sudo[68872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:45 compute-0 python3.9[68874]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:24:45 compute-0 sudo[68872]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:46 compute-0 sudo[69025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teljjgljevyaectmbdysmvqyjpuceojz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037886.300998-692-247311078404726/AnsiballZ_stat.py'
Jan 21 23:24:46 compute-0 sudo[69025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:46 compute-0 python3.9[69027]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:46 compute-0 sudo[69025]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:47 compute-0 sudo[69150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrohboycpbowrlmioydkxqhifnrgwzbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037886.300998-692-247311078404726/AnsiballZ_copy.py'
Jan 21 23:24:47 compute-0 sudo[69150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:47 compute-0 python3.9[69152]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037886.300998-692-247311078404726/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:47 compute-0 sudo[69150]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:48 compute-0 sudo[69303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etadihhkorteihpmawpngyuitoctedhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037887.8077626-737-182169494882388/AnsiballZ_systemd.py'
Jan 21 23:24:48 compute-0 sudo[69303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:48 compute-0 python3.9[69305]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:24:48 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Jan 21 23:24:48 compute-0 sshd[1006]: Received SIGHUP; restarting.
Jan 21 23:24:48 compute-0 sshd[1006]: Server listening on 0.0.0.0 port 22.
Jan 21 23:24:48 compute-0 sshd[1006]: Server listening on :: port 22.
Jan 21 23:24:48 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Jan 21 23:24:48 compute-0 sudo[69303]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:49 compute-0 sudo[69461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgusvfrprnjtfcrfhglktkxwzedgbijh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037888.8573883-761-139783292870476/AnsiballZ_file.py'
Jan 21 23:24:49 compute-0 sudo[69461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:49 compute-0 python3.9[69463]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:49 compute-0 sudo[69461]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:49 compute-0 sshd-session[69442]: Invalid user backup from 188.166.69.60 port 53082
Jan 21 23:24:49 compute-0 sshd-session[69442]: Connection closed by invalid user backup 188.166.69.60 port 53082 [preauth]
Jan 21 23:24:49 compute-0 sudo[69613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apuzwgexwktyjgfxmvqmtyvbhithmovr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037889.571136-785-40101163882973/AnsiballZ_stat.py'
Jan 21 23:24:49 compute-0 sudo[69613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:50 compute-0 python3.9[69615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:50 compute-0 sudo[69613]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:50 compute-0 sudo[69736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkjwdraarvgjpuwjmvtmluebwcxfgxmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037889.571136-785-40101163882973/AnsiballZ_copy.py'
Jan 21 23:24:50 compute-0 sudo[69736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:50 compute-0 python3.9[69738]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037889.571136-785-40101163882973/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:50 compute-0 sudo[69736]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:52 compute-0 sudo[69888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziqysxjxcvsdfimrtoxcqumakqirghru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037891.6596634-839-48047974550740/AnsiballZ_timezone.py'
Jan 21 23:24:52 compute-0 sudo[69888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:52 compute-0 python3.9[69890]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 21 23:24:52 compute-0 systemd[1]: Starting Time & Date Service...
Jan 21 23:24:52 compute-0 systemd[1]: Started Time & Date Service.
Jan 21 23:24:52 compute-0 sudo[69888]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:54 compute-0 sudo[70044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aupkthjvteqdokaxftmsrgpthyygxcfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037893.8203986-866-218201032102346/AnsiballZ_file.py'
Jan 21 23:24:54 compute-0 sudo[70044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:54 compute-0 python3.9[70046]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:54 compute-0 sudo[70044]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:54 compute-0 sudo[70196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wljauvpovxyzzdqxujrfxchzjvzxwhnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037894.631291-890-112146254578415/AnsiballZ_stat.py'
Jan 21 23:24:54 compute-0 sudo[70196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:55 compute-0 python3.9[70198]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:55 compute-0 sudo[70196]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:55 compute-0 sudo[70319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfoakjkloqrhqsptrgpzpchufujpoeiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037894.631291-890-112146254578415/AnsiballZ_copy.py'
Jan 21 23:24:55 compute-0 sudo[70319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:55 compute-0 python3.9[70321]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037894.631291-890-112146254578415/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:55 compute-0 sudo[70319]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:56 compute-0 sudo[70471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abjqjkkllimdzfcbwygqtjpwmhbxizys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037895.9410064-935-95440892049844/AnsiballZ_stat.py'
Jan 21 23:24:56 compute-0 sudo[70471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:56 compute-0 python3.9[70473]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:56 compute-0 sudo[70471]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:56 compute-0 sudo[70594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asxnfovpcjjjsblqsmwsdgiezovdrntm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037895.9410064-935-95440892049844/AnsiballZ_copy.py'
Jan 21 23:24:56 compute-0 sudo[70594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:57 compute-0 python3.9[70596]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769037895.9410064-935-95440892049844/.source.yaml _original_basename=.buwki_0b follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:57 compute-0 sudo[70594]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:57 compute-0 sudo[70746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uykohocdecpnapuumqufniempzedojor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037897.2345362-980-213500456790592/AnsiballZ_stat.py'
Jan 21 23:24:57 compute-0 sudo[70746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:57 compute-0 python3.9[70748]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:24:57 compute-0 sudo[70746]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:58 compute-0 sudo[70869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csntovcmnbbcaxehtvwxjvtifysztyua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037897.2345362-980-213500456790592/AnsiballZ_copy.py'
Jan 21 23:24:58 compute-0 sudo[70869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:58 compute-0 python3.9[70871]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037897.2345362-980-213500456790592/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:24:58 compute-0 sudo[70869]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:58 compute-0 sudo[71021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvvupdmjqzkfsckekpljricfzsbliwng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037898.675765-1025-16962798002047/AnsiballZ_command.py'
Jan 21 23:24:58 compute-0 sudo[71021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:59 compute-0 python3.9[71023]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:24:59 compute-0 sudo[71021]: pam_unix(sudo:session): session closed for user root
Jan 21 23:24:59 compute-0 sudo[71174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryyjvxajfqniyznsyuufjobcsxamhsnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037899.4074786-1049-190315426464741/AnsiballZ_command.py'
Jan 21 23:24:59 compute-0 sudo[71174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:24:59 compute-0 python3.9[71176]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:25:00 compute-0 sudo[71174]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:00 compute-0 sudo[71327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbddblgxqocvzmfvbqdqmeyesrbembka ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769037900.211002-1073-122711787634892/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 23:25:00 compute-0 sudo[71327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:00 compute-0 python3[71329]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 23:25:00 compute-0 sudo[71327]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:01 compute-0 anacron[30915]: Job `cron.daily' started
Jan 21 23:25:01 compute-0 anacron[30915]: Job `cron.daily' terminated
Jan 21 23:25:01 compute-0 sudo[71480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-matqhxyrrdojqlerfbjcjwdjjdmispia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037901.1152983-1097-137676517669539/AnsiballZ_stat.py'
Jan 21 23:25:01 compute-0 sudo[71480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:01 compute-0 python3.9[71483]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:25:01 compute-0 sudo[71480]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:02 compute-0 sudo[71604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbqkrzklbfyuxgwhrourdoxnsjnybjxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037901.1152983-1097-137676517669539/AnsiballZ_copy.py'
Jan 21 23:25:02 compute-0 sudo[71604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:02 compute-0 python3.9[71606]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037901.1152983-1097-137676517669539/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:02 compute-0 sudo[71604]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:02 compute-0 sudo[71756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdabeyjhvsgsztmmwgqxmszxuyomsodq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037902.5230722-1142-138796629758243/AnsiballZ_stat.py'
Jan 21 23:25:02 compute-0 sudo[71756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:03 compute-0 python3.9[71758]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:25:03 compute-0 sudo[71756]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:03 compute-0 sudo[71879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weldsnhozyqbyfwemejjlenrqblarpui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037902.5230722-1142-138796629758243/AnsiballZ_copy.py'
Jan 21 23:25:03 compute-0 sudo[71879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:03 compute-0 python3.9[71881]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037902.5230722-1142-138796629758243/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:03 compute-0 sudo[71879]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:04 compute-0 sudo[72031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwltyksdcyshvuwkhvemgjhjhxvamule ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037904.0052967-1187-233750393781193/AnsiballZ_stat.py'
Jan 21 23:25:04 compute-0 sudo[72031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:04 compute-0 python3.9[72033]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:25:04 compute-0 sudo[72031]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:04 compute-0 sudo[72154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdiavlvbfqpfmhsdyeejnpwocgzkvxvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037904.0052967-1187-233750393781193/AnsiballZ_copy.py'
Jan 21 23:25:04 compute-0 sudo[72154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:05 compute-0 python3.9[72156]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037904.0052967-1187-233750393781193/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:05 compute-0 sudo[72154]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:05 compute-0 sudo[72306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osszsrgbpiihunchqodljytiurqqilbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037905.4225395-1232-90565636979852/AnsiballZ_stat.py'
Jan 21 23:25:05 compute-0 sudo[72306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:05 compute-0 python3.9[72308]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:25:05 compute-0 sudo[72306]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:06 compute-0 sudo[72429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otpmzqnfrhoduixqfucnxuhovufbsnio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037905.4225395-1232-90565636979852/AnsiballZ_copy.py'
Jan 21 23:25:06 compute-0 sudo[72429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:06 compute-0 python3.9[72431]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037905.4225395-1232-90565636979852/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:06 compute-0 sudo[72429]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:07 compute-0 sudo[72581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdxaynenkwzcahfubkynslvcctsonpcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037906.8412004-1277-210632297663971/AnsiballZ_stat.py'
Jan 21 23:25:07 compute-0 sudo[72581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:07 compute-0 python3.9[72583]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:25:07 compute-0 sudo[72581]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:08 compute-0 sudo[72704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbxqkzdaydjyclzzxnqevrcjarhosyvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037906.8412004-1277-210632297663971/AnsiballZ_copy.py'
Jan 21 23:25:08 compute-0 sudo[72704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:08 compute-0 python3.9[72706]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037906.8412004-1277-210632297663971/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:08 compute-0 sudo[72704]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:08 compute-0 sudo[72856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbmbwbbzzcxczfqzcjuhrhpwpejymdoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037908.658258-1322-40174051149160/AnsiballZ_file.py'
Jan 21 23:25:08 compute-0 sudo[72856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:09 compute-0 python3.9[72858]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:09 compute-0 sudo[72856]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:09 compute-0 sudo[73008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhkaaqkvlzypwvvxhobhwktabfqandxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037909.3400948-1346-8616783513044/AnsiballZ_command.py'
Jan 21 23:25:09 compute-0 sudo[73008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:09 compute-0 python3.9[73010]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:25:09 compute-0 sudo[73008]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:10 compute-0 sudo[73167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdyrjigvlnpyuacivyebbsuqjbkdtwwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037910.2020087-1370-21423208799503/AnsiballZ_blockinfile.py'
Jan 21 23:25:10 compute-0 sudo[73167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:10 compute-0 python3.9[73169]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:10 compute-0 sudo[73167]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:11 compute-0 sudo[73320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drdhyrlljvzsawhhjamxicgdyynpjvyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037911.2562153-1397-174549528977896/AnsiballZ_file.py'
Jan 21 23:25:11 compute-0 sudo[73320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:11 compute-0 python3.9[73322]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:11 compute-0 sudo[73320]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:12 compute-0 sudo[73472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyoxwnprysfnuiejmimwyicmxmletuay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037911.90597-1397-119220364521359/AnsiballZ_file.py'
Jan 21 23:25:12 compute-0 sudo[73472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:12 compute-0 python3.9[73474]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:12 compute-0 sudo[73472]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:13 compute-0 sudo[73624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igcqpbaalzwjxcmwqvlhcpdrsulbtxyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037912.6630619-1442-267035658280940/AnsiballZ_mount.py'
Jan 21 23:25:13 compute-0 sudo[73624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:13 compute-0 python3.9[73626]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 21 23:25:13 compute-0 sudo[73624]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:13 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:25:13 compute-0 sudo[73778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfsnccoppmtylvxrsqhmzrqtvvowuzyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037913.4996316-1442-109039014292706/AnsiballZ_mount.py'
Jan 21 23:25:13 compute-0 sudo[73778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:13 compute-0 python3.9[73780]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 21 23:25:14 compute-0 sudo[73778]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:14 compute-0 sshd-session[64620]: Connection closed by 192.168.122.30 port 45048
Jan 21 23:25:14 compute-0 sshd-session[64617]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:25:14 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Jan 21 23:25:14 compute-0 systemd[1]: session-14.scope: Consumed 36.682s CPU time.
Jan 21 23:25:14 compute-0 systemd-logind[784]: Session 14 logged out. Waiting for processes to exit.
Jan 21 23:25:14 compute-0 systemd-logind[784]: Removed session 14.
Jan 21 23:25:19 compute-0 sshd-session[73806]: Accepted publickey for zuul from 192.168.122.30 port 50136 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:25:19 compute-0 systemd-logind[784]: New session 15 of user zuul.
Jan 21 23:25:19 compute-0 systemd[1]: Started Session 15 of User zuul.
Jan 21 23:25:19 compute-0 sshd-session[73806]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:25:20 compute-0 sudo[73959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arnvjchjoqvcpcinmzwjwnlvmhresjxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037919.8817549-23-122967057621565/AnsiballZ_tempfile.py'
Jan 21 23:25:20 compute-0 sudo[73959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:20 compute-0 python3.9[73961]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 21 23:25:20 compute-0 sudo[73959]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:21 compute-0 sudo[74111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrouamrdzycnbvocpsbvaihraagseiud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037920.7910585-59-228646501733343/AnsiballZ_stat.py'
Jan 21 23:25:21 compute-0 sudo[74111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:21 compute-0 python3.9[74113]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:25:21 compute-0 sudo[74111]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:22 compute-0 sudo[74263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzlihmxgxewuxuxjkpkewvgvccwjmfdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037921.7109785-89-191387227397952/AnsiballZ_setup.py'
Jan 21 23:25:22 compute-0 sudo[74263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:22 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 21 23:25:22 compute-0 python3.9[74265]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:25:22 compute-0 sudo[74263]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:23 compute-0 sudo[74417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxvnyyiokrjwjoppsbnylpaaurjqauck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037922.9431665-114-206924193851695/AnsiballZ_blockinfile.py'
Jan 21 23:25:23 compute-0 sudo[74417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:23 compute-0 python3.9[74419]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC26D51NdJjdilPO47VkyAGWZEKpDvfQ2t45jAnFi+yGdqGpJZeqIqXy1qJWgR+nOjHPpu4xyjUsXsUdkcmQySQ9nELhPXxBtFGM3LlXjhhk0Yibj4G2gfuMuG/m8d0BtpBY66pWUvd424nrAKh1ObdZgR5iHS4dtFVcrUPD7nmkE3YxEDETOTc5d/Tcal9MQArb/rQQAs2Z7N4Lgv1bSzhuu70Ij9qUff8SJhc5ZBQkAGKfNPP8XajfuTOvnEOo9uZQjTKcFZnsiSBUnxId028vihtYF6+NFOByOltsmJc7OIafk5r6JZzbps6FcCaOaT2TRLuLemBS+qfS4N0tWS1iJ00Jo7h7y+UdgDBFB3/zCHD5KiHOYCHbXdqtz8HUwsz65bdDEsKyJdh6qyFv5DN7sbB1UK6Yr/urKbVGR2mYP7sNEIAcSC9HZ2vehi9Hm/TSD7IfvR2i96ckZOsnHD3QeMUyJXjqk3PG7rlUM7NxZYyHaTuZzYrR5DvOsUjS1s=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICZF6/j6naCAJ9xH6aYQVqdvwoz3vezm/JU2Pso9ogKK
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLqlgZ52debu0OKcJwhzrcTUf3XONAZS4TIW+jISXbbaqXAGs35QUNRljBr9O34MR2l+Jib4kJghkCYEmTbTxNo=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCmNgflEDQr9DxhZFToMSHP67cO7SUQpgVB7thv3JwDIojWojCRgQSVty7S1IJD5allDPdSEn7he/4X0ePPAI6phFNIWx+fwLuXpedyRVclMG0GASpOZ1kxLiQoMh+DOdnJArZ4llA4Lxdm7MyKCzA+Tna+2Z0+XrBTZjxzM4NbwGmUrESDcTXXu7f/vCq0QTRmjHLTbEvqFbJJzIetehEBC/yJb+35myPPBJ5IU8op6ixtbvwk2pzrRYr/NOUsf/ODWITXAvMjl6U1iE2Np2giBVqfz3zKkoH7gkMRHUmwxetTejWa1kIIZRiJUsQRetDm7v+bkaHGpwokxAC3n7pMwzdSO59inU/Lpr63ruukI64YeLK7FQiJ9557a+lcykXz0xgDF2aNHS9jyyhLQ0EHQGUgQToa462bLJwlCLFkxHpamrQKS67M+71TcFb//zx7kRmUT7NSxGHNAe155RcO0L1mkDHk1r2xOfk2iLNJtgYdCcVlwSRp4beSywJSECk=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILEP7bJKdLXxjWmdj4eC7ngVkPSbC0h6tc+Oej4hLtk7
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO/PF1lIcuvdp/VOQkUSqyeGOw1ILI4bhZtJ8xgcsTd+//1XE1ll313MwTKeS1n9loXGAVB4+f9lF2fbY4gEkQI=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrBpqo7KcpSwQCe8bhjM7Y7rs9JlI0f5WrmGDjYEfFo3lyWMOb2fxXrIDWqMZa5HGb83LwgwgDL1MhTWmi47d7h8Wcxzg4uoflPcGqILiXv9Z+T68l/C6NT2ur0r4Njrz27cayzBtDPz1wKz1bf72s+Jm7Ukl84pubtCYfPhpZ6HBojmNiq+gesC60N0wbEbIHDEgd+jVptW/UdWmhzO7xEBn3qbNPk6UpnYJSU+Z2wGx6hHckTSl5Wy/7RQ2HXE990+4qkeVl88lR/LqsGthwUQ8tlp8F33yw3IS9D0uurGkuqY4GyRjexrol0VPx9VlrPU0y4K+1pP59O4qo9+z/eylWJViS4R223v0JF2RIrH6aQvHTtV1un22qYnTCTCQrZ6KAKQipc0pawnz7DdXE3D2gwcQkZZmcYm9JboWqFn5/80rsuHUZmMBOHy5owN7IjIly0yAPxjAIZy5dMr1MkQP9o/FSnvyQzt11XeO/49/DI3FH0TkomkN21/QhSYE=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKG99Xw/DkEh2LuhUTQH1tq7VFfroV01ukYKDqY+UjHx
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBvYfIE3Tv6SOsn96jsNhozh4WS77CDAl4JYSfjVLVK/RVCTMxlZOAnhAHwDUgcw2k0t2eycyJ2wTJO6OCAqGM4=
                                             create=True mode=0644 path=/tmp/ansible.8xx2udy9 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:23 compute-0 sudo[74417]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:24 compute-0 sudo[74569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjbscixksrschzsmzxtpkxykiujhipys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037923.7907376-138-195922711043856/AnsiballZ_command.py'
Jan 21 23:25:24 compute-0 sudo[74569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:24 compute-0 python3.9[74571]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.8xx2udy9' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:25:24 compute-0 sudo[74569]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:25 compute-0 sudo[74723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylrwvfyrtamqvuufhyrntstoziamhbll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037924.6731696-162-66485153365492/AnsiballZ_file.py'
Jan 21 23:25:25 compute-0 sudo[74723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:25 compute-0 python3.9[74725]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.8xx2udy9 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:25 compute-0 sudo[74723]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:26 compute-0 sshd-session[73809]: Connection closed by 192.168.122.30 port 50136
Jan 21 23:25:26 compute-0 sshd-session[73806]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:25:26 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Jan 21 23:25:26 compute-0 systemd[1]: session-15.scope: Consumed 3.512s CPU time.
Jan 21 23:25:26 compute-0 systemd-logind[784]: Session 15 logged out. Waiting for processes to exit.
Jan 21 23:25:26 compute-0 systemd-logind[784]: Removed session 15.
Jan 21 23:25:30 compute-0 sshd-session[74750]: Accepted publickey for zuul from 192.168.122.30 port 41632 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:25:30 compute-0 systemd-logind[784]: New session 16 of user zuul.
Jan 21 23:25:30 compute-0 systemd[1]: Started Session 16 of User zuul.
Jan 21 23:25:30 compute-0 sshd-session[74750]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:25:31 compute-0 python3.9[74903]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:25:32 compute-0 sshd-session[74955]: Invalid user www-data from 188.166.69.60 port 40022
Jan 21 23:25:32 compute-0 sshd-session[74955]: Connection closed by invalid user www-data 188.166.69.60 port 40022 [preauth]
Jan 21 23:25:32 compute-0 sudo[75059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxiwlyacecewfiaxpvrkionfcpowumdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037932.319353-56-220352815077911/AnsiballZ_systemd.py'
Jan 21 23:25:33 compute-0 sudo[75059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:33 compute-0 python3.9[75061]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 21 23:25:33 compute-0 sudo[75059]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:33 compute-0 sudo[75213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvxvnurbrsgzbdcgipeobtpyqkwkwobe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037933.5685868-80-272183907128700/AnsiballZ_systemd.py'
Jan 21 23:25:33 compute-0 sudo[75213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:34 compute-0 python3.9[75215]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:25:34 compute-0 sudo[75213]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:35 compute-0 sudo[75366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uelsxggbpqywbvgpqzfitimxxxndwpce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037934.528799-107-4943037600643/AnsiballZ_command.py'
Jan 21 23:25:35 compute-0 sudo[75366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:35 compute-0 python3.9[75368]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:25:35 compute-0 sudo[75366]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:35 compute-0 sudo[75519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsgabmkbrjycbnfswgwikeyxmbznzkpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037935.4457266-131-63169879413425/AnsiballZ_stat.py'
Jan 21 23:25:35 compute-0 sudo[75519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:36 compute-0 python3.9[75521]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:25:36 compute-0 sudo[75519]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:36 compute-0 sudo[75673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plotlcigbikhoedafvcepmltamadhesj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037936.3094425-155-254989851170617/AnsiballZ_command.py'
Jan 21 23:25:36 compute-0 sudo[75673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:36 compute-0 python3.9[75675]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:25:36 compute-0 sudo[75673]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:37 compute-0 sudo[75828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tomkkncwvhnbamiwexlnyqgzzcnzltpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037937.0800776-179-32778579231628/AnsiballZ_file.py'
Jan 21 23:25:37 compute-0 sudo[75828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:37 compute-0 python3.9[75830]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:25:37 compute-0 sudo[75828]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:38 compute-0 sshd-session[74753]: Connection closed by 192.168.122.30 port 41632
Jan 21 23:25:38 compute-0 sshd-session[74750]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:25:38 compute-0 systemd-logind[784]: Session 16 logged out. Waiting for processes to exit.
Jan 21 23:25:38 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Jan 21 23:25:38 compute-0 systemd[1]: session-16.scope: Consumed 4.974s CPU time.
Jan 21 23:25:38 compute-0 systemd-logind[784]: Removed session 16.
Jan 21 23:25:43 compute-0 sshd-session[75855]: Accepted publickey for zuul from 192.168.122.30 port 60736 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:25:44 compute-0 systemd-logind[784]: New session 17 of user zuul.
Jan 21 23:25:44 compute-0 systemd[1]: Started Session 17 of User zuul.
Jan 21 23:25:44 compute-0 sshd-session[75855]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:25:45 compute-0 python3.9[76008]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:25:46 compute-0 sudo[76162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soqfnpmkasthiutsfmcuumonumavpnrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037945.6936789-62-131878360797496/AnsiballZ_setup.py'
Jan 21 23:25:46 compute-0 sudo[76162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:46 compute-0 python3.9[76164]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:25:46 compute-0 sudo[76162]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:46 compute-0 sudo[76246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prysesbsazcggwgybzhgpjvesllxkhcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037945.6936789-62-131878360797496/AnsiballZ_dnf.py'
Jan 21 23:25:46 compute-0 sudo[76246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:25:47 compute-0 python3.9[76248]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 21 23:25:48 compute-0 sudo[76246]: pam_unix(sudo:session): session closed for user root
Jan 21 23:25:49 compute-0 python3.9[76399]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:25:50 compute-0 python3.9[76550]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:25:51 compute-0 python3.9[76700]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:25:52 compute-0 python3.9[76850]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:25:52 compute-0 sshd-session[75858]: Connection closed by 192.168.122.30 port 60736
Jan 21 23:25:52 compute-0 sshd-session[75855]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:25:52 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Jan 21 23:25:52 compute-0 systemd[1]: session-17.scope: Consumed 6.372s CPU time.
Jan 21 23:25:52 compute-0 systemd-logind[784]: Session 17 logged out. Waiting for processes to exit.
Jan 21 23:25:52 compute-0 systemd-logind[784]: Removed session 17.
Jan 21 23:25:58 compute-0 sshd-session[76875]: Accepted publickey for zuul from 192.168.122.30 port 43528 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:25:58 compute-0 systemd-logind[784]: New session 18 of user zuul.
Jan 21 23:25:58 compute-0 systemd[1]: Started Session 18 of User zuul.
Jan 21 23:25:58 compute-0 sshd-session[76875]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:25:59 compute-0 python3.9[77028]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:26:00 compute-0 sudo[77182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkokbjovpezhylpgzzvlkljklvvmwrof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037960.417909-111-30223405027661/AnsiballZ_file.py'
Jan 21 23:26:00 compute-0 sudo[77182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:01 compute-0 python3.9[77184]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:01 compute-0 sudo[77182]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:01 compute-0 sudo[77334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcwvjexcupkvhtfrumewntnpgibypeoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037961.32426-111-14702392146910/AnsiballZ_file.py'
Jan 21 23:26:01 compute-0 sudo[77334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:01 compute-0 python3.9[77336]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:01 compute-0 sudo[77334]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:02 compute-0 sudo[77486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adidhhgysufomgynbqpreifrrsdbymon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037961.984087-157-122185646624564/AnsiballZ_stat.py'
Jan 21 23:26:02 compute-0 sudo[77486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:02 compute-0 python3.9[77488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:02 compute-0 sudo[77486]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:03 compute-0 sudo[77609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqbtoqrevdljovaixgiesmwauudpvpmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037961.984087-157-122185646624564/AnsiballZ_copy.py'
Jan 21 23:26:03 compute-0 sudo[77609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:03 compute-0 python3.9[77611]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037961.984087-157-122185646624564/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=b104a47defd3bc23f8423a5afc74edade565bb53 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:03 compute-0 sudo[77609]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:03 compute-0 sudo[77761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryvytjsipnvvpjdkgrhenhoidtjfqojq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037963.442565-157-20711024755017/AnsiballZ_stat.py'
Jan 21 23:26:03 compute-0 sudo[77761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:03 compute-0 python3.9[77763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:03 compute-0 sudo[77761]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:04 compute-0 sudo[77884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfgnbolcpyqqjiccduwpkyjfkyniacvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037963.442565-157-20711024755017/AnsiballZ_copy.py'
Jan 21 23:26:04 compute-0 sudo[77884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:04 compute-0 python3.9[77886]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037963.442565-157-20711024755017/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=9c03bcfa62361e5ef322801c360476a6187916b6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:04 compute-0 sudo[77884]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:04 compute-0 sudo[78036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jidrixxawoljyydbqxxygeitkqrkbonr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037964.6712003-157-189055036045217/AnsiballZ_stat.py'
Jan 21 23:26:04 compute-0 sudo[78036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:05 compute-0 python3.9[78038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:05 compute-0 sudo[78036]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:05 compute-0 sudo[78159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvyvejqfwxxpzmcqrxxcaueqdchutisf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037964.6712003-157-189055036045217/AnsiballZ_copy.py'
Jan 21 23:26:05 compute-0 sudo[78159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:05 compute-0 python3.9[78161]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037964.6712003-157-189055036045217/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=b64004cf2dcdbf5dc5ab7555c0b7c46648d345af backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:05 compute-0 sudo[78159]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:06 compute-0 sudo[78311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fczdavujdmpvqavaydtmbjrrtvazwlew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037965.9244626-278-164098040378319/AnsiballZ_file.py'
Jan 21 23:26:06 compute-0 sudo[78311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:06 compute-0 python3.9[78313]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:06 compute-0 sudo[78311]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:06 compute-0 sudo[78463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxkjyvzeazmpwrrzqjkexyboevibypxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037966.5925946-278-236515917082513/AnsiballZ_file.py'
Jan 21 23:26:06 compute-0 sudo[78463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:07 compute-0 python3.9[78465]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:07 compute-0 sudo[78463]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:07 compute-0 sudo[78615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymzstwnthdjfovdbvlybzubwouhmqetz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037967.2683244-323-156554335952593/AnsiballZ_stat.py'
Jan 21 23:26:07 compute-0 sudo[78615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:07 compute-0 python3.9[78617]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:07 compute-0 sudo[78615]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:08 compute-0 sudo[78738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwvrewnrtknlngdxzierygjdfbkmpbut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037967.2683244-323-156554335952593/AnsiballZ_copy.py'
Jan 21 23:26:08 compute-0 sudo[78738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:08 compute-0 python3.9[78740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037967.2683244-323-156554335952593/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=cbdb71a7b88221491026fc34b3e4888b8fcf59f9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:08 compute-0 sudo[78738]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:08 compute-0 sudo[78890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdybsrnyugramzoftvrlgmarygbheqdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037968.3999019-323-42892406361164/AnsiballZ_stat.py'
Jan 21 23:26:08 compute-0 sudo[78890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:08 compute-0 python3.9[78892]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:08 compute-0 sudo[78890]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:09 compute-0 sudo[79013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iscyexdmujvojmefordfgnkaheoilsxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037968.3999019-323-42892406361164/AnsiballZ_copy.py'
Jan 21 23:26:09 compute-0 sudo[79013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:09 compute-0 python3.9[79015]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037968.3999019-323-42892406361164/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=d8fd7bb3e34b5ea059d1c8aca5209211b8d4078a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:09 compute-0 sudo[79013]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:09 compute-0 sudo[79165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rthjobvqwbednuhabvpjcovulyyixiqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037969.4827096-323-147465512374254/AnsiballZ_stat.py'
Jan 21 23:26:09 compute-0 sudo[79165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:09 compute-0 python3.9[79167]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:09 compute-0 sudo[79165]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:10 compute-0 sudo[79288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emgblanqeecjmimlipjofxofrikzerjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037969.4827096-323-147465512374254/AnsiballZ_copy.py'
Jan 21 23:26:10 compute-0 sudo[79288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:10 compute-0 python3.9[79290]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037969.4827096-323-147465512374254/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=df54502a7c7d4c102b41e0a22f78f3420786a142 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:10 compute-0 sudo[79288]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:11 compute-0 sudo[79440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkirafmqtgunupdtqsbbylbacawpmvky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037970.7942662-446-111981643911098/AnsiballZ_file.py'
Jan 21 23:26:11 compute-0 sudo[79440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:11 compute-0 python3.9[79442]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:11 compute-0 sudo[79440]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:11 compute-0 sudo[79592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqouaaqqosbecxwnylxrifnayerlocsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037971.4117913-446-164486877363264/AnsiballZ_file.py'
Jan 21 23:26:11 compute-0 sudo[79592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:11 compute-0 python3.9[79594]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:11 compute-0 sudo[79592]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:12 compute-0 sudo[79744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpiaueqxauetujazzonaccliwpmgfioo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037972.1114233-493-258172252586391/AnsiballZ_stat.py'
Jan 21 23:26:12 compute-0 sudo[79744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:12 compute-0 python3.9[79746]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:12 compute-0 sudo[79744]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:12 compute-0 sudo[79867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oethwsrvameiffpgfajrfgfczvhnicgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037972.1114233-493-258172252586391/AnsiballZ_copy.py'
Jan 21 23:26:12 compute-0 sudo[79867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:13 compute-0 python3.9[79869]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037972.1114233-493-258172252586391/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=5d525de2cbd5b8e8880aa87026a76dda60f8703b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:13 compute-0 sudo[79867]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:13 compute-0 sudo[80019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovuohpjnkluansyhwrqwdogpkjvswiab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037973.3205664-493-279451130992214/AnsiballZ_stat.py'
Jan 21 23:26:13 compute-0 sudo[80019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:13 compute-0 python3.9[80021]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:13 compute-0 sudo[80019]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:14 compute-0 sudo[80142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngxlklxculdmsmjdopwornfgzzrxmlvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037973.3205664-493-279451130992214/AnsiballZ_copy.py'
Jan 21 23:26:14 compute-0 sudo[80142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:14 compute-0 python3.9[80144]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037973.3205664-493-279451130992214/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=5ac53e5233bb5dc2a1a1ee89225b6d9cf54a324a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:14 compute-0 sudo[80142]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:14 compute-0 sudo[80294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olcjxappclwstqijljocmwqvvjtugsbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037974.6045005-493-264270591284392/AnsiballZ_stat.py'
Jan 21 23:26:14 compute-0 sudo[80294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:15 compute-0 python3.9[80296]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:15 compute-0 sudo[80294]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:15 compute-0 sudo[80417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opjddjhrqcnnguykncweobytvuetftyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037974.6045005-493-264270591284392/AnsiballZ_copy.py'
Jan 21 23:26:15 compute-0 sudo[80417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:15 compute-0 python3.9[80419]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037974.6045005-493-264270591284392/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=08f516987b9616757328ad19f5ac5c6d3a5bcae1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:15 compute-0 sudo[80417]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:16 compute-0 sudo[80569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaszzrvdwicegjgsxqwtuntpdqojmmtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037975.8357835-621-101917531645533/AnsiballZ_file.py'
Jan 21 23:26:16 compute-0 sudo[80569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:16 compute-0 python3.9[80571]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:16 compute-0 sudo[80569]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:16 compute-0 sudo[80721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqliwhsybcznfdukyldipwhfybzmkqhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037976.4414244-621-97595856765842/AnsiballZ_file.py'
Jan 21 23:26:16 compute-0 sudo[80721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:16 compute-0 python3.9[80723]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:16 compute-0 sudo[80721]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:17 compute-0 sudo[80873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iblrlkcynkzzsqjhulmohklkeoiivoyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037977.0962882-666-166265978985925/AnsiballZ_stat.py'
Jan 21 23:26:17 compute-0 sudo[80873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:17 compute-0 python3.9[80875]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:17 compute-0 sudo[80873]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:17 compute-0 sudo[80998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miblylzdvhbuprbcyxufsfhrtuaedysu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037977.0962882-666-166265978985925/AnsiballZ_copy.py'
Jan 21 23:26:17 compute-0 sudo[80998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:18 compute-0 sshd-session[80899]: Invalid user www-data from 188.166.69.60 port 53856
Jan 21 23:26:18 compute-0 python3.9[81000]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037977.0962882-666-166265978985925/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=ee6ea2dc62b4119e42438ed6bc2174b7e273039a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:18 compute-0 sudo[80998]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:18 compute-0 sshd-session[80899]: Connection closed by invalid user www-data 188.166.69.60 port 53856 [preauth]
Jan 21 23:26:18 compute-0 sudo[81150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wijtpchwtkddgliumcpoxtmhriragdza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037978.2523692-666-137797272413845/AnsiballZ_stat.py'
Jan 21 23:26:18 compute-0 sudo[81150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:18 compute-0 python3.9[81152]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:18 compute-0 sudo[81150]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:19 compute-0 sudo[81273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dokobmrifkortgiwqxgdxcnjeumivzqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037978.2523692-666-137797272413845/AnsiballZ_copy.py'
Jan 21 23:26:19 compute-0 sudo[81273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:19 compute-0 python3.9[81275]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037978.2523692-666-137797272413845/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=5ac53e5233bb5dc2a1a1ee89225b6d9cf54a324a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:19 compute-0 sudo[81273]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:19 compute-0 chronyd[64591]: Selected source 198.181.199.86 (pool.ntp.org)
Jan 21 23:26:19 compute-0 sudo[81425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmmkpslwozsqulhmwjugiumbgpwbwudf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037979.3469608-666-193374947292452/AnsiballZ_stat.py'
Jan 21 23:26:19 compute-0 sudo[81425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:19 compute-0 python3.9[81427]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:19 compute-0 sudo[81425]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:20 compute-0 sudo[81548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcnrmjiqjhumaczgqjiyrjkqacgmkysm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037979.3469608-666-193374947292452/AnsiballZ_copy.py'
Jan 21 23:26:20 compute-0 sudo[81548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:20 compute-0 python3.9[81550]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037979.3469608-666-193374947292452/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=bc05fac5fa4f2cdf7c86b85703e3f786633e2f6f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:20 compute-0 sudo[81548]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:21 compute-0 sudo[81700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igjmifngdgosujriavnitvydcxmobeky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037981.0939198-804-129547537766145/AnsiballZ_file.py'
Jan 21 23:26:21 compute-0 sudo[81700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:21 compute-0 python3.9[81702]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:21 compute-0 sudo[81700]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:21 compute-0 sudo[81852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znfpmavgczobddhwjigfqgijehrtkltx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037981.7305796-827-134176122832446/AnsiballZ_stat.py'
Jan 21 23:26:21 compute-0 sudo[81852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:22 compute-0 python3.9[81854]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:22 compute-0 sudo[81852]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:22 compute-0 sudo[81975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfdujrsuugosnfphdnxgnqaiwepmkymw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037981.7305796-827-134176122832446/AnsiballZ_copy.py'
Jan 21 23:26:22 compute-0 sudo[81975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:22 compute-0 python3.9[81977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037981.7305796-827-134176122832446/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:22 compute-0 sudo[81975]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:23 compute-0 sudo[82127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzzlapwazmpkssljejmkkygmqwjmyunj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037982.9145489-891-189905874957884/AnsiballZ_file.py'
Jan 21 23:26:23 compute-0 sudo[82127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:23 compute-0 python3.9[82129]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:23 compute-0 sudo[82127]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:23 compute-0 sudo[82279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhusxxevmnvnhalchxglxbcxrzwjdkrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037983.5541496-915-66117733196634/AnsiballZ_stat.py'
Jan 21 23:26:23 compute-0 sudo[82279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:24 compute-0 python3.9[82281]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:24 compute-0 sudo[82279]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:24 compute-0 sudo[82402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dswvwxuximrzxiktqhdsifvycsafkzqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037983.5541496-915-66117733196634/AnsiballZ_copy.py'
Jan 21 23:26:24 compute-0 sudo[82402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:24 compute-0 python3.9[82404]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037983.5541496-915-66117733196634/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:24 compute-0 sudo[82402]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:25 compute-0 sudo[82554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiqzjrybdtnetbievtwxuglqeyhfhkze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037984.8468518-962-173348504560861/AnsiballZ_file.py'
Jan 21 23:26:25 compute-0 sudo[82554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:25 compute-0 python3.9[82556]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:25 compute-0 sudo[82554]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:25 compute-0 sudo[82706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsnfsdrfjivxlakiudlvhtmlkqttgplg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037985.5037084-987-112697511263541/AnsiballZ_stat.py'
Jan 21 23:26:25 compute-0 sudo[82706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:26 compute-0 python3.9[82708]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:26 compute-0 sudo[82706]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:26 compute-0 sudo[82829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpwgnrgbwvpvajuycqxoizsifeykvbqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037985.5037084-987-112697511263541/AnsiballZ_copy.py'
Jan 21 23:26:26 compute-0 sudo[82829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:26 compute-0 python3.9[82831]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037985.5037084-987-112697511263541/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:26 compute-0 sudo[82829]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:27 compute-0 sudo[82981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcndsqqziwqperqwqooevtdbmlsraspr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037986.833225-1034-217769699662096/AnsiballZ_file.py'
Jan 21 23:26:27 compute-0 sudo[82981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:27 compute-0 python3.9[82983]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:27 compute-0 sudo[82981]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:27 compute-0 sudo[83133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zagzmpzqbdordaxrnbspntenmhiwjjar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037987.5076869-1059-104513531853791/AnsiballZ_stat.py'
Jan 21 23:26:27 compute-0 sudo[83133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:28 compute-0 python3.9[83135]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:28 compute-0 sudo[83133]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:28 compute-0 sudo[83256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnxlfsnbofruwevhcroswmxzwqviuljl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037987.5076869-1059-104513531853791/AnsiballZ_copy.py'
Jan 21 23:26:28 compute-0 sudo[83256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:28 compute-0 python3.9[83258]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037987.5076869-1059-104513531853791/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:28 compute-0 sudo[83256]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:29 compute-0 sudo[83408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adcnnmvibglqwcgdxmevglpgeeybtbea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037988.9608502-1107-52799433653639/AnsiballZ_file.py'
Jan 21 23:26:29 compute-0 sudo[83408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:29 compute-0 python3.9[83410]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:29 compute-0 sudo[83408]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:29 compute-0 sudo[83560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfnasmiyomoypreirckfrrficruxyfjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037989.6397583-1127-102852115143297/AnsiballZ_stat.py'
Jan 21 23:26:29 compute-0 sudo[83560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:30 compute-0 python3.9[83562]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:30 compute-0 sudo[83560]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:30 compute-0 sudo[83683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-motglmmhcglsanzycldhemvqxvyuvjfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037989.6397583-1127-102852115143297/AnsiballZ_copy.py'
Jan 21 23:26:30 compute-0 sudo[83683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:30 compute-0 python3.9[83685]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037989.6397583-1127-102852115143297/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:30 compute-0 sudo[83683]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:31 compute-0 sudo[83835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuwfifmmjandzrsvvpngnzcnqflvrlvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037990.867967-1170-7437152122277/AnsiballZ_file.py'
Jan 21 23:26:31 compute-0 sudo[83835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:31 compute-0 python3.9[83837]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:31 compute-0 sudo[83835]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:31 compute-0 sudo[83987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owavisqveanuovvholcoftrvfehdkbjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037991.486003-1195-234498346315957/AnsiballZ_stat.py'
Jan 21 23:26:31 compute-0 sudo[83987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:31 compute-0 python3.9[83989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:31 compute-0 sudo[83987]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:32 compute-0 sudo[84110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvxybionmmljlrsnzqqwziopspukihil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037991.486003-1195-234498346315957/AnsiballZ_copy.py'
Jan 21 23:26:32 compute-0 sudo[84110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:32 compute-0 python3.9[84112]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037991.486003-1195-234498346315957/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:32 compute-0 sudo[84110]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:33 compute-0 sudo[84262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehhcqvcljskeeqrlujewiiglpesjmlzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037992.7722635-1241-229169363040648/AnsiballZ_file.py'
Jan 21 23:26:33 compute-0 sudo[84262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:33 compute-0 python3.9[84264]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:33 compute-0 sudo[84262]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:33 compute-0 sudo[84414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktzxmblucbmrjimzdlehzwufwmeanynv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037993.455036-1264-196947831971829/AnsiballZ_stat.py'
Jan 21 23:26:33 compute-0 sudo[84414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:33 compute-0 python3.9[84416]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:33 compute-0 sudo[84414]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:34 compute-0 sudo[84537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sipscljaihiidzwxiogxubnsxbumkcvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769037993.455036-1264-196947831971829/AnsiballZ_copy.py'
Jan 21 23:26:34 compute-0 sudo[84537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:34 compute-0 python3.9[84539]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769037993.455036-1264-196947831971829/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=7d2dea4c5f9b91987b3f91d50d337a484f86b475 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:34 compute-0 sudo[84537]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:37 compute-0 sshd-session[76878]: Connection closed by 192.168.122.30 port 43528
Jan 21 23:26:37 compute-0 sshd-session[76875]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:26:37 compute-0 systemd-logind[784]: Session 18 logged out. Waiting for processes to exit.
Jan 21 23:26:37 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Jan 21 23:26:37 compute-0 systemd[1]: session-18.scope: Consumed 28.641s CPU time.
Jan 21 23:26:37 compute-0 systemd-logind[784]: Removed session 18.
Jan 21 23:26:43 compute-0 sshd-session[84564]: Accepted publickey for zuul from 192.168.122.30 port 51606 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:26:43 compute-0 systemd-logind[784]: New session 19 of user zuul.
Jan 21 23:26:43 compute-0 systemd[1]: Started Session 19 of User zuul.
Jan 21 23:26:43 compute-0 sshd-session[84564]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:26:44 compute-0 python3.9[84717]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:26:45 compute-0 sudo[84871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxinkitjpvqraxypkuuypwqqebtxtcga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038004.8118026-62-148205346476400/AnsiballZ_file.py'
Jan 21 23:26:45 compute-0 sudo[84871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:45 compute-0 python3.9[84873]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:45 compute-0 sudo[84871]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:45 compute-0 sudo[85023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwwvygpczoqgxbldltmdngunmgkhcgjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038005.5812483-62-4528076600940/AnsiballZ_file.py'
Jan 21 23:26:45 compute-0 sudo[85023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:46 compute-0 python3.9[85025]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:26:46 compute-0 sudo[85023]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:46 compute-0 python3.9[85175]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:26:47 compute-0 sudo[85325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhxkjvdkuonuxprguwvoxwgfaritoxcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038007.2217493-131-111870916510118/AnsiballZ_seboolean.py'
Jan 21 23:26:47 compute-0 sudo[85325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:47 compute-0 python3.9[85327]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 21 23:26:48 compute-0 sudo[85325]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:50 compute-0 sudo[85481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggxmejhnobqggtgrcyjfyfnqqstgqovz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038009.7213013-161-271488416576189/AnsiballZ_setup.py'
Jan 21 23:26:50 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 21 23:26:50 compute-0 sudo[85481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:50 compute-0 python3.9[85483]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:26:50 compute-0 sudo[85481]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:51 compute-0 sudo[85565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxfbfsgvstbzrsjykbjhfvbljzfvdwdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038009.7213013-161-271488416576189/AnsiballZ_dnf.py'
Jan 21 23:26:51 compute-0 sudo[85565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:51 compute-0 python3.9[85567]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:26:52 compute-0 sudo[85565]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:53 compute-0 sudo[85718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztuuujgztakyngpgcsaumvlghyeixdxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038012.8415396-197-103013450415868/AnsiballZ_systemd.py'
Jan 21 23:26:53 compute-0 sudo[85718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:53 compute-0 python3.9[85720]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:26:53 compute-0 sudo[85718]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:54 compute-0 sudo[85873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trzofkpznocpwzsgjipgxjitgnwgrlfa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038014.0593863-221-217425646719498/AnsiballZ_edpm_nftables_snippet.py'
Jan 21 23:26:54 compute-0 sudo[85873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:54 compute-0 python3[85875]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 21 23:26:54 compute-0 sudo[85873]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:55 compute-0 sudo[86025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gixnoalkolnjkafaymkqmysojokiiejl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038015.0991547-248-69475899036910/AnsiballZ_file.py'
Jan 21 23:26:55 compute-0 sudo[86025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:55 compute-0 python3.9[86027]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:55 compute-0 sudo[86025]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:56 compute-0 sudo[86177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdpsttlhemitqywvdbvecojsvcsyqxkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038015.8473241-272-119042179313404/AnsiballZ_stat.py'
Jan 21 23:26:56 compute-0 sudo[86177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:56 compute-0 python3.9[86179]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:56 compute-0 sudo[86177]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:56 compute-0 sudo[86255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibktsprnggxfwutxmctjjglwoorjajre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038015.8473241-272-119042179313404/AnsiballZ_file.py'
Jan 21 23:26:56 compute-0 sudo[86255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:56 compute-0 python3.9[86257]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:56 compute-0 sudo[86255]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:57 compute-0 sudo[86407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiuyomelcvevaxbepbfqmxkgkocfmqcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038017.292566-308-149630265854682/AnsiballZ_stat.py'
Jan 21 23:26:57 compute-0 sudo[86407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:57 compute-0 python3.9[86409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:57 compute-0 sudo[86407]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:58 compute-0 sudo[86485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbkoyiqiiqwkszkzrutmarzealfqgaxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038017.292566-308-149630265854682/AnsiballZ_file.py'
Jan 21 23:26:58 compute-0 sudo[86485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:58 compute-0 python3.9[86487]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.7aie5v72 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:58 compute-0 sudo[86485]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:58 compute-0 sudo[86637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsnfyquqjdhcsllwsbkjqpykmeealkpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038018.5649602-344-220286513227042/AnsiballZ_stat.py'
Jan 21 23:26:58 compute-0 sudo[86637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:59 compute-0 python3.9[86639]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:26:59 compute-0 sudo[86637]: pam_unix(sudo:session): session closed for user root
Jan 21 23:26:59 compute-0 sudo[86715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imgzazxcdkqyqdeowpexsisbcgbiedto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038018.5649602-344-220286513227042/AnsiballZ_file.py'
Jan 21 23:26:59 compute-0 sudo[86715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:26:59 compute-0 python3.9[86717]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:26:59 compute-0 sudo[86715]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:00 compute-0 sudo[86867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmkfofjqcpsnqbrcryzwirurztjpgbby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038019.875261-383-146534921761316/AnsiballZ_command.py'
Jan 21 23:27:00 compute-0 sudo[86867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:00 compute-0 python3.9[86869]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:00 compute-0 sudo[86867]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:01 compute-0 sudo[87020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxqlncgkqzuynlkzoqkujgfchrjufqjg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038020.7912817-407-139681156771775/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 23:27:01 compute-0 sudo[87020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:01 compute-0 python3[87022]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 23:27:01 compute-0 sudo[87020]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:02 compute-0 sudo[87172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ospqtqaajhxhxegzoieymimwqfmadjjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038021.7103176-431-226371063749811/AnsiballZ_stat.py'
Jan 21 23:27:02 compute-0 sudo[87172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:02 compute-0 python3.9[87174]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:02 compute-0 sudo[87172]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:02 compute-0 sudo[87299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhfvokeucsyiibgfeodrgatkflpyipco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038021.7103176-431-226371063749811/AnsiballZ_copy.py'
Jan 21 23:27:02 compute-0 sudo[87299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:02 compute-0 sshd-session[87175]: Invalid user www-data from 188.166.69.60 port 39026
Jan 21 23:27:02 compute-0 python3.9[87301]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038021.7103176-431-226371063749811/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:02 compute-0 sudo[87299]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:03 compute-0 sshd-session[87175]: Connection closed by invalid user www-data 188.166.69.60 port 39026 [preauth]
Jan 21 23:27:03 compute-0 sudo[87451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpfnehwnjcxikalpsjhutorsfbueazkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038023.2116957-476-224589530722177/AnsiballZ_stat.py'
Jan 21 23:27:03 compute-0 sudo[87451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:03 compute-0 python3.9[87453]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:03 compute-0 sudo[87451]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:03 compute-0 sudo[87576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgmkmkhhdghddwuqhemtmzziqktnhmxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038023.2116957-476-224589530722177/AnsiballZ_copy.py'
Jan 21 23:27:03 compute-0 sudo[87576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:04 compute-0 python3.9[87578]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038023.2116957-476-224589530722177/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:04 compute-0 sudo[87576]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:04 compute-0 sudo[87728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvvbsxpemwvmjjihtxxmvwokwgbwvnhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038024.6117384-521-183344406435777/AnsiballZ_stat.py'
Jan 21 23:27:04 compute-0 sudo[87728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:05 compute-0 python3.9[87730]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:05 compute-0 sudo[87728]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:05 compute-0 sudo[87853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahpjauohrtppjvwrymnsibkuhtybawjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038024.6117384-521-183344406435777/AnsiballZ_copy.py'
Jan 21 23:27:05 compute-0 sudo[87853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:05 compute-0 python3.9[87855]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038024.6117384-521-183344406435777/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:05 compute-0 sudo[87853]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:06 compute-0 sudo[88005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryorootwtydrokpccqetckjmjsgryakd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038026.0501287-566-164729812513588/AnsiballZ_stat.py'
Jan 21 23:27:06 compute-0 sudo[88005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:06 compute-0 python3.9[88007]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:06 compute-0 sudo[88005]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:07 compute-0 sudo[88130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bebovenpetiwizoixuhesrsjrmsvqnxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038026.0501287-566-164729812513588/AnsiballZ_copy.py'
Jan 21 23:27:07 compute-0 sudo[88130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:07 compute-0 python3.9[88132]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038026.0501287-566-164729812513588/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:07 compute-0 sudo[88130]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:08 compute-0 sudo[88282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgxrmmtuevxqpndkokamravbllxayhlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038027.5469785-611-147989919330844/AnsiballZ_stat.py'
Jan 21 23:27:08 compute-0 sudo[88282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:08 compute-0 python3.9[88284]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:08 compute-0 sudo[88282]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:08 compute-0 sudo[88407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohwxnrnakycqxnouiyubifpvgmxffbnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038027.5469785-611-147989919330844/AnsiballZ_copy.py'
Jan 21 23:27:08 compute-0 sudo[88407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:09 compute-0 python3.9[88409]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038027.5469785-611-147989919330844/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:09 compute-0 sudo[88407]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:09 compute-0 sudo[88559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvyxmuyhpvtypjhxzbdtymjiiwjgfngh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038029.3072696-656-71158033125706/AnsiballZ_file.py'
Jan 21 23:27:09 compute-0 sudo[88559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:09 compute-0 python3.9[88561]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:09 compute-0 sudo[88559]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:10 compute-0 sudo[88711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixwkalbjkujauwdcglfsjdumdufnqyri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038030.1291125-680-2178498747732/AnsiballZ_command.py'
Jan 21 23:27:10 compute-0 sudo[88711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:10 compute-0 python3.9[88713]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:10 compute-0 sudo[88711]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:11 compute-0 sudo[88866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evwgkzqzdnuwbpnjblcbfdnrmujnznhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038030.9222958-704-256934817600950/AnsiballZ_blockinfile.py'
Jan 21 23:27:11 compute-0 sudo[88866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:11 compute-0 python3.9[88868]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:11 compute-0 sudo[88866]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:12 compute-0 sudo[89018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jweeugrqtqpliffqxlnsygipftogrtwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038031.9365466-731-71379091672968/AnsiballZ_command.py'
Jan 21 23:27:12 compute-0 sudo[89018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:12 compute-0 python3.9[89020]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:12 compute-0 sudo[89018]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:13 compute-0 sudo[89171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gznpskyfngrxfmwiyvoteuqllqtocqga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038032.9929285-755-216843071673783/AnsiballZ_stat.py'
Jan 21 23:27:13 compute-0 sudo[89171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:13 compute-0 python3.9[89173]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:27:13 compute-0 sudo[89171]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:14 compute-0 sudo[89325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrhivoxxwurtyzmzmvcqraoamwcxtpdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038033.7669322-779-224248639991018/AnsiballZ_command.py'
Jan 21 23:27:14 compute-0 sudo[89325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:14 compute-0 python3.9[89327]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:14 compute-0 sudo[89325]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:14 compute-0 sudo[89480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qanvvaippzwvzaswjbjjqtyqsrabxqzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038034.5487778-803-247117190520525/AnsiballZ_file.py'
Jan 21 23:27:14 compute-0 sudo[89480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:15 compute-0 python3.9[89482]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:15 compute-0 sudo[89480]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:16 compute-0 python3.9[89632]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:27:17 compute-0 sudo[89783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdebkfbupdoevatfzbrbcnbxtwtcehni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038037.2949042-923-45954225216279/AnsiballZ_command.py'
Jan 21 23:27:17 compute-0 sudo[89783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:17 compute-0 python3.9[89785]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:17 compute-0 ovs-vsctl[89786]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
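The ovs-vsctl call above loads the chassis configuration (bridge mappings, Geneve encap IP, southbound DB endpoint, probe intervals) into the Open_vSwitch table's external_ids column as `key=value` pairs. A small log-inspection sketch that parses such `external_ids:KEY=VALUE` arguments back into a dict (a hypothetical helper for reading lines like the one above, not an OVS API):

```python
import shlex

def parse_external_ids(command_line):
    """Collect external_ids:KEY=VALUE arguments from an ovs-vsctl command.

    Splits the command shell-style, so quoted values survive, and returns
    the settings as a plain dict. Purely for inspecting logged commands.
    """
    ids = {}
    for token in shlex.split(command_line):
        if token.startswith("external_ids:") and "=" in token:
            key, value = token[len("external_ids:"):].split("=", 1)
            ids[key] = value
    return ids

cmd = (
    "ovs-vsctl set open . "
    "external_ids:ovn-bridge=br-int "
    "external_ids:ovn-encap-type=geneve "
    "external_ids:ovn-encap-ip=172.19.0.100 "
    "external_ids:ovn-monitor-all=True"
)
```

On a live node the same values can be read back with `ovs-vsctl get Open_vSwitch . external_ids`, which is how ovn-controller discovers its remote and tunnel settings.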
Jan 21 23:27:17 compute-0 sudo[89783]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:18 compute-0 sudo[89936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovvwrzgvschpvjrariomvefzapetakmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038038.1136658-950-13502794037321/AnsiballZ_command.py'
Jan 21 23:27:18 compute-0 sudo[89936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:18 compute-0 python3.9[89938]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:18 compute-0 sudo[89936]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:19 compute-0 sudo[90091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjneejsexpyrxqufkllfigaursotfutx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038038.8954673-974-217024546894518/AnsiballZ_command.py'
Jan 21 23:27:19 compute-0 sudo[90091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:19 compute-0 python3.9[90093]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:19 compute-0 ovs-vsctl[90094]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 21 23:27:19 compute-0 sudo[90091]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:20 compute-0 python3.9[90244]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:27:20 compute-0 sudo[90396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnqxuuhepqitsvaropyogwmohkbdlbyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038040.5456276-1025-111384906524642/AnsiballZ_file.py'
Jan 21 23:27:20 compute-0 sudo[90396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:21 compute-0 python3.9[90398]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:21 compute-0 sudo[90396]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:21 compute-0 sudo[90548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiuezxxepbqssrojoswsvtyojszrgjbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038041.279567-1049-72140814963475/AnsiballZ_stat.py'
Jan 21 23:27:21 compute-0 sudo[90548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:21 compute-0 python3.9[90550]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:21 compute-0 sudo[90548]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:22 compute-0 sudo[90626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rroazzddeherafyhgufnoahhcnpbmxfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038041.279567-1049-72140814963475/AnsiballZ_file.py'
Jan 21 23:27:22 compute-0 sudo[90626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:22 compute-0 python3.9[90628]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:22 compute-0 sudo[90626]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:22 compute-0 sudo[90778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkshvtzwycopihrhasoujwanoxvnxzzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038042.3830764-1049-200979276326442/AnsiballZ_stat.py'
Jan 21 23:27:22 compute-0 sudo[90778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:22 compute-0 python3.9[90780]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:22 compute-0 sudo[90778]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:23 compute-0 sudo[90856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhahofgdxwtlhedncurgzyyrerfuthak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038042.3830764-1049-200979276326442/AnsiballZ_file.py'
Jan 21 23:27:23 compute-0 sudo[90856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:23 compute-0 python3.9[90858]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:23 compute-0 sudo[90856]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:23 compute-0 sudo[91008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcoinfwevdkoridwyaxlsqymdzabouec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038043.639107-1118-200051251560877/AnsiballZ_file.py'
Jan 21 23:27:23 compute-0 sudo[91008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:24 compute-0 python3.9[91010]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:24 compute-0 sudo[91008]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:24 compute-0 sudo[91160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqleicmhvizewdkfzrpcrwturadsyevf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038044.436764-1142-182751967384787/AnsiballZ_stat.py'
Jan 21 23:27:24 compute-0 sudo[91160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:24 compute-0 python3.9[91162]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:25 compute-0 sudo[91160]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:25 compute-0 sudo[91238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhexyqgozwsvbspgmyybzdefhxlfhkdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038044.436764-1142-182751967384787/AnsiballZ_file.py'
Jan 21 23:27:25 compute-0 sudo[91238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:25 compute-0 python3.9[91240]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:25 compute-0 sudo[91238]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:25 compute-0 sudo[91390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxrfrrumhpxbrqpfflfqmctbjyrbazqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038045.6886532-1178-262611934286894/AnsiballZ_stat.py'
Jan 21 23:27:25 compute-0 sudo[91390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:26 compute-0 python3.9[91392]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:26 compute-0 sudo[91390]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:26 compute-0 sudo[91468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njorjdokazinjusufbvaufuojesfyqvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038045.6886532-1178-262611934286894/AnsiballZ_file.py'
Jan 21 23:27:26 compute-0 sudo[91468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:26 compute-0 python3.9[91470]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:26 compute-0 sudo[91468]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:27 compute-0 sudo[91620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkdhklgurnldekbhqerehnqwdxduyrst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038046.8777316-1214-278828248035498/AnsiballZ_systemd.py'
Jan 21 23:27:27 compute-0 sudo[91620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:27 compute-0 python3.9[91622]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:27:27 compute-0 systemd[1]: Reloading.
Jan 21 23:27:27 compute-0 systemd-rc-local-generator[91649]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:27:27 compute-0 systemd-sysv-generator[91652]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:27:27 compute-0 sudo[91620]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:28 compute-0 sudo[91808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsihdrxedddskehyjdcqymytxmsemcul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038048.0779855-1238-68837920373186/AnsiballZ_stat.py'
Jan 21 23:27:28 compute-0 sudo[91808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:28 compute-0 python3.9[91810]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:28 compute-0 sudo[91808]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:28 compute-0 sudo[91886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lumezncfpvrymouafhslsaegfpmlklco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038048.0779855-1238-68837920373186/AnsiballZ_file.py'
Jan 21 23:27:28 compute-0 sudo[91886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:29 compute-0 python3.9[91888]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:29 compute-0 sudo[91886]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:29 compute-0 sudo[92038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rncdkzcmkgglxnfpqoaatzapsjdccahj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038049.381173-1274-128836605992748/AnsiballZ_stat.py'
Jan 21 23:27:29 compute-0 sudo[92038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:29 compute-0 python3.9[92040]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:29 compute-0 sudo[92038]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:30 compute-0 sudo[92116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddonwhjrzzclgnfpeqghpqvwgwkcaswv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038049.381173-1274-128836605992748/AnsiballZ_file.py'
Jan 21 23:27:30 compute-0 sudo[92116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:30 compute-0 python3.9[92118]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:30 compute-0 sudo[92116]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:30 compute-0 sudo[92268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lceolanrxqrtqaslxxjwdndruqtscrmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038050.6521313-1310-8067121749691/AnsiballZ_systemd.py'
Jan 21 23:27:30 compute-0 sudo[92268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:31 compute-0 python3.9[92270]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:27:31 compute-0 systemd[1]: Reloading.
Jan 21 23:27:31 compute-0 systemd-sysv-generator[92303]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:27:31 compute-0 systemd-rc-local-generator[92299]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:27:31 compute-0 systemd[1]: Starting Create netns directory...
Jan 21 23:27:31 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 23:27:31 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 23:27:31 compute-0 systemd[1]: Finished Create netns directory.
Jan 21 23:27:31 compute-0 sudo[92268]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:32 compute-0 sudo[92463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpazsqrzyndpvpwalqgokthnrfsotocd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038052.042778-1340-248324675741554/AnsiballZ_file.py'
Jan 21 23:27:32 compute-0 sudo[92463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:32 compute-0 python3.9[92465]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:32 compute-0 sudo[92463]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:32 compute-0 sudo[92615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkgacidfrasqrhfyumqswrbfqwcquaep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038052.740832-1364-249841731866263/AnsiballZ_stat.py'
Jan 21 23:27:32 compute-0 sudo[92615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:33 compute-0 python3.9[92617]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:33 compute-0 sudo[92615]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:33 compute-0 sudo[92738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqhsxiepfvzoennevaqdctxlbkkjfjte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038052.740832-1364-249841731866263/AnsiballZ_copy.py'
Jan 21 23:27:33 compute-0 sudo[92738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:33 compute-0 python3.9[92740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038052.740832-1364-249841731866263/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:33 compute-0 sudo[92738]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:34 compute-0 sudo[92890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dptxowphjkgcuejyleywcdqkbqjmymzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038054.1980135-1415-166580849550984/AnsiballZ_file.py'
Jan 21 23:27:34 compute-0 sudo[92890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:34 compute-0 python3.9[92892]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:34 compute-0 sudo[92890]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:35 compute-0 sudo[93042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rddmpsdjdyjfeoptphgulnrihzfkcghi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038054.968801-1439-157546696928233/AnsiballZ_file.py'
Jan 21 23:27:35 compute-0 sudo[93042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:35 compute-0 python3.9[93044]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:27:35 compute-0 sudo[93042]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:35 compute-0 sudo[93194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebsokdboycvadlcrrqliqanuqqvelspb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038055.7349312-1463-174731697710895/AnsiballZ_stat.py'
Jan 21 23:27:35 compute-0 sudo[93194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:36 compute-0 python3.9[93196]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:36 compute-0 sudo[93194]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:36 compute-0 sudo[93317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfiswrvbvzvjrjoldqdwdbihmecwnvji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038055.7349312-1463-174731697710895/AnsiballZ_copy.py'
Jan 21 23:27:36 compute-0 sudo[93317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:36 compute-0 python3.9[93319]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038055.7349312-1463-174731697710895/.source.json _original_basename=.gfgoet6g follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:36 compute-0 sudo[93317]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:37 compute-0 python3.9[93469]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:39 compute-0 sudo[93890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmynsnziwuvrfraihyiemqvsslnmqrav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038059.4507303-1583-29139084824107/AnsiballZ_container_config_data.py'
Jan 21 23:27:39 compute-0 sudo[93890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:40 compute-0 python3.9[93892]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 21 23:27:40 compute-0 sudo[93890]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:41 compute-0 sudo[94042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zghdiatopywmqtwfhtgideggatqfqrww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038060.5567877-1616-175184036418191/AnsiballZ_container_config_hash.py'
Jan 21 23:27:41 compute-0 sudo[94042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:41 compute-0 python3.9[94044]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:27:41 compute-0 sudo[94042]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:42 compute-0 sudo[94194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlavlfaietdawqsljwwfbhnqwxowvvct ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038061.6622183-1646-97115682318633/AnsiballZ_edpm_container_manage.py'
Jan 21 23:27:42 compute-0 sudo[94194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:42 compute-0 python3[94196]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:27:42 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:27:42 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:27:42 compute-0 podman[94232]: 2026-01-21 23:27:42.721548104 +0000 UTC m=+0.062787876 container create 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:27:42 compute-0 podman[94232]: 2026-01-21 23:27:42.690511979 +0000 UTC m=+0.031751781 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 23:27:42 compute-0 python3[94196]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host 
--privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 21 23:27:42 compute-0 sudo[94194]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:43 compute-0 sudo[94419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzfctxwibuzvuqssrogagifvlblhcjrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038063.0745513-1670-23362333876634/AnsiballZ_stat.py'
Jan 21 23:27:43 compute-0 sudo[94419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 21 23:27:43 compute-0 python3.9[94421]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:27:43 compute-0 sudo[94419]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:44 compute-0 sudo[94573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueynjbomudpatoazzxrpryamgejulcpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038063.907746-1697-178481523082929/AnsiballZ_file.py'
Jan 21 23:27:44 compute-0 sudo[94573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:44 compute-0 python3.9[94575]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:44 compute-0 sudo[94573]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:44 compute-0 sudo[94649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbwluqfjbnzbjrubqruewwswmnmzxyad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038063.907746-1697-178481523082929/AnsiballZ_stat.py'
Jan 21 23:27:44 compute-0 sudo[94649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:44 compute-0 python3.9[94651]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:27:44 compute-0 sudo[94649]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:45 compute-0 sudo[94800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjhnstrrtzjyefyljicvjsibmlqjggie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038064.7625096-1697-114322633709317/AnsiballZ_copy.py'
Jan 21 23:27:45 compute-0 sudo[94800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:45 compute-0 python3.9[94802]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038064.7625096-1697-114322633709317/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:45 compute-0 sudo[94800]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:45 compute-0 sudo[94876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yciwvuxiklrispyrdcqhaicojtlpwbvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038064.7625096-1697-114322633709317/AnsiballZ_systemd.py'
Jan 21 23:27:45 compute-0 sudo[94876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:45 compute-0 python3.9[94878]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:27:45 compute-0 systemd[1]: Reloading.
Jan 21 23:27:45 compute-0 systemd-sysv-generator[94910]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:27:45 compute-0 systemd-rc-local-generator[94907]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:27:46 compute-0 sudo[94876]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:46 compute-0 sudo[94989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfcuafstcpdehnrchdckscphenkzzzxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038064.7625096-1697-114322633709317/AnsiballZ_systemd.py'
Jan 21 23:27:46 compute-0 sudo[94989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:46 compute-0 python3.9[94991]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:27:46 compute-0 systemd[1]: Reloading.
Jan 21 23:27:46 compute-0 systemd-rc-local-generator[95020]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:27:46 compute-0 systemd-sysv-generator[95023]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:27:46 compute-0 systemd[1]: Starting ovn_controller container...
Jan 21 23:27:47 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 21 23:27:47 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:27:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47d073cd4ffcdd112d5831cc7e02104886a4d994e3cb33c8c01d7578feb6e9f8/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 21 23:27:47 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61.
Jan 21 23:27:47 compute-0 podman[95031]: 2026-01-21 23:27:47.134920893 +0000 UTC m=+0.129334530 container init 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 23:27:47 compute-0 ovn_controller[95047]: + sudo -E kolla_set_configs
Jan 21 23:27:47 compute-0 podman[95031]: 2026-01-21 23:27:47.164284388 +0000 UTC m=+0.158698225 container start 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_managed=true)
Jan 21 23:27:47 compute-0 edpm-start-podman-container[95031]: ovn_controller
Jan 21 23:27:47 compute-0 systemd[1]: Created slice User Slice of UID 0.
Jan 21 23:27:47 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 21 23:27:47 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 21 23:27:47 compute-0 systemd[1]: Starting User Manager for UID 0...
Jan 21 23:27:47 compute-0 systemd[95086]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 21 23:27:47 compute-0 edpm-start-podman-container[95030]: Creating additional drop-in dependency for "ovn_controller" (5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61)
Jan 21 23:27:47 compute-0 podman[95053]: 2026-01-21 23:27:47.253649802 +0000 UTC m=+0.077801925 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 21 23:27:47 compute-0 systemd[1]: 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61-1f7e858f840f641.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 23:27:47 compute-0 systemd[1]: 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61-1f7e858f840f641.service: Failed with result 'exit-code'.
Jan 21 23:27:47 compute-0 systemd[1]: Reloading.
Jan 21 23:27:47 compute-0 systemd-sysv-generator[95140]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:27:47 compute-0 systemd[95086]: Queued start job for default target Main User Target.
Jan 21 23:27:47 compute-0 systemd-rc-local-generator[95136]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:27:47 compute-0 systemd[95086]: Created slice User Application Slice.
Jan 21 23:27:47 compute-0 systemd[95086]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 21 23:27:47 compute-0 systemd[95086]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:27:47 compute-0 systemd[95086]: Reached target Paths.
Jan 21 23:27:47 compute-0 systemd[95086]: Reached target Timers.
Jan 21 23:27:47 compute-0 systemd[95086]: Starting D-Bus User Message Bus Socket...
Jan 21 23:27:47 compute-0 systemd[95086]: Starting Create User's Volatile Files and Directories...
Jan 21 23:27:47 compute-0 systemd[95086]: Finished Create User's Volatile Files and Directories.
Jan 21 23:27:47 compute-0 systemd[95086]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:27:47 compute-0 systemd[95086]: Reached target Sockets.
Jan 21 23:27:47 compute-0 systemd[95086]: Reached target Basic System.
Jan 21 23:27:47 compute-0 systemd[95086]: Reached target Main User Target.
Jan 21 23:27:47 compute-0 systemd[95086]: Startup finished in 151ms.
Jan 21 23:27:47 compute-0 systemd[1]: Started User Manager for UID 0.
Jan 21 23:27:47 compute-0 systemd[1]: Started ovn_controller container.
Jan 21 23:27:47 compute-0 systemd[1]: Started Session c1 of User root.
Jan 21 23:27:47 compute-0 sudo[94989]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:47 compute-0 ovn_controller[95047]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 23:27:47 compute-0 ovn_controller[95047]: INFO:__main__:Validating config file
Jan 21 23:27:47 compute-0 ovn_controller[95047]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 23:27:47 compute-0 ovn_controller[95047]: INFO:__main__:Writing out command to execute
Jan 21 23:27:47 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 21 23:27:47 compute-0 ovn_controller[95047]: ++ cat /run_command
Jan 21 23:27:47 compute-0 ovn_controller[95047]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 21 23:27:47 compute-0 ovn_controller[95047]: + ARGS=
Jan 21 23:27:47 compute-0 ovn_controller[95047]: + sudo kolla_copy_cacerts
Jan 21 23:27:47 compute-0 systemd[1]: Started Session c2 of User root.
Jan 21 23:27:47 compute-0 sshd-session[95098]: Invalid user www-data from 188.166.69.60 port 59922
Jan 21 23:27:47 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 21 23:27:47 compute-0 ovn_controller[95047]: + [[ ! -n '' ]]
Jan 21 23:27:47 compute-0 ovn_controller[95047]: + . kolla_extend_start
Jan 21 23:27:47 compute-0 ovn_controller[95047]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 21 23:27:47 compute-0 ovn_controller[95047]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 21 23:27:47 compute-0 ovn_controller[95047]: + umask 0022
Jan 21 23:27:47 compute-0 ovn_controller[95047]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 21 23:27:47 compute-0 NetworkManager[55139]: <info>  [1769038067.7032] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 21 23:27:47 compute-0 NetworkManager[55139]: <info>  [1769038067.7041] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:27:47 compute-0 NetworkManager[55139]: <warn>  [1769038067.7044] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:27:47 compute-0 NetworkManager[55139]: <info>  [1769038067.7049] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 21 23:27:47 compute-0 NetworkManager[55139]: <info>  [1769038067.7053] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 21 23:27:47 compute-0 NetworkManager[55139]: <info>  [1769038067.7055] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 21 23:27:47 compute-0 kernel: br-int: entered promiscuous mode
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 23:27:47 compute-0 ovn_controller[95047]: 2026-01-21T23:27:47Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 21 23:27:47 compute-0 NetworkManager[55139]: <info>  [1769038067.7266] manager: (ovn-f0bd48-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 21 23:27:47 compute-0 systemd-udevd[95183]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:27:47 compute-0 systemd-udevd[95186]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:27:47 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Jan 21 23:27:47 compute-0 NetworkManager[55139]: <info>  [1769038067.7417] device (genev_sys_6081): carrier: link connected
Jan 21 23:27:47 compute-0 NetworkManager[55139]: <info>  [1769038067.7420] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Jan 21 23:27:47 compute-0 sshd-session[95098]: Connection closed by invalid user www-data 188.166.69.60 port 59922 [preauth]
Jan 21 23:27:47 compute-0 NetworkManager[55139]: <info>  [1769038067.9882] manager: (ovn-ce4b29-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 21 23:27:48 compute-0 NetworkManager[55139]: <info>  [1769038068.3978] manager: (ovn-74526b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 21 23:27:49 compute-0 python3.9[95314]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:27:50 compute-0 sudo[95464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hstffzoupimuzzbogganrexydhjzpxdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038070.0202522-1832-10098018055541/AnsiballZ_stat.py'
Jan 21 23:27:50 compute-0 sudo[95464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:50 compute-0 python3.9[95466]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:27:50 compute-0 sudo[95464]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:50 compute-0 sudo[95587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuhtcrgttganaykvpackdvfxoldjzkti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038070.0202522-1832-10098018055541/AnsiballZ_copy.py'
Jan 21 23:27:50 compute-0 sudo[95587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:51 compute-0 python3.9[95589]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038070.0202522-1832-10098018055541/.source.yaml _original_basename=.2xns4yhm follow=False checksum=1fa0f89c2313d90a3d28193c1cbb0dd87b38dad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:27:51 compute-0 sudo[95587]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:51 compute-0 sudo[95739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viudszddbpnnzusymjfdftmfjmtwypog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038071.4587889-1877-179138812016151/AnsiballZ_command.py'
Jan 21 23:27:51 compute-0 sudo[95739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:51 compute-0 python3.9[95741]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:51 compute-0 ovs-vsctl[95742]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 21 23:27:51 compute-0 sudo[95739]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:52 compute-0 sudo[95892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shpcwsbwwugazhhziubbwxtozrakoyzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038072.2152936-1901-275378038560746/AnsiballZ_command.py'
Jan 21 23:27:52 compute-0 sudo[95892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:52 compute-0 python3.9[95894]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:52 compute-0 ovs-vsctl[95896]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 21 23:27:52 compute-0 sudo[95892]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:53 compute-0 sudo[96047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frjjqnzprrruvlciogazuwdmtxeqidqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038073.4293628-1943-276357092422090/AnsiballZ_command.py'
Jan 21 23:27:53 compute-0 sudo[96047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:27:53 compute-0 python3.9[96049]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:27:53 compute-0 ovs-vsctl[96050]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 21 23:27:54 compute-0 sudo[96047]: pam_unix(sudo:session): session closed for user root
Jan 21 23:27:54 compute-0 sshd-session[84567]: Connection closed by 192.168.122.30 port 51606
Jan 21 23:27:54 compute-0 sshd-session[84564]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:27:54 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Jan 21 23:27:54 compute-0 systemd[1]: session-19.scope: Consumed 47.794s CPU time.
Jan 21 23:27:54 compute-0 systemd-logind[784]: Session 19 logged out. Waiting for processes to exit.
Jan 21 23:27:54 compute-0 systemd-logind[784]: Removed session 19.
Jan 21 23:27:57 compute-0 systemd[1]: Stopping User Manager for UID 0...
Jan 21 23:27:57 compute-0 systemd[95086]: Activating special unit Exit the Session...
Jan 21 23:27:57 compute-0 systemd[95086]: Stopped target Main User Target.
Jan 21 23:27:57 compute-0 systemd[95086]: Stopped target Basic System.
Jan 21 23:27:57 compute-0 systemd[95086]: Stopped target Paths.
Jan 21 23:27:57 compute-0 systemd[95086]: Stopped target Sockets.
Jan 21 23:27:57 compute-0 systemd[95086]: Stopped target Timers.
Jan 21 23:27:57 compute-0 systemd[95086]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:27:57 compute-0 systemd[95086]: Closed D-Bus User Message Bus Socket.
Jan 21 23:27:57 compute-0 systemd[95086]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:27:57 compute-0 systemd[95086]: Removed slice User Application Slice.
Jan 21 23:27:57 compute-0 systemd[95086]: Reached target Shutdown.
Jan 21 23:27:57 compute-0 systemd[95086]: Finished Exit the Session.
Jan 21 23:27:57 compute-0 systemd[95086]: Reached target Exit the Session.
Jan 21 23:27:57 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Jan 21 23:27:57 compute-0 systemd[1]: Stopped User Manager for UID 0.
Jan 21 23:27:57 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 21 23:27:57 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 21 23:27:57 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 21 23:27:57 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 21 23:27:57 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Jan 21 23:27:59 compute-0 sshd-session[96077]: Accepted publickey for zuul from 192.168.122.30 port 42838 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:27:59 compute-0 systemd-logind[784]: New session 21 of user zuul.
Jan 21 23:27:59 compute-0 systemd[1]: Started Session 21 of User zuul.
Jan 21 23:27:59 compute-0 sshd-session[96077]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:28:01 compute-0 python3.9[96230]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:28:02 compute-0 sudo[96384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eomsyvfedcpsoyeauwhpaeslnxnanhkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038081.9344587-62-50533527272098/AnsiballZ_file.py'
Jan 21 23:28:02 compute-0 sudo[96384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:02 compute-0 python3.9[96386]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:02 compute-0 sudo[96384]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:03 compute-0 sudo[96536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmfqgasxwkuboqbubchaiwszjupdoyma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038082.7478888-62-242548894627964/AnsiballZ_file.py'
Jan 21 23:28:03 compute-0 sudo[96536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:03 compute-0 python3.9[96538]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:03 compute-0 sudo[96536]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:03 compute-0 sudo[96688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbgfpftjyrmhdzdyhqinaxfhirsdzegz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038083.4020715-62-258992569975193/AnsiballZ_file.py'
Jan 21 23:28:03 compute-0 sudo[96688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:03 compute-0 python3.9[96690]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:03 compute-0 sudo[96688]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:04 compute-0 sudo[96840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbutnbbpxcjgjqiuikgibboqkotdgrum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038084.0495353-62-223018226071219/AnsiballZ_file.py'
Jan 21 23:28:04 compute-0 sudo[96840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:04 compute-0 python3.9[96842]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:04 compute-0 sudo[96840]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:04 compute-0 sudo[96992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gopudggtrafnfbasexvwqrpuavkdnles ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038084.632048-62-213862579190688/AnsiballZ_file.py'
Jan 21 23:28:04 compute-0 sudo[96992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:05 compute-0 python3.9[96994]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:05 compute-0 sudo[96992]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:06 compute-0 python3.9[97145]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:28:06 compute-0 sudo[97295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcezqzgbrrbtrohpuqolgxabsurgemai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038086.427407-194-16982504273112/AnsiballZ_seboolean.py'
Jan 21 23:28:06 compute-0 sudo[97295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:07 compute-0 python3.9[97297]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 21 23:28:07 compute-0 sudo[97295]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:08 compute-0 python3.9[97447]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:09 compute-0 python3.9[97568]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038087.9539192-218-168709493257427/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:10 compute-0 python3.9[97718]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:10 compute-0 python3.9[97839]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038089.5952218-263-259580589642047/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:11 compute-0 sudo[97989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsvbbpndggvixcpnjeccocigmhudfetp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038091.1264176-314-64435058036101/AnsiballZ_setup.py'
Jan 21 23:28:11 compute-0 sudo[97989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:11 compute-0 python3.9[97991]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:28:12 compute-0 sudo[97989]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:12 compute-0 sudo[98073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-objmrsmpfobphywqmcwnwjioiggonnvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038091.1264176-314-64435058036101/AnsiballZ_dnf.py'
Jan 21 23:28:12 compute-0 sudo[98073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:12 compute-0 python3.9[98075]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:28:14 compute-0 sudo[98073]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:15 compute-0 sudo[98226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdvqtnvqcgzjojhqrhvslggjunxoaycr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038094.6231887-350-193587393310965/AnsiballZ_systemd.py'
Jan 21 23:28:15 compute-0 sudo[98226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:15 compute-0 python3.9[98228]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:28:15 compute-0 sudo[98226]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:16 compute-0 python3.9[98381]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:17 compute-0 python3.9[98502]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038095.9627903-374-131810175707612/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:17 compute-0 ovn_controller[95047]: 2026-01-21T23:28:17Z|00025|memory|INFO|15872 kB peak resident set size after 29.9 seconds
Jan 21 23:28:17 compute-0 ovn_controller[95047]: 2026-01-21T23:28:17Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 21 23:28:17 compute-0 podman[98626]: 2026-01-21 23:28:17.564387546 +0000 UTC m=+0.112767502 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:28:17 compute-0 python3.9[98663]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:18 compute-0 python3.9[98800]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038097.1839993-374-45912798865308/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:19 compute-0 python3.9[98950]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:20 compute-0 python3.9[99071]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038099.0298934-506-76652579803492/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:20 compute-0 python3.9[99221]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:21 compute-0 python3.9[99342]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038100.4195728-506-132645466681866/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:22 compute-0 python3.9[99492]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:28:22 compute-0 sudo[99644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eopbsyverbphdghqijsiwztxehprblmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038102.594719-620-191819423731886/AnsiballZ_file.py'
Jan 21 23:28:22 compute-0 sudo[99644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:23 compute-0 python3.9[99646]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:23 compute-0 sudo[99644]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:23 compute-0 sudo[99796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzcmyvfjstbvnqcouzymlazaiujzakaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038103.3495166-644-192908883309620/AnsiballZ_stat.py'
Jan 21 23:28:23 compute-0 sudo[99796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:23 compute-0 python3.9[99798]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:23 compute-0 sudo[99796]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:24 compute-0 sudo[99874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvihafdtipqtwovprzuvtdwtaoywexvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038103.3495166-644-192908883309620/AnsiballZ_file.py'
Jan 21 23:28:24 compute-0 sudo[99874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:24 compute-0 python3.9[99876]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:24 compute-0 sudo[99874]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:24 compute-0 sudo[100026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmdbrskspusknhwwwkgdnabytutkdunm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038104.4990802-644-56859804121224/AnsiballZ_stat.py'
Jan 21 23:28:24 compute-0 sudo[100026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:24 compute-0 python3.9[100028]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:25 compute-0 sudo[100026]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:25 compute-0 sudo[100104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goshrwrvabisxxfrpasycrxziqfaldop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038104.4990802-644-56859804121224/AnsiballZ_file.py'
Jan 21 23:28:25 compute-0 sudo[100104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:25 compute-0 python3.9[100106]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:25 compute-0 sudo[100104]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:26 compute-0 sudo[100256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sriiibkvbmyfanhisjpguvwjcwftsfhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038105.7813346-713-259884423485328/AnsiballZ_file.py'
Jan 21 23:28:26 compute-0 sudo[100256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:26 compute-0 python3.9[100258]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:26 compute-0 sudo[100256]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:26 compute-0 sudo[100408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhofbcjxkcywpuspznhwclzfgvbcaymj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038106.5862646-737-89185750046975/AnsiballZ_stat.py'
Jan 21 23:28:26 compute-0 sudo[100408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:27 compute-0 python3.9[100410]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:27 compute-0 sudo[100408]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:27 compute-0 sudo[100486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdffkakodesihjgklyoishdmvfztufun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038106.5862646-737-89185750046975/AnsiballZ_file.py'
Jan 21 23:28:27 compute-0 sudo[100486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:27 compute-0 python3.9[100488]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:27 compute-0 sudo[100486]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:28 compute-0 sudo[100638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htniqpnouoxhuqavkeihpbvqtwbfkexl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038107.8330727-773-135323941806260/AnsiballZ_stat.py'
Jan 21 23:28:28 compute-0 sudo[100638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:28 compute-0 python3.9[100640]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:28 compute-0 sudo[100638]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:28 compute-0 sudo[100716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hadtkjbyftjfgdbmhwqjcphtkwvrxifc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038107.8330727-773-135323941806260/AnsiballZ_file.py'
Jan 21 23:28:28 compute-0 sudo[100716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:28 compute-0 python3.9[100718]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:28 compute-0 sudo[100716]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:29 compute-0 sudo[100868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbrvtcailfmoitvfzwqplmncikuuucen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038109.0887341-809-162033174124674/AnsiballZ_systemd.py'
Jan 21 23:28:29 compute-0 sudo[100868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:29 compute-0 python3.9[100870]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:28:29 compute-0 systemd[1]: Reloading.
Jan 21 23:28:29 compute-0 systemd-rc-local-generator[100897]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:28:29 compute-0 systemd-sysv-generator[100903]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:28:29 compute-0 sudo[100868]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:30 compute-0 sudo[101059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwlhxonamktbfnrulmqhruuagolgwfxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038110.5678234-833-192338324040055/AnsiballZ_stat.py'
Jan 21 23:28:30 compute-0 sudo[101059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:31 compute-0 python3.9[101061]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:31 compute-0 sudo[101059]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:31 compute-0 sudo[101137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urncfbbywujvbmqxjaewwvdvkakdfing ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038110.5678234-833-192338324040055/AnsiballZ_file.py'
Jan 21 23:28:31 compute-0 sudo[101137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:31 compute-0 python3.9[101139]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:31 compute-0 sudo[101137]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:31 compute-0 sshd-session[101140]: Invalid user www-data from 188.166.69.60 port 48924
Jan 21 23:28:31 compute-0 sshd-session[101140]: Connection closed by invalid user www-data 188.166.69.60 port 48924 [preauth]
Jan 21 23:28:32 compute-0 sudo[101291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lypwhefpyvyoebqnxwidymheruchnehr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038111.7977018-869-144679149456583/AnsiballZ_stat.py'
Jan 21 23:28:32 compute-0 sudo[101291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:32 compute-0 python3.9[101293]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:32 compute-0 sudo[101291]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:32 compute-0 sudo[101369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbwynsrbgzevhdrxwozzaclpzetaucca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038111.7977018-869-144679149456583/AnsiballZ_file.py'
Jan 21 23:28:32 compute-0 sudo[101369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:32 compute-0 python3.9[101371]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:32 compute-0 sudo[101369]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:33 compute-0 sudo[101521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueinkzuohmeizvamlknntwxbdnzmocjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038113.1002135-905-76976174389447/AnsiballZ_systemd.py'
Jan 21 23:28:33 compute-0 sudo[101521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:33 compute-0 python3.9[101523]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:28:33 compute-0 systemd[1]: Reloading.
Jan 21 23:28:33 compute-0 systemd-sysv-generator[101553]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:28:33 compute-0 systemd-rc-local-generator[101548]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:28:34 compute-0 systemd[1]: Starting Create netns directory...
Jan 21 23:28:34 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 21 23:28:34 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 21 23:28:34 compute-0 systemd[1]: Finished Create netns directory.
Jan 21 23:28:34 compute-0 sudo[101521]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:34 compute-0 sudo[101714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mamicpwpfynxevmberqbidhcjjfymgta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038114.434129-935-173000704681849/AnsiballZ_file.py'
Jan 21 23:28:34 compute-0 sudo[101714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:34 compute-0 python3.9[101716]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:34 compute-0 sudo[101714]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:35 compute-0 sudo[101866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcwphbgisihkswrujvvjsjtjrthuksqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038115.2253337-959-191519690863639/AnsiballZ_stat.py'
Jan 21 23:28:35 compute-0 sudo[101866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:35 compute-0 python3.9[101868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:35 compute-0 sudo[101866]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:36 compute-0 sudo[101989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixwxpfarblxkvjgvkeiuzssndtgnvhjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038115.2253337-959-191519690863639/AnsiballZ_copy.py'
Jan 21 23:28:36 compute-0 sudo[101989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:36 compute-0 python3.9[101991]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038115.2253337-959-191519690863639/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:36 compute-0 sudo[101989]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:37 compute-0 sudo[102141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flgzdyrffhbpzqlqspboxktsqnjlowzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038116.882129-1010-267139049742119/AnsiballZ_file.py'
Jan 21 23:28:37 compute-0 sudo[102141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:37 compute-0 python3.9[102143]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:37 compute-0 sudo[102141]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:37 compute-0 sudo[102293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftohommdacloofgbgbgqijsagwdosmqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038117.5519557-1034-39007017105451/AnsiballZ_file.py'
Jan 21 23:28:37 compute-0 sudo[102293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:38 compute-0 python3.9[102295]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:28:38 compute-0 sudo[102293]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:38 compute-0 sudo[102445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lptbsgfgaambvcvvtcoabrppcymjwkht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038118.2977536-1058-79920000581828/AnsiballZ_stat.py'
Jan 21 23:28:38 compute-0 sudo[102445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:38 compute-0 python3.9[102447]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:28:38 compute-0 sudo[102445]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:39 compute-0 sudo[102568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dphccoejudducnccirtvotldwmfczklr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038118.2977536-1058-79920000581828/AnsiballZ_copy.py'
Jan 21 23:28:39 compute-0 sudo[102568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:39 compute-0 python3.9[102570]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038118.2977536-1058-79920000581828/.source.json _original_basename=.89kqmmdc follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:39 compute-0 sudo[102568]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:40 compute-0 python3.9[102720]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:42 compute-0 sudo[103141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwlkegahiwniltdndydhmptwhktustez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038121.883222-1178-10171471026075/AnsiballZ_container_config_data.py'
Jan 21 23:28:42 compute-0 sudo[103141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:42 compute-0 python3.9[103143]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 21 23:28:42 compute-0 sudo[103141]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:43 compute-0 sudo[103293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjhcomdzelrxqmaeyfjqieafbpikketm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038123.0079305-1211-228374732712054/AnsiballZ_container_config_hash.py'
Jan 21 23:28:43 compute-0 sudo[103293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:43 compute-0 python3.9[103295]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:28:44 compute-0 sudo[103293]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:44 compute-0 sudo[103445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cudsqyzqgvdrmmslkpxqnjdpawanupip ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038124.3652103-1241-45366908675461/AnsiballZ_edpm_container_manage.py'
Jan 21 23:28:44 compute-0 sudo[103445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:45 compute-0 python3[103447]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:28:51 compute-0 podman[103504]: 2026-01-21 23:28:51.56916631 +0000 UTC m=+3.934145080 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:28:53 compute-0 podman[103460]: 2026-01-21 23:28:53.341674346 +0000 UTC m=+7.968754874 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:28:53 compute-0 podman[103588]: 2026-01-21 23:28:53.540824137 +0000 UTC m=+0.073952568 container create 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 21 23:28:53 compute-0 podman[103588]: 2026-01-21 23:28:53.502974343 +0000 UTC m=+0.036102844 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:28:53 compute-0 python3[103447]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:28:53 compute-0 sudo[103445]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:56 compute-0 sudo[103776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpqmhfusyfjxftsvdpsulztweoqxpcjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038136.3270738-1265-97958002694792/AnsiballZ_stat.py'
Jan 21 23:28:56 compute-0 sudo[103776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:56 compute-0 python3.9[103778]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:28:56 compute-0 sudo[103776]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:57 compute-0 sudo[103930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjnhkoofkvexrmpwxrbjelnexcpczbzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038137.1739388-1292-241929862106604/AnsiballZ_file.py'
Jan 21 23:28:57 compute-0 sudo[103930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:57 compute-0 python3.9[103932]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:57 compute-0 sudo[103930]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:57 compute-0 sudo[104006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixbkdqgiqiwdfezpxbrymghrubxqstcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038137.1739388-1292-241929862106604/AnsiballZ_stat.py'
Jan 21 23:28:57 compute-0 sudo[104006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:58 compute-0 python3.9[104008]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:28:58 compute-0 sudo[104006]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:58 compute-0 sudo[104157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qutrgrjrtviqmrjczfydizejthnrrpqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038138.263542-1292-125504273771154/AnsiballZ_copy.py'
Jan 21 23:28:58 compute-0 sudo[104157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:58 compute-0 python3.9[104159]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038138.263542-1292-125504273771154/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:28:59 compute-0 sudo[104157]: pam_unix(sudo:session): session closed for user root
Jan 21 23:28:59 compute-0 sudo[104233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueznjrnwkpcxktwdkrutbswiaxjoiujo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038138.263542-1292-125504273771154/AnsiballZ_systemd.py'
Jan 21 23:28:59 compute-0 sudo[104233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:28:59 compute-0 python3.9[104235]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:28:59 compute-0 systemd[1]: Reloading.
Jan 21 23:28:59 compute-0 systemd-sysv-generator[104264]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:28:59 compute-0 systemd-rc-local-generator[104256]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:28:59 compute-0 sudo[104233]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:00 compute-0 sudo[104344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aarrvoacjvawypjrvigoseznzwmiozoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038138.263542-1292-125504273771154/AnsiballZ_systemd.py'
Jan 21 23:29:00 compute-0 sudo[104344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:00 compute-0 python3.9[104346]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:00 compute-0 systemd[1]: Reloading.
Jan 21 23:29:00 compute-0 systemd-rc-local-generator[104373]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:29:00 compute-0 systemd-sysv-generator[104377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:29:00 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Jan 21 23:29:00 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:29:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ee549c88dd9e1e48fb01da66c2a4e40761f451b3d0808c732cdfbac417e1ee0/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 21 23:29:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ee549c88dd9e1e48fb01da66c2a4e40761f451b3d0808c732cdfbac417e1ee0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:29:01 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c.
Jan 21 23:29:01 compute-0 podman[104387]: 2026-01-21 23:29:01.00891523 +0000 UTC m=+0.134337928 container init 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: + sudo -E kolla_set_configs
Jan 21 23:29:01 compute-0 podman[104387]: 2026-01-21 23:29:01.035649714 +0000 UTC m=+0.161072392 container start 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:29:01 compute-0 edpm-start-podman-container[104387]: ovn_metadata_agent
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Validating config file
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Copying service configuration files
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Writing out command to execute
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 21 23:29:01 compute-0 podman[104410]: 2026-01-21 23:29:01.113821059 +0000 UTC m=+0.063319660 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: ++ cat /run_command
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: + CMD=neutron-ovn-metadata-agent
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: + ARGS=
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: + sudo kolla_copy_cacerts
Jan 21 23:29:01 compute-0 edpm-start-podman-container[104386]: Creating additional drop-in dependency for "ovn_metadata_agent" (86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c)
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: + [[ ! -n '' ]]
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: + . kolla_extend_start
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: Running command: 'neutron-ovn-metadata-agent'
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: + umask 0022
Jan 21 23:29:01 compute-0 ovn_metadata_agent[104403]: + exec neutron-ovn-metadata-agent
Jan 21 23:29:01 compute-0 systemd[1]: Reloading.
Jan 21 23:29:01 compute-0 systemd-sysv-generator[104484]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:29:01 compute-0 systemd-rc-local-generator[104481]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:29:01 compute-0 systemd[1]: Started ovn_metadata_agent container.
Jan 21 23:29:01 compute-0 sudo[104344]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:02 compute-0 python3.9[104641]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.115 104408 INFO neutron.common.config [-] Logging enabled!
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.115 104408 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.115 104408 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.116 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.116 104408 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.116 104408 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.116 104408 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.116 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.117 104408 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.117 104408 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.117 104408 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.117 104408 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.117 104408 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.117 104408 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.117 104408 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.117 104408 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.117 104408 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.118 104408 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.118 104408 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.118 104408 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.118 104408 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.118 104408 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.118 104408 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.118 104408 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.119 104408 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.119 104408 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.119 104408 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.119 104408 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.119 104408 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.119 104408 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.119 104408 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.119 104408 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.119 104408 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.120 104408 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.120 104408 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.120 104408 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.120 104408 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.120 104408 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.120 104408 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.120 104408 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.120 104408 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.120 104408 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.121 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.121 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.121 104408 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.121 104408 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.121 104408 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.121 104408 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.121 104408 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.121 104408 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.121 104408 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.121 104408 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.122 104408 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.122 104408 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.122 104408 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.122 104408 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.122 104408 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.122 104408 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.122 104408 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.122 104408 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.122 104408 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.123 104408 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.123 104408 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.123 104408 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.123 104408 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.123 104408 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.123 104408 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.123 104408 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.123 104408 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.123 104408 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.123 104408 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.124 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.124 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.124 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.124 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.124 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.124 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.124 104408 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.124 104408 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.125 104408 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.125 104408 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.125 104408 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.125 104408 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.125 104408 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.125 104408 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.125 104408 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.125 104408 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.125 104408 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.125 104408 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.126 104408 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.126 104408 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.126 104408 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.126 104408 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.126 104408 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.126 104408 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.126 104408 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.126 104408 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.127 104408 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.127 104408 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.127 104408 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.127 104408 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.127 104408 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.127 104408 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.127 104408 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.127 104408 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.128 104408 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.128 104408 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.128 104408 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.128 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.128 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.128 104408 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.128 104408 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.128 104408 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.129 104408 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.129 104408 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.129 104408 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.129 104408 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.129 104408 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.129 104408 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.129 104408 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.129 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.130 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.130 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.130 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.130 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.130 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.130 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.130 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.131 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.131 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.131 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.131 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.131 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.131 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.131 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.132 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.132 104408 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.132 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.132 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.132 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.132 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.133 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.133 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.133 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.133 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.133 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.133 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.134 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.134 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.134 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.134 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.134 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.134 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.134 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.135 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.135 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.135 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.135 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.135 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.135 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.135 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.136 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.136 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.136 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.136 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.136 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.136 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.136 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.137 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.137 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.137 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.137 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.137 104408 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.137 104408 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.138 104408 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.138 104408 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.138 104408 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.138 104408 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.138 104408 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.138 104408 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.138 104408 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.139 104408 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.139 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.139 104408 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.139 104408 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.139 104408 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.139 104408 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.139 104408 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.140 104408 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.140 104408 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.140 104408 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.140 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.140 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.140 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.140 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.141 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.141 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.141 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.141 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.141 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.141 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.141 104408 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.142 104408 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.142 104408 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.142 104408 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.142 104408 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.142 104408 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.142 104408 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.142 104408 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.142 104408 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.142 104408 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.143 104408 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.143 104408 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.143 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.143 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.143 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.143 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.143 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.143 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.144 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.144 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.144 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.144 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.144 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.144 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.144 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.144 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.145 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.145 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.145 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.145 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.145 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.145 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.145 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.146 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.146 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.146 104408 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.146 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.146 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.146 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.146 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.146 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.147 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.147 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.147 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.147 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.147 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.147 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.148 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.148 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.148 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.148 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.148 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.148 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.148 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.149 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.149 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.149 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.149 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.149 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.149 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.149 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.149 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.149 104408 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.150 104408 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.150 104408 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.150 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.150 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.150 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.150 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.150 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.151 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.151 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.151 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.151 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.151 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.151 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.151 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.152 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.152 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.152 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.152 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.152 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.152 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.152 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.152 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.153 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.153 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.153 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.153 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.153 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.153 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.153 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.153 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.154 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.154 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.154 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.154 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.154 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.154 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.154 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.154 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.155 104408 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.155 104408 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.166 104408 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.166 104408 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.166 104408 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.167 104408 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.167 104408 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.182 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 7f404a2f-20ba-4b9b-88d6-fa3588630efa (UUID: 7f404a2f-20ba-4b9b-88d6-fa3588630efa) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.210 104408 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.210 104408 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.210 104408 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.210 104408 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.214 104408 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.221 104408 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.228 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '7f404a2f-20ba-4b9b-88d6-fa3588630efa'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], external_ids={}, name=7f404a2f-20ba-4b9b-88d6-fa3588630efa, nb_cfg_timestamp=1769038075734, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.229 104408 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f9e44de3b80>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.230 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.230 104408 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.231 104408 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.231 104408 INFO oslo_service.service [-] Starting 1 workers
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.235 104408 DEBUG oslo_service.service [-] Started child 104736 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.238 104736 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-1946256'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.241 104408 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpetajdl5t/privsep.sock']
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.264 104736 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.265 104736 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.265 104736 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.279 104736 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.285 104736 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 21 23:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.291 104736 INFO eventlet.wsgi.server [-] (104736) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 21 23:29:03 compute-0 sudo[104795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llcxmrrlzsvcyxwwjtbhrjuekdunijml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038143.1109326-1427-97439489247822/AnsiballZ_stat.py'
Jan 21 23:29:03 compute-0 sudo[104795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:03 compute-0 python3.9[104797]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:29:03 compute-0 sudo[104795]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:03 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 21 23:29:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:04.004 104408 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 21 23:29:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:04.005 104408 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpetajdl5t/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 21 23:29:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.840 104855 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 23:29:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.844 104855 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 23:29:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.846 104855 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 21 23:29:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:03.846 104855 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104855
Jan 21 23:29:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:04.010 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[019a9338-2410-46aa-ba3a-2578e5532804]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:29:04 compute-0 sudo[104925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcitequsitxjulmcvgmpqzocayyyeiyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038143.1109326-1427-97439489247822/AnsiballZ_copy.py'
Jan 21 23:29:04 compute-0 sudo[104925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:04 compute-0 python3.9[104927]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038143.1109326-1427-97439489247822/.source.yaml _original_basename=.h4_c75vf follow=False checksum=3944cba7eabd17aea9c4028b478ec4257c60bf08 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:04 compute-0 sudo[104925]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:04.532 104855 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:29:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:04.533 104855 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:29:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:04.533 104855 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.117 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[0b1fbcf5-4906-48cd-83d3-2f22f12c2b50]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.122 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, column=external_ids, values=({'neutron:ovn-metadata-id': '0b04f086-4ce5-53ee-9807-6dcea114b03d'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:29:05 compute-0 sshd-session[96080]: Connection closed by 192.168.122.30 port 42838
Jan 21 23:29:05 compute-0 sshd-session[96077]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:29:05 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Jan 21 23:29:05 compute-0 systemd[1]: session-21.scope: Consumed 56.132s CPU time.
Jan 21 23:29:05 compute-0 systemd-logind[784]: Session 21 logged out. Waiting for processes to exit.
Jan 21 23:29:05 compute-0 systemd-logind[784]: Removed session 21.
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.200 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.207 104408 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.207 104408 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.207 104408 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.207 104408 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.207 104408 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.207 104408 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.208 104408 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.208 104408 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.208 104408 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.208 104408 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.208 104408 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.208 104408 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.208 104408 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.208 104408 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.208 104408 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.209 104408 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.209 104408 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.209 104408 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.209 104408 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.209 104408 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.209 104408 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.210 104408 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.210 104408 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.210 104408 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.210 104408 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.210 104408 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.210 104408 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.210 104408 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.211 104408 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.211 104408 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.211 104408 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.211 104408 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.211 104408 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.211 104408 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.212 104408 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.212 104408 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.212 104408 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.212 104408 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.212 104408 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.212 104408 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.213 104408 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.213 104408 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.213 104408 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.213 104408 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.213 104408 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.213 104408 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.213 104408 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.213 104408 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.213 104408 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.214 104408 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.214 104408 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.214 104408 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.214 104408 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.214 104408 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.214 104408 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.214 104408 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.214 104408 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.214 104408 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.215 104408 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.215 104408 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.215 104408 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.215 104408 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.215 104408 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.215 104408 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.215 104408 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.215 104408 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.215 104408 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.216 104408 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.216 104408 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.216 104408 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.216 104408 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.216 104408 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.216 104408 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.216 104408 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.216 104408 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.216 104408 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.217 104408 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.217 104408 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.217 104408 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.217 104408 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.217 104408 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.217 104408 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.217 104408 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.217 104408 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.217 104408 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.217 104408 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.218 104408 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.218 104408 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.218 104408 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.218 104408 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.218 104408 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.218 104408 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.218 104408 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.218 104408 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.218 104408 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.219 104408 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.219 104408 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.219 104408 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.219 104408 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.219 104408 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.219 104408 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.219 104408 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.219 104408 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.219 104408 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.220 104408 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.220 104408 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.220 104408 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.220 104408 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.220 104408 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.220 104408 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.221 104408 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.221 104408 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.221 104408 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.221 104408 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.221 104408 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.221 104408 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.221 104408 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.222 104408 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.222 104408 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.222 104408 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.222 104408 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.222 104408 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.222 104408 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.223 104408 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.223 104408 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.223 104408 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.223 104408 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.223 104408 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.223 104408 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.223 104408 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.224 104408 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.224 104408 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.224 104408 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.224 104408 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.224 104408 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.224 104408 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.224 104408 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.224 104408 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.225 104408 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.225 104408 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.225 104408 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.225 104408 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.225 104408 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.225 104408 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.225 104408 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.225 104408 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.225 104408 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.226 104408 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.226 104408 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.226 104408 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.226 104408 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.226 104408 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.226 104408 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.226 104408 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.226 104408 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.226 104408 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.226 104408 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.227 104408 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.227 104408 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.227 104408 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.227 104408 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.227 104408 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.227 104408 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.227 104408 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.227 104408 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.227 104408 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.227 104408 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.228 104408 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.228 104408 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.228 104408 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.228 104408 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.228 104408 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.228 104408 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.228 104408 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.229 104408 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.229 104408 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.229 104408 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.229 104408 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.229 104408 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.229 104408 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.229 104408 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.229 104408 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.230 104408 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.230 104408 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.230 104408 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.230 104408 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.230 104408 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.230 104408 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.230 104408 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.230 104408 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.231 104408 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.231 104408 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.231 104408 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.231 104408 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.231 104408 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.231 104408 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.231 104408 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.231 104408 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.231 104408 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.232 104408 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.232 104408 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.232 104408 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.232 104408 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.232 104408 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.232 104408 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.232 104408 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.232 104408 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.232 104408 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.233 104408 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.233 104408 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.233 104408 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.233 104408 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.233 104408 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.233 104408 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.233 104408 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.234 104408 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.234 104408 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.234 104408 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.234 104408 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.234 104408 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.234 104408 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.234 104408 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.234 104408 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.234 104408 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.235 104408 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.235 104408 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.235 104408 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.235 104408 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.235 104408 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.235 104408 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.235 104408 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.235 104408 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.235 104408 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.235 104408 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.236 104408 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.236 104408 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.236 104408 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.236 104408 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.236 104408 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.236 104408 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.236 104408 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.236 104408 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.237 104408 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.237 104408 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.237 104408 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.237 104408 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.237 104408 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.237 104408 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.237 104408 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.238 104408 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.238 104408 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.238 104408 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.238 104408 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.238 104408 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.238 104408 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.238 104408 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.238 104408 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.239 104408 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.239 104408 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.239 104408 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.239 104408 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.239 104408 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.239 104408 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.239 104408 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.239 104408 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.240 104408 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.240 104408 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.240 104408 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.240 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.240 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.240 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.240 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.240 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.241 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.241 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.241 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.241 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.241 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.241 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.241 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.241 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.241 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.242 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.242 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.242 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.242 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.242 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.242 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.242 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.242 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.242 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.242 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.243 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.243 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.243 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.243 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.243 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.243 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.243 104408 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.243 104408 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.244 104408 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.244 104408 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.244 104408 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:29:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:29:05.244 104408 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:29:10 compute-0 sshd-session[104953]: Accepted publickey for zuul from 192.168.122.30 port 46194 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:29:10 compute-0 systemd-logind[784]: New session 22 of user zuul.
Jan 21 23:29:10 compute-0 systemd[1]: Started Session 22 of User zuul.
Jan 21 23:29:10 compute-0 sshd-session[104953]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:29:11 compute-0 python3.9[105106]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:29:12 compute-0 sudo[105260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soxpbnnxxkibbkfrkgfmpkcmqvploflx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038152.363782-62-234819116237628/AnsiballZ_command.py'
Jan 21 23:29:12 compute-0 sudo[105260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:13 compute-0 python3.9[105262]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:13 compute-0 sudo[105260]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:15 compute-0 sudo[105427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odvtjwxkobaspoprrexrxuliijjdpemu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038154.360859-95-199449137451538/AnsiballZ_systemd_service.py'
Jan 21 23:29:15 compute-0 sudo[105427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:15 compute-0 python3.9[105429]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:29:15 compute-0 systemd[1]: Reloading.
Jan 21 23:29:15 compute-0 systemd-sysv-generator[105459]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:29:15 compute-0 systemd-rc-local-generator[105454]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:29:15 compute-0 sshd-session[105382]: Invalid user www-data from 188.166.69.60 port 57612
Jan 21 23:29:15 compute-0 sudo[105427]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:15 compute-0 sshd-session[105382]: Connection closed by invalid user www-data 188.166.69.60 port 57612 [preauth]
Jan 21 23:29:16 compute-0 python3.9[105614]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:29:17 compute-0 network[105631]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:29:17 compute-0 network[105632]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:29:17 compute-0 network[105633]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:29:21 compute-0 sudo[105892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djbkgidbfyyintvrjghlhncjjcywcvgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038161.1550994-152-125308906310712/AnsiballZ_systemd_service.py'
Jan 21 23:29:21 compute-0 sudo[105892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:21 compute-0 python3.9[105894]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:21 compute-0 sudo[105892]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:22 compute-0 sudo[106045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqjxmxvfpckljjrytecmicvwqyfsnpxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038161.9821596-152-270976439864947/AnsiballZ_systemd_service.py'
Jan 21 23:29:22 compute-0 sudo[106045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:22 compute-0 python3.9[106047]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:22 compute-0 sudo[106045]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:23 compute-0 sudo[106210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uygwrvxsjgihwdsdfpivtnewhwrbjhgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038162.853823-152-146319984892707/AnsiballZ_systemd_service.py'
Jan 21 23:29:23 compute-0 sudo[106210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:23 compute-0 podman[106173]: 2026-01-21 23:29:23.276942758 +0000 UTC m=+0.114822022 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 23:29:24 compute-0 python3.9[106218]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:24 compute-0 sudo[106210]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:25 compute-0 sudo[106379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxuuxikxgkfjtwyiozuckefzcqjnbybb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038164.6556668-152-44221166728363/AnsiballZ_systemd_service.py'
Jan 21 23:29:25 compute-0 sudo[106379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:25 compute-0 python3.9[106381]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:25 compute-0 sudo[106379]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:26 compute-0 sudo[106532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byjpvzwgatbygwwqmaqsirgymaakextv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038166.0592234-152-226622658758078/AnsiballZ_systemd_service.py'
Jan 21 23:29:26 compute-0 sudo[106532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:26 compute-0 python3.9[106534]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:26 compute-0 sudo[106532]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:27 compute-0 sudo[106685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oamnwfkrpzgytryqlyzgxiwgqnmliemq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038166.9433107-152-86893935732002/AnsiballZ_systemd_service.py'
Jan 21 23:29:27 compute-0 sudo[106685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:27 compute-0 python3.9[106687]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:27 compute-0 sudo[106685]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:28 compute-0 sudo[106838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpcowqmxuhvaaixdvkphcncaiohtextw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038167.8605778-152-204565543744526/AnsiballZ_systemd_service.py'
Jan 21 23:29:28 compute-0 sudo[106838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:28 compute-0 python3.9[106840]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:29:28 compute-0 sudo[106838]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:29 compute-0 sudo[106991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mznjstvspiadyzjljowpdokayuzjzdiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038169.2275078-308-55425190224888/AnsiballZ_file.py'
Jan 21 23:29:29 compute-0 sudo[106991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:29 compute-0 python3.9[106993]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:29 compute-0 sudo[106991]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:30 compute-0 sudo[107143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooytvxlcovvieipaauqugxhikuywntcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038170.078938-308-131808328112272/AnsiballZ_file.py'
Jan 21 23:29:30 compute-0 sudo[107143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:30 compute-0 python3.9[107145]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:30 compute-0 sudo[107143]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:31 compute-0 sudo[107295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlcxvivxzjfhhrkfcovnbllzgvlowrot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038170.6928563-308-119761022125529/AnsiballZ_file.py'
Jan 21 23:29:31 compute-0 sudo[107295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:31 compute-0 python3.9[107297]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:31 compute-0 sudo[107295]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:31 compute-0 sudo[107458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fngystvjcgefsfityfcntdjwywzzzwlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038171.3546395-308-26513103098089/AnsiballZ_file.py'
Jan 21 23:29:31 compute-0 sudo[107458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:31 compute-0 podman[107417]: 2026-01-21 23:29:31.737734149 +0000 UTC m=+0.096317539 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:29:31 compute-0 python3.9[107464]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:31 compute-0 sudo[107458]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:32 compute-0 sudo[107618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgmemkjxnzqdrnuasxyuyfwmdrcisrlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038172.0601208-308-60061593456749/AnsiballZ_file.py'
Jan 21 23:29:32 compute-0 sudo[107618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:32 compute-0 python3.9[107620]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:32 compute-0 sudo[107618]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:33 compute-0 sudo[107770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uteexeslbaujshwcssaijtlmbibbjaca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038172.7849956-308-227439539301120/AnsiballZ_file.py'
Jan 21 23:29:33 compute-0 sudo[107770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:33 compute-0 python3.9[107772]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:33 compute-0 sudo[107770]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:33 compute-0 sudo[107922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsnwdtcybvkprvqhhzzxrieoxufneemz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038173.4988923-308-6522273643937/AnsiballZ_file.py'
Jan 21 23:29:33 compute-0 sudo[107922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:34 compute-0 python3.9[107924]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:34 compute-0 sudo[107922]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:35 compute-0 sudo[108074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkxvdvouxkalbonvknytgmjrbskocirl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038174.745158-458-156175086736988/AnsiballZ_file.py'
Jan 21 23:29:35 compute-0 sudo[108074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:35 compute-0 python3.9[108076]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:35 compute-0 sudo[108074]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:35 compute-0 sudo[108226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cievgwwsdsrytkfgxdsjttbbhcbhylfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038175.5079353-458-174703220072018/AnsiballZ_file.py'
Jan 21 23:29:35 compute-0 sudo[108226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:36 compute-0 python3.9[108228]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:36 compute-0 sudo[108226]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:36 compute-0 sudo[108378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxvaxjqbpqnqnhrzrpdpxejxlbfeovkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038176.2202828-458-161704620831986/AnsiballZ_file.py'
Jan 21 23:29:36 compute-0 sudo[108378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:36 compute-0 python3.9[108380]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:36 compute-0 sudo[108378]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:37 compute-0 sudo[108530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnjduevlbosjpnchjtauuaahyttokekk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038176.8962557-458-243647465509647/AnsiballZ_file.py'
Jan 21 23:29:37 compute-0 sudo[108530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:37 compute-0 python3.9[108532]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:37 compute-0 sudo[108530]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:38 compute-0 sudo[108682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awpkstunmeywxwmwrbsgccteiktxhezm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038177.6712563-458-134850872655230/AnsiballZ_file.py'
Jan 21 23:29:38 compute-0 sudo[108682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:38 compute-0 python3.9[108684]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:38 compute-0 sudo[108682]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:38 compute-0 sudo[108834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqytjgttdjfcyhgbsujcqqxbvaokhqqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038178.389853-458-114700312570822/AnsiballZ_file.py'
Jan 21 23:29:38 compute-0 sudo[108834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:38 compute-0 python3.9[108836]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:38 compute-0 sudo[108834]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:39 compute-0 sudo[108986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfiravdvbakpjxixpybycdarphvzcyuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038179.1500542-458-71083340447320/AnsiballZ_file.py'
Jan 21 23:29:39 compute-0 sudo[108986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:39 compute-0 python3.9[108988]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:29:39 compute-0 sudo[108986]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:40 compute-0 sudo[109138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibwrfsmqewwrhrkpmesfzsinaanenwia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038180.0523136-611-216348882629613/AnsiballZ_command.py'
Jan 21 23:29:40 compute-0 sudo[109138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:40 compute-0 python3.9[109140]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:40 compute-0 sudo[109138]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:41 compute-0 python3.9[109292]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:29:42 compute-0 sudo[109442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqxfnlzrtxstmtkmfdlpdoxvlfouftkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038182.1098583-665-156231405707655/AnsiballZ_systemd_service.py'
Jan 21 23:29:42 compute-0 sudo[109442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:42 compute-0 python3.9[109444]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:29:42 compute-0 systemd[1]: Reloading.
Jan 21 23:29:42 compute-0 systemd-rc-local-generator[109466]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:29:42 compute-0 systemd-sysv-generator[109473]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:29:43 compute-0 sudo[109442]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:43 compute-0 sudo[109628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnrtphyjsfgpntcbunenwbgjgltaiyin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038183.30468-689-239051932695408/AnsiballZ_command.py'
Jan 21 23:29:43 compute-0 sudo[109628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:43 compute-0 python3.9[109630]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:43 compute-0 sudo[109628]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:44 compute-0 sudo[109781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhaqwhqpwhxxxofjjsfwwwuhwsccjnlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038184.0413055-689-128217419226617/AnsiballZ_command.py'
Jan 21 23:29:44 compute-0 sudo[109781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:44 compute-0 python3.9[109783]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:44 compute-0 sudo[109781]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:45 compute-0 sudo[109934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdwvrizxpafyzquunulvyqgyiiwndrza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038184.7997835-689-29477640203701/AnsiballZ_command.py'
Jan 21 23:29:45 compute-0 sudo[109934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:45 compute-0 python3.9[109936]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:45 compute-0 sudo[109934]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:45 compute-0 sudo[110087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkfcsnjrzpgirkjdjfnharrvsqyidgsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038185.5104134-689-216593748357275/AnsiballZ_command.py'
Jan 21 23:29:45 compute-0 sudo[110087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:45 compute-0 python3.9[110089]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:46 compute-0 sudo[110087]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:46 compute-0 sudo[110240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkllyaclwhbnwvqgefxpwxnyvbcfzvpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038186.4075985-689-6886698858588/AnsiballZ_command.py'
Jan 21 23:29:46 compute-0 sudo[110240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:46 compute-0 python3.9[110242]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:47 compute-0 sudo[110240]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:47 compute-0 sudo[110393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yksjtkyyygrbmxxtbyenybatednketpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038187.210409-689-54265267704320/AnsiballZ_command.py'
Jan 21 23:29:47 compute-0 sudo[110393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:47 compute-0 python3.9[110395]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:47 compute-0 sudo[110393]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:48 compute-0 sudo[110546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlsuhrcpvbprxwzsknfdigqwgdmjuowz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038187.930919-689-245374792355303/AnsiballZ_command.py'
Jan 21 23:29:48 compute-0 sudo[110546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:48 compute-0 python3.9[110548]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:29:48 compute-0 sudo[110546]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:50 compute-0 sudo[110699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmuhhyrktnjqhgugjqsmrplceaumaazf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038189.4892275-851-100965120596465/AnsiballZ_getent.py'
Jan 21 23:29:50 compute-0 sudo[110699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:50 compute-0 python3.9[110701]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 21 23:29:50 compute-0 sudo[110699]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:50 compute-0 sudo[110852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iilwyafrommgtsycvjsizliyzszeqnfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038190.4619467-875-89189655210396/AnsiballZ_group.py'
Jan 21 23:29:50 compute-0 sudo[110852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:51 compute-0 python3.9[110854]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:29:51 compute-0 groupadd[110855]: group added to /etc/group: name=libvirt, GID=42473
Jan 21 23:29:51 compute-0 groupadd[110855]: group added to /etc/gshadow: name=libvirt
Jan 21 23:29:51 compute-0 groupadd[110855]: new group: name=libvirt, GID=42473
Jan 21 23:29:51 compute-0 sudo[110852]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:52 compute-0 sudo[111010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqdhhqgcwsdkqvxvmxjxybjmxnamlnlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038191.6583803-899-168209074940448/AnsiballZ_user.py'
Jan 21 23:29:52 compute-0 sudo[111010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:52 compute-0 python3.9[111012]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 23:29:52 compute-0 useradd[111014]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 23:29:52 compute-0 sudo[111010]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:53 compute-0 sudo[111170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikdmozsimlbdgpccmyidxpjeqrwuvcxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038193.0254936-932-205326463460094/AnsiballZ_setup.py'
Jan 21 23:29:53 compute-0 sudo[111170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:53 compute-0 podman[111172]: 2026-01-21 23:29:53.487064159 +0000 UTC m=+0.149654423 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:29:53 compute-0 python3.9[111173]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:29:53 compute-0 sudo[111170]: pam_unix(sudo:session): session closed for user root
Jan 21 23:29:54 compute-0 sudo[111280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztdagbofyztadkpnkkvkegdiuxjvdtby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038193.0254936-932-205326463460094/AnsiballZ_dnf.py'
Jan 21 23:29:54 compute-0 sudo[111280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:29:54 compute-0 python3.9[111282]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:29:57 compute-0 sshd-session[111290]: Invalid user www-data from 188.166.69.60 port 58406
Jan 21 23:29:57 compute-0 sshd-session[111290]: Connection closed by invalid user www-data 188.166.69.60 port 58406 [preauth]
Jan 21 23:30:02 compute-0 podman[111414]: 2026-01-21 23:30:02.706140893 +0000 UTC m=+0.067770991 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:30:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:30:03.157 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:30:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:30:03.158 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:30:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:30:03.158 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:30:19 compute-0 kernel: SELinux:  Converting 2765 SID table entries...
Jan 21 23:30:19 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:30:19 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:30:19 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:30:19 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:30:19 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:30:19 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:30:19 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:30:23 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 21 23:30:23 compute-0 podman[111504]: 2026-01-21 23:30:23.779697144 +0000 UTC m=+0.124069772 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 21 23:30:31 compute-0 kernel: SELinux:  Converting 2765 SID table entries...
Jan 21 23:30:31 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:30:31 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:30:31 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:30:31 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:30:31 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:30:31 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:30:31 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:30:33 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 21 23:30:33 compute-0 podman[111537]: 2026-01-21 23:30:33.763015987 +0000 UTC m=+0.082654583 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 23:30:43 compute-0 sshd-session[111553]: Invalid user www-data from 188.166.69.60 port 45578
Jan 21 23:30:43 compute-0 sshd-session[111553]: Connection closed by invalid user www-data 188.166.69.60 port 45578 [preauth]
Jan 21 23:30:54 compute-0 podman[118140]: 2026-01-21 23:30:54.721976009 +0000 UTC m=+0.094399639 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:31:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:31:03.158 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:31:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:31:03.160 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:31:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:31:03.160 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:31:04 compute-0 podman[123936]: 2026-01-21 23:31:04.683932155 +0000 UTC m=+0.053930021 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:31:24 compute-0 sshd-session[128469]: Received disconnect from 91.224.92.190 port 27180:11:  [preauth]
Jan 21 23:31:24 compute-0 sshd-session[128469]: Disconnected from authenticating user root 91.224.92.190 port 27180 [preauth]
Jan 21 23:31:25 compute-0 podman[128471]: 2026-01-21 23:31:25.770783064 +0000 UTC m=+0.132473969 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 21 23:31:27 compute-0 sshd-session[128501]: Invalid user www-data from 188.166.69.60 port 44012
Jan 21 23:31:27 compute-0 sshd-session[128501]: Connection closed by invalid user www-data 188.166.69.60 port 44012 [preauth]
Jan 21 23:31:27 compute-0 kernel: SELinux:  Converting 2766 SID table entries...
Jan 21 23:31:27 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 21 23:31:27 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 21 23:31:27 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 21 23:31:27 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 21 23:31:27 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 21 23:31:27 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 21 23:31:27 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 21 23:31:29 compute-0 groupadd[128511]: group added to /etc/group: name=dnsmasq, GID=993
Jan 21 23:31:29 compute-0 groupadd[128511]: group added to /etc/gshadow: name=dnsmasq
Jan 21 23:31:29 compute-0 groupadd[128511]: new group: name=dnsmasq, GID=993
Jan 21 23:31:29 compute-0 useradd[128518]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 21 23:31:29 compute-0 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Jan 21 23:31:29 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 21 23:31:29 compute-0 dbus-broker-launch[768]: Noticed file-system modification, trigger reload.
Jan 21 23:31:30 compute-0 groupadd[128531]: group added to /etc/group: name=clevis, GID=992
Jan 21 23:31:30 compute-0 groupadd[128531]: group added to /etc/gshadow: name=clevis
Jan 21 23:31:30 compute-0 groupadd[128531]: new group: name=clevis, GID=992
Jan 21 23:31:30 compute-0 useradd[128538]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 21 23:31:30 compute-0 usermod[128548]: add 'clevis' to group 'tss'
Jan 21 23:31:30 compute-0 usermod[128548]: add 'clevis' to shadow group 'tss'
Jan 21 23:31:32 compute-0 polkitd[43580]: Reloading rules
Jan 21 23:31:32 compute-0 polkitd[43580]: Collecting garbage unconditionally...
Jan 21 23:31:32 compute-0 polkitd[43580]: Loading rules from directory /etc/polkit-1/rules.d
Jan 21 23:31:32 compute-0 polkitd[43580]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 21 23:31:32 compute-0 polkitd[43580]: Finished loading, compiling and executing 3 rules
Jan 21 23:31:32 compute-0 polkitd[43580]: Reloading rules
Jan 21 23:31:32 compute-0 polkitd[43580]: Collecting garbage unconditionally...
Jan 21 23:31:32 compute-0 polkitd[43580]: Loading rules from directory /etc/polkit-1/rules.d
Jan 21 23:31:32 compute-0 polkitd[43580]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 21 23:31:32 compute-0 polkitd[43580]: Finished loading, compiling and executing 3 rules
Jan 21 23:31:34 compute-0 groupadd[128738]: group added to /etc/group: name=ceph, GID=167
Jan 21 23:31:34 compute-0 groupadd[128738]: group added to /etc/gshadow: name=ceph
Jan 21 23:31:34 compute-0 groupadd[128738]: new group: name=ceph, GID=167
Jan 21 23:31:34 compute-0 useradd[128744]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 21 23:31:35 compute-0 podman[128751]: 2026-01-21 23:31:35.722501035 +0000 UTC m=+0.085561528 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:31:37 compute-0 sshd[1006]: Received signal 15; terminating.
Jan 21 23:31:37 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Jan 21 23:31:37 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Jan 21 23:31:37 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Jan 21 23:31:37 compute-0 systemd[1]: sshd.service: Consumed 3.706s CPU time, read 32.0K from disk, written 172.0K to disk.
Jan 21 23:31:37 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Jan 21 23:31:37 compute-0 systemd[1]: Stopping sshd-keygen.target...
Jan 21 23:31:37 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 23:31:37 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 23:31:37 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 21 23:31:37 compute-0 systemd[1]: Reached target sshd-keygen.target.
Jan 21 23:31:37 compute-0 systemd[1]: Starting OpenSSH server daemon...
Jan 21 23:31:37 compute-0 sshd[129281]: Server listening on 0.0.0.0 port 22.
Jan 21 23:31:37 compute-0 sshd[129281]: Server listening on :: port 22.
Jan 21 23:31:37 compute-0 systemd[1]: Started OpenSSH server daemon.
Jan 21 23:31:39 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:31:39 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:31:39 compute-0 systemd[1]: Reloading.
Jan 21 23:31:39 compute-0 systemd-sysv-generator[129541]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:31:39 compute-0 systemd-rc-local-generator[129538]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:31:39 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:31:42 compute-0 sudo[111280]: pam_unix(sudo:session): session closed for user root
Jan 21 23:31:49 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:31:49 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:31:49 compute-0 systemd[1]: man-db-cache-update.service: Consumed 13.143s CPU time.
Jan 21 23:31:49 compute-0 systemd[1]: run-ra4d053e329ae4140b53af00722f8cc1b.service: Deactivated successfully.
Jan 21 23:31:56 compute-0 podman[137943]: 2026-01-21 23:31:56.722819722 +0000 UTC m=+0.095831422 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:31:57 compute-0 sshd-session[137941]: Received disconnect from 203.83.238.251 port 46300:11:  [preauth]
Jan 21 23:31:57 compute-0 sshd-session[137941]: Disconnected from authenticating user root 203.83.238.251 port 46300 [preauth]
Jan 21 23:32:00 compute-0 sudo[138095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqfcurxbooxfcithmkujvwbsoufhzwjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038319.9713047-968-126384946201359/AnsiballZ_systemd.py'
Jan 21 23:32:00 compute-0 sudo[138095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:00 compute-0 python3.9[138097]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:32:01 compute-0 systemd[1]: Reloading.
Jan 21 23:32:01 compute-0 systemd-rc-local-generator[138120]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:01 compute-0 systemd-sysv-generator[138128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:01 compute-0 sudo[138095]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:01 compute-0 sudo[138286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcwmcaepvlvmnqxdgiyvxxyjaxpqgwym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038321.50446-968-223743520030013/AnsiballZ_systemd.py'
Jan 21 23:32:01 compute-0 sudo[138286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:02 compute-0 python3.9[138288]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:32:02 compute-0 systemd[1]: Reloading.
Jan 21 23:32:02 compute-0 systemd-rc-local-generator[138318]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:02 compute-0 systemd-sysv-generator[138322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:02 compute-0 sudo[138286]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:02 compute-0 sudo[138476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqsrtzjjlqpxianyrxfhbzbkirykylcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038322.5891895-968-57191490228288/AnsiballZ_systemd.py'
Jan 21 23:32:02 compute-0 sudo[138476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:32:03.160 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:32:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:32:03.162 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:32:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:32:03.163 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:32:03 compute-0 python3.9[138478]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:32:03 compute-0 systemd[1]: Reloading.
Jan 21 23:32:03 compute-0 systemd-rc-local-generator[138509]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:03 compute-0 systemd-sysv-generator[138513]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:03 compute-0 sudo[138476]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:04 compute-0 sudo[138666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dktebcuodiuzajjfvixnhhekdmpzawjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038323.812378-968-263853187877317/AnsiballZ_systemd.py'
Jan 21 23:32:04 compute-0 sudo[138666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:04 compute-0 python3.9[138668]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:32:04 compute-0 systemd[1]: Reloading.
Jan 21 23:32:04 compute-0 systemd-rc-local-generator[138699]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:04 compute-0 systemd-sysv-generator[138703]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:04 compute-0 sudo[138666]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:06 compute-0 sudo[138867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vahhpqvungbqdacgliczpwbunfvjbwzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038326.2651165-1055-200304961031057/AnsiballZ_systemd.py'
Jan 21 23:32:06 compute-0 sudo[138867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:06 compute-0 podman[138829]: 2026-01-21 23:32:06.708050288 +0000 UTC m=+0.101755525 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:32:07 compute-0 python3.9[138875]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:07 compute-0 systemd[1]: Reloading.
Jan 21 23:32:07 compute-0 systemd-rc-local-generator[138904]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:07 compute-0 systemd-sysv-generator[138907]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:07 compute-0 sudo[138867]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:08 compute-0 sudo[139064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thckseidhykvkoopjxubloomnfrldyyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038328.109513-1055-44561671161325/AnsiballZ_systemd.py'
Jan 21 23:32:08 compute-0 sudo[139064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:08 compute-0 python3.9[139066]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:08 compute-0 systemd[1]: Reloading.
Jan 21 23:32:09 compute-0 systemd-rc-local-generator[139097]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:09 compute-0 systemd-sysv-generator[139102]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:09 compute-0 sudo[139064]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:09 compute-0 sudo[139253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyysengpcayuustgzzjjvyyqxqyowhhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038329.4184208-1055-208861670298517/AnsiballZ_systemd.py'
Jan 21 23:32:09 compute-0 sudo[139253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:10 compute-0 python3.9[139255]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:10 compute-0 systemd[1]: Reloading.
Jan 21 23:32:10 compute-0 systemd-rc-local-generator[139288]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:10 compute-0 systemd-sysv-generator[139292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:10 compute-0 sudo[139253]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:11 compute-0 sudo[139443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpbbaazvyblwvlfktniyypiddfozbhum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038330.6861959-1055-116983040037886/AnsiballZ_systemd.py'
Jan 21 23:32:11 compute-0 sudo[139443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:11 compute-0 python3.9[139445]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:11 compute-0 sudo[139443]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:11 compute-0 sshd-session[139447]: Invalid user webmaster from 188.166.69.60 port 53594
Jan 21 23:32:11 compute-0 sshd-session[139447]: Connection closed by invalid user webmaster 188.166.69.60 port 53594 [preauth]
Jan 21 23:32:12 compute-0 sudo[139600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlrwwxhrcfmtyrobhihudolldmhxqejm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038331.7446458-1055-158091742854403/AnsiballZ_systemd.py'
Jan 21 23:32:12 compute-0 sudo[139600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:12 compute-0 python3.9[139602]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:12 compute-0 systemd[1]: Reloading.
Jan 21 23:32:12 compute-0 systemd-sysv-generator[139639]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:12 compute-0 systemd-rc-local-generator[139636]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:12 compute-0 sudo[139600]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:13 compute-0 sudo[139791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dezzvhdhfuldxpwgquzdiymwjddizeux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038333.514703-1163-47088169925785/AnsiballZ_systemd.py'
Jan 21 23:32:13 compute-0 sudo[139791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:14 compute-0 python3.9[139793]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 21 23:32:14 compute-0 systemd[1]: Reloading.
Jan 21 23:32:14 compute-0 systemd-rc-local-generator[139824]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:32:14 compute-0 systemd-sysv-generator[139827]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:32:14 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 21 23:32:14 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 21 23:32:14 compute-0 sudo[139791]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:15 compute-0 sudo[139984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyxuqyxajvectnjycftpjeeopgjtywyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038334.9621587-1187-108648965609597/AnsiballZ_systemd.py'
Jan 21 23:32:15 compute-0 sudo[139984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:15 compute-0 python3.9[139986]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:15 compute-0 sudo[139984]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:16 compute-0 sudo[140139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prgszydexrskmsutkdbrjihbfwvvtkbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038335.905996-1187-181761088422612/AnsiballZ_systemd.py'
Jan 21 23:32:16 compute-0 sudo[140139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:16 compute-0 python3.9[140141]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:16 compute-0 sudo[140139]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:17 compute-0 sudo[140294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oybxcmdfnjxwisvnnlobjwgenavuqakb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038336.7816448-1187-26462837739104/AnsiballZ_systemd.py'
Jan 21 23:32:17 compute-0 sudo[140294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:17 compute-0 python3.9[140296]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:17 compute-0 sudo[140294]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:18 compute-0 sudo[140449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qitnluwpuaekjyqpbxclqmampaikagbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038337.7602987-1187-53998486471107/AnsiballZ_systemd.py'
Jan 21 23:32:18 compute-0 sudo[140449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:18 compute-0 python3.9[140451]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:18 compute-0 sudo[140449]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:19 compute-0 sudo[140604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zercnmozmnfaalzytemiykcllomievtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038338.663053-1187-246334900183273/AnsiballZ_systemd.py'
Jan 21 23:32:19 compute-0 sudo[140604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:19 compute-0 python3.9[140606]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:19 compute-0 sudo[140604]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:19 compute-0 sudo[140759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmtfqawnuhxccolgurwdspxgkfsdvyts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038339.611754-1187-223902301777304/AnsiballZ_systemd.py'
Jan 21 23:32:19 compute-0 sudo[140759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:20 compute-0 python3.9[140761]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:20 compute-0 sudo[140759]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:20 compute-0 sudo[140914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okoskgbezsxjylekjxljsysrextwhuja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038340.5189884-1187-246886145331977/AnsiballZ_systemd.py'
Jan 21 23:32:20 compute-0 sudo[140914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:21 compute-0 python3.9[140916]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:21 compute-0 sudo[140914]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:21 compute-0 sudo[141069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymobmfynqoykjesqbrmyjnkjdsykzadt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038341.3832781-1187-85795451255595/AnsiballZ_systemd.py'
Jan 21 23:32:21 compute-0 sudo[141069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:22 compute-0 python3.9[141071]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:22 compute-0 sudo[141069]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:22 compute-0 sudo[141224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqigfququprjvsndzxdibthmddsuoqnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038342.3171763-1187-12520324471514/AnsiballZ_systemd.py'
Jan 21 23:32:22 compute-0 sudo[141224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:22 compute-0 python3.9[141226]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:24 compute-0 sudo[141224]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:24 compute-0 sudo[141379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbfktzfgexwychfysfjrdxwtncxvmyko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038344.1872673-1187-186627623313356/AnsiballZ_systemd.py'
Jan 21 23:32:24 compute-0 sudo[141379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:24 compute-0 python3.9[141381]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:24 compute-0 sudo[141379]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:25 compute-0 sudo[141534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgqlofzwwgsnnkpalnmhusdnokoxabtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038345.038703-1187-35901612655874/AnsiballZ_systemd.py'
Jan 21 23:32:25 compute-0 sudo[141534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:25 compute-0 python3.9[141536]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:25 compute-0 sudo[141534]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:26 compute-0 sudo[141689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxbccnwuacqqcginkumcisjsigyrfdgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038345.9309018-1187-228579938654186/AnsiballZ_systemd.py'
Jan 21 23:32:26 compute-0 sudo[141689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:26 compute-0 python3.9[141691]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:26 compute-0 sudo[141689]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:27 compute-0 sudo[141855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyjgphpryxaggzwuttmjurjsdmdaesrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038346.839446-1187-87808267155798/AnsiballZ_systemd.py'
Jan 21 23:32:27 compute-0 sudo[141855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:27 compute-0 podman[141818]: 2026-01-21 23:32:27.267044569 +0000 UTC m=+0.113795439 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:32:27 compute-0 python3.9[141863]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:27 compute-0 sudo[141855]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:28 compute-0 sudo[142026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaevbabngqguztqlshmmsddtjyblfnyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038348.053473-1187-73808919979601/AnsiballZ_systemd.py'
Jan 21 23:32:28 compute-0 sudo[142026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:28 compute-0 python3.9[142028]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 21 23:32:28 compute-0 sudo[142026]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:29 compute-0 sudo[142181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czlrehyknryndudmdzmwvyfpcuebmwzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038349.567821-1493-10188190811819/AnsiballZ_file.py'
Jan 21 23:32:29 compute-0 sudo[142181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:30 compute-0 python3.9[142183]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:30 compute-0 sudo[142181]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:30 compute-0 sudo[142333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylmyuyreprrrummgvhroiejzyegsrydf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038350.2540376-1493-123244320806616/AnsiballZ_file.py'
Jan 21 23:32:30 compute-0 sudo[142333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:30 compute-0 python3.9[142335]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:30 compute-0 sudo[142333]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:31 compute-0 sudo[142485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckjdatitxoanvylkpxuabiqikbdsmvll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038350.9554589-1493-259876369516287/AnsiballZ_file.py'
Jan 21 23:32:31 compute-0 sudo[142485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:31 compute-0 python3.9[142487]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:31 compute-0 sudo[142485]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:32 compute-0 sudo[142637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exabyaavntkuvlolbykogedjzwtupwll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038351.7084258-1493-182031370503857/AnsiballZ_file.py'
Jan 21 23:32:32 compute-0 sudo[142637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:32 compute-0 python3.9[142639]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:32 compute-0 sudo[142637]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:32 compute-0 sudo[142789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqbfnqlikaugaenzmrbnguetyatfpzbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038352.3690069-1493-113663507505701/AnsiballZ_file.py'
Jan 21 23:32:32 compute-0 sudo[142789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:32 compute-0 python3.9[142791]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:32 compute-0 sudo[142789]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:33 compute-0 sudo[142941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkwmhurupcjhedvqfevbgztukkwbzdup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038353.080722-1493-223863017289016/AnsiballZ_file.py'
Jan 21 23:32:33 compute-0 sudo[142941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:33 compute-0 python3.9[142943]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:32:33 compute-0 sudo[142941]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:34 compute-0 python3.9[143093]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:32:35 compute-0 sudo[143243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veuawmfilnwhclpzzroeoimxsrlbuzvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038355.1681273-1646-171985083623322/AnsiballZ_stat.py'
Jan 21 23:32:35 compute-0 sudo[143243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:35 compute-0 python3.9[143245]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:35 compute-0 sudo[143243]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:36 compute-0 sudo[143368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aczpgtxqmfrfdyzufhtojdqupwjmgxab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038355.1681273-1646-171985083623322/AnsiballZ_copy.py'
Jan 21 23:32:36 compute-0 sudo[143368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:36 compute-0 python3.9[143370]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038355.1681273-1646-171985083623322/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:36 compute-0 sudo[143368]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:36 compute-0 podman[143371]: 2026-01-21 23:32:36.926122202 +0000 UTC m=+0.079414943 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 21 23:32:37 compute-0 sudo[143539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sugwotnvfphpjgsgokthylzowwmcbgwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038357.0662487-1646-46676772276045/AnsiballZ_stat.py'
Jan 21 23:32:37 compute-0 sudo[143539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:37 compute-0 python3.9[143541]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:37 compute-0 sudo[143539]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:38 compute-0 sudo[143664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmgbmrgwhmctcoaiymmuefqoyabftxrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038357.0662487-1646-46676772276045/AnsiballZ_copy.py'
Jan 21 23:32:38 compute-0 sudo[143664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:38 compute-0 python3.9[143666]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038357.0662487-1646-46676772276045/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:38 compute-0 sudo[143664]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:38 compute-0 sudo[143816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryjlivsiekiepuijhudccldokuoqjvyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038358.5915449-1646-125049863430828/AnsiballZ_stat.py'
Jan 21 23:32:38 compute-0 sudo[143816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:39 compute-0 python3.9[143818]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:39 compute-0 sudo[143816]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:39 compute-0 sudo[143941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goameotrplxoxtqnwrzirfhbahemsjvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038358.5915449-1646-125049863430828/AnsiballZ_copy.py'
Jan 21 23:32:39 compute-0 sudo[143941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:39 compute-0 python3.9[143943]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038358.5915449-1646-125049863430828/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:39 compute-0 sudo[143941]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:40 compute-0 sudo[144093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbfhqnhvoevvvnwlgplzlcazoiicixec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038359.9632335-1646-265788494165427/AnsiballZ_stat.py'
Jan 21 23:32:40 compute-0 sudo[144093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:40 compute-0 python3.9[144095]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:40 compute-0 sudo[144093]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:40 compute-0 sudo[144218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjcgsoukuxxezyvhxzhwdhxioxnxpnsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038359.9632335-1646-265788494165427/AnsiballZ_copy.py'
Jan 21 23:32:40 compute-0 sudo[144218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:41 compute-0 python3.9[144220]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038359.9632335-1646-265788494165427/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:41 compute-0 sudo[144218]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:41 compute-0 sudo[144370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyglbwpgvvqmijwwmskhmqfeizhvgmpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038361.3479457-1646-210435629024626/AnsiballZ_stat.py'
Jan 21 23:32:41 compute-0 sudo[144370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:41 compute-0 python3.9[144372]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:41 compute-0 sudo[144370]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:42 compute-0 sudo[144495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlfdajwkhyipmgtdmqhbfdvaiphwcvab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038361.3479457-1646-210435629024626/AnsiballZ_copy.py'
Jan 21 23:32:42 compute-0 sudo[144495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:42 compute-0 python3.9[144497]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038361.3479457-1646-210435629024626/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:42 compute-0 sudo[144495]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:42 compute-0 sudo[144647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fumxlcgsmonzekwffhvhrjvkwdlcmlga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038362.570371-1646-188677713227697/AnsiballZ_stat.py'
Jan 21 23:32:42 compute-0 sudo[144647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:43 compute-0 python3.9[144649]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:43 compute-0 sudo[144647]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:43 compute-0 sudo[144772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuuhpfeergjcisbugrytrxwokjefvvxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038362.570371-1646-188677713227697/AnsiballZ_copy.py'
Jan 21 23:32:43 compute-0 sudo[144772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:43 compute-0 python3.9[144774]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038362.570371-1646-188677713227697/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:43 compute-0 sudo[144772]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:44 compute-0 sudo[144924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvkcjbbikfbhwlghmalrttsxytckjbuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038363.7813942-1646-41428212245861/AnsiballZ_stat.py'
Jan 21 23:32:44 compute-0 sudo[144924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:44 compute-0 python3.9[144926]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:44 compute-0 sudo[144924]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:44 compute-0 sudo[145047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxjelhokvjtinjgljkkbjjhujfzyvvgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038363.7813942-1646-41428212245861/AnsiballZ_copy.py'
Jan 21 23:32:44 compute-0 sudo[145047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:44 compute-0 python3.9[145049]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038363.7813942-1646-41428212245861/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:44 compute-0 sudo[145047]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:45 compute-0 sudo[145199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryedgsybcyfttypfpqlxmevifrpvgdgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038364.9411025-1646-36325235866599/AnsiballZ_stat.py'
Jan 21 23:32:45 compute-0 sudo[145199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:45 compute-0 python3.9[145201]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:45 compute-0 sudo[145199]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:45 compute-0 sudo[145324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxmeykvhkmkwccadqkqtaklqkmbtceyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038364.9411025-1646-36325235866599/AnsiballZ_copy.py'
Jan 21 23:32:45 compute-0 sudo[145324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:45 compute-0 python3.9[145326]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769038364.9411025-1646-36325235866599/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:45 compute-0 sudo[145324]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:47 compute-0 sudo[145476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iblrpkqpmwnqaowvjlrwwjgxulwjwyqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038366.8135047-1985-222016789401645/AnsiballZ_command.py'
Jan 21 23:32:47 compute-0 sudo[145476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:47 compute-0 python3.9[145478]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 21 23:32:47 compute-0 sudo[145476]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:47 compute-0 sudo[145629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgkbvcepftjziqoxkeadgjgiyxfyvyqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038367.683942-2012-210178709858093/AnsiballZ_file.py'
Jan 21 23:32:47 compute-0 sudo[145629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:48 compute-0 python3.9[145631]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:48 compute-0 sudo[145629]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:48 compute-0 sudo[145781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgxnwhtvefyaqconeemdldycaqppyzjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038368.3415177-2012-250353747174318/AnsiballZ_file.py'
Jan 21 23:32:48 compute-0 sudo[145781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:48 compute-0 python3.9[145783]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:48 compute-0 sudo[145781]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:49 compute-0 sudo[145933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncujdqsptmecrnlxqmywztvsnfnenlqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038368.9520686-2012-268356609240035/AnsiballZ_file.py'
Jan 21 23:32:49 compute-0 sudo[145933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:49 compute-0 python3.9[145935]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:49 compute-0 sudo[145933]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:49 compute-0 sudo[146085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brzsvrebxedjwtazvxpkcxzskaaavtru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038369.5987613-2012-279071678472631/AnsiballZ_file.py'
Jan 21 23:32:49 compute-0 sudo[146085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:50 compute-0 python3.9[146087]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:50 compute-0 sudo[146085]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:50 compute-0 sudo[146237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubuessplogegpegiiltmfoldoemupkcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038370.225389-2012-83259959343312/AnsiballZ_file.py'
Jan 21 23:32:50 compute-0 sudo[146237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:50 compute-0 python3.9[146239]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:50 compute-0 sudo[146237]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:51 compute-0 sudo[146389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcfvyzcxfmtcpjjyihnvwaqoddsivjiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038370.8735368-2012-66950364377762/AnsiballZ_file.py'
Jan 21 23:32:51 compute-0 sudo[146389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:51 compute-0 python3.9[146391]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:51 compute-0 sudo[146389]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:51 compute-0 sudo[146541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkswfjdcsqlqpmbjjhmehdfrmraulqiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038371.5729504-2012-276359379427426/AnsiballZ_file.py'
Jan 21 23:32:51 compute-0 sudo[146541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:52 compute-0 python3.9[146543]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:52 compute-0 sudo[146541]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:52 compute-0 sudo[146693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgphhkifkhjifzmxhceuoxeftaxnfrza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038372.285761-2012-122884285904515/AnsiballZ_file.py'
Jan 21 23:32:52 compute-0 sudo[146693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:52 compute-0 python3.9[146695]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:52 compute-0 sudo[146693]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:53 compute-0 sudo[146845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoutnitrxzgfiaivylcvnvlimyrifhti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038372.942072-2012-250941073423253/AnsiballZ_file.py'
Jan 21 23:32:53 compute-0 sudo[146845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:53 compute-0 python3.9[146847]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:53 compute-0 sudo[146845]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:54 compute-0 sudo[146997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lanmhgspvbxhwrpapimscvkambgvmauo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038373.702829-2012-16429761655396/AnsiballZ_file.py'
Jan 21 23:32:54 compute-0 sudo[146997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:54 compute-0 python3.9[146999]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:54 compute-0 sudo[146997]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:54 compute-0 sudo[147149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbwecgisglwhgzmczoefjnzuuwbpqoau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038374.4107063-2012-242928688941777/AnsiballZ_file.py'
Jan 21 23:32:54 compute-0 sudo[147149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:54 compute-0 python3.9[147151]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:54 compute-0 sudo[147149]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:55 compute-0 sudo[147301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxkpcxcmrkpkqwypbrypkptbkggyudrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038375.096035-2012-119611523575852/AnsiballZ_file.py'
Jan 21 23:32:55 compute-0 sudo[147301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:55 compute-0 python3.9[147303]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:55 compute-0 sudo[147301]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:56 compute-0 sshd-session[147304]: Invalid user webmaster from 188.166.69.60 port 34886
Jan 21 23:32:56 compute-0 sshd-session[147304]: Connection closed by invalid user webmaster 188.166.69.60 port 34886 [preauth]
Jan 21 23:32:56 compute-0 sudo[147455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwcicnmgvviqtkwledrlulipaypbotwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038375.8441467-2012-222664356755148/AnsiballZ_file.py'
Jan 21 23:32:56 compute-0 sudo[147455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:56 compute-0 python3.9[147457]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:56 compute-0 sudo[147455]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:56 compute-0 sudo[147607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgrwoixyqrbljpisnpetuwyjhuwktxjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038376.5837588-2012-55181226459735/AnsiballZ_file.py'
Jan 21 23:32:56 compute-0 sudo[147607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:57 compute-0 python3.9[147609]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:57 compute-0 sudo[147607]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:57 compute-0 podman[147634]: 2026-01-21 23:32:57.740352002 +0000 UTC m=+0.109618760 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 21 23:32:58 compute-0 sudo[147785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvmbnwwxmdisluvdqqgowulheavrrzpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038378.6597154-2309-38867523732451/AnsiballZ_stat.py'
Jan 21 23:32:58 compute-0 sudo[147785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:59 compute-0 python3.9[147787]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:32:59 compute-0 sudo[147785]: pam_unix(sudo:session): session closed for user root
Jan 21 23:32:59 compute-0 sudo[147908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glushrajifenbrrrtacmopzaiqfvedgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038378.6597154-2309-38867523732451/AnsiballZ_copy.py'
Jan 21 23:32:59 compute-0 sudo[147908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:32:59 compute-0 python3.9[147910]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038378.6597154-2309-38867523732451/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:32:59 compute-0 sudo[147908]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:00 compute-0 sudo[148060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kophyesjqlrhaeqigdygyjazqhtmnqea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038379.9427154-2309-125561215212012/AnsiballZ_stat.py'
Jan 21 23:33:00 compute-0 sudo[148060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:00 compute-0 python3.9[148062]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:00 compute-0 sudo[148060]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:00 compute-0 sudo[148183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezbkucrhuyrwxzoucaykiqfpgvhemgrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038379.9427154-2309-125561215212012/AnsiballZ_copy.py'
Jan 21 23:33:00 compute-0 sudo[148183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:00 compute-0 python3.9[148185]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038379.9427154-2309-125561215212012/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:00 compute-0 sudo[148183]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:01 compute-0 sudo[148335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pysaqnbejazmaruioqdllzedbmxcagci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038381.1069908-2309-71786294646919/AnsiballZ_stat.py'
Jan 21 23:33:01 compute-0 sudo[148335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:01 compute-0 python3.9[148337]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:01 compute-0 sudo[148335]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:02 compute-0 sudo[148458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgjxufdhruasyxvsjlyhyqhujcnefngt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038381.1069908-2309-71786294646919/AnsiballZ_copy.py'
Jan 21 23:33:02 compute-0 sudo[148458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:02 compute-0 python3.9[148460]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038381.1069908-2309-71786294646919/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:02 compute-0 sudo[148458]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:02 compute-0 sudo[148610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqcnvwgwamdzqzyjgqkpybcsusplifxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038382.4103868-2309-93120913298253/AnsiballZ_stat.py'
Jan 21 23:33:02 compute-0 sudo[148610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:02 compute-0 python3.9[148612]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:02 compute-0 sudo[148610]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:33:03.162 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:33:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:33:03.163 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:33:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:33:03.163 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:33:03 compute-0 sudo[148733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeiemladkkjfpibudcvroaztwhenfjst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038382.4103868-2309-93120913298253/AnsiballZ_copy.py'
Jan 21 23:33:03 compute-0 sudo[148733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:03 compute-0 python3.9[148735]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038382.4103868-2309-93120913298253/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:03 compute-0 sudo[148733]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:03 compute-0 sudo[148885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imtyvbxpfeihbzitbnqeigdloguybsls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038383.6641452-2309-99646220015091/AnsiballZ_stat.py'
Jan 21 23:33:03 compute-0 sudo[148885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:04 compute-0 python3.9[148887]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:04 compute-0 sudo[148885]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:04 compute-0 sudo[149008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgbebmuuqqcptfbpvdshzslfldwmnxxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038383.6641452-2309-99646220015091/AnsiballZ_copy.py'
Jan 21 23:33:04 compute-0 sudo[149008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:04 compute-0 python3.9[149010]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038383.6641452-2309-99646220015091/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:04 compute-0 sudo[149008]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:05 compute-0 sudo[149160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aciutwhmtuyfraghxjyhcdvihnvyrxop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038384.9451804-2309-129887494714069/AnsiballZ_stat.py'
Jan 21 23:33:05 compute-0 sudo[149160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:05 compute-0 python3.9[149162]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:05 compute-0 sudo[149160]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:05 compute-0 sudo[149283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxkplalhlpprlqytgjmouchoitlsfvvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038384.9451804-2309-129887494714069/AnsiballZ_copy.py'
Jan 21 23:33:05 compute-0 sudo[149283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:06 compute-0 python3.9[149285]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038384.9451804-2309-129887494714069/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:06 compute-0 sudo[149283]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:06 compute-0 sudo[149435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbpsgyvixbxcbqwqkcosounjetcfvnwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038386.2120028-2309-140430703957230/AnsiballZ_stat.py'
Jan 21 23:33:06 compute-0 sudo[149435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:06 compute-0 python3.9[149437]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:06 compute-0 sudo[149435]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:07 compute-0 sudo[149568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mksybxwqwgcxmmquahfekaqktlgapgzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038386.2120028-2309-140430703957230/AnsiballZ_copy.py'
Jan 21 23:33:07 compute-0 sudo[149568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:07 compute-0 podman[149532]: 2026-01-21 23:33:07.039765007 +0000 UTC m=+0.066706814 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 21 23:33:07 compute-0 python3.9[149570]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038386.2120028-2309-140430703957230/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:07 compute-0 sudo[149568]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:07 compute-0 sudo[149729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnaptrntitbnqdodwozaliiqvylbelry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038387.3981297-2309-60836444359348/AnsiballZ_stat.py'
Jan 21 23:33:07 compute-0 sudo[149729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:07 compute-0 python3.9[149731]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:07 compute-0 sudo[149729]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:08 compute-0 sudo[149852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsrhnakpprnbwrancuthjsejpupvswjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038387.3981297-2309-60836444359348/AnsiballZ_copy.py'
Jan 21 23:33:08 compute-0 sudo[149852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:08 compute-0 python3.9[149854]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038387.3981297-2309-60836444359348/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:08 compute-0 sudo[149852]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:09 compute-0 sudo[150004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqldcixvaebgtpiekvizzmgueivqvwio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038388.7428405-2309-92718666775051/AnsiballZ_stat.py'
Jan 21 23:33:09 compute-0 sudo[150004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:09 compute-0 python3.9[150006]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:09 compute-0 sudo[150004]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:09 compute-0 sudo[150127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udllmhlsywztocfhqqrvqmtjslzgznsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038388.7428405-2309-92718666775051/AnsiballZ_copy.py'
Jan 21 23:33:09 compute-0 sudo[150127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:09 compute-0 python3.9[150129]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038388.7428405-2309-92718666775051/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:09 compute-0 sudo[150127]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:10 compute-0 sudo[150279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlrkotojkfabtnhsivjpzbjvhytmuopb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038389.8411548-2309-86654139087204/AnsiballZ_stat.py'
Jan 21 23:33:10 compute-0 sudo[150279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:10 compute-0 python3.9[150281]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:10 compute-0 sudo[150279]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:10 compute-0 sudo[150402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmsfcifbxyoruyssspzdalkdoxcfjzhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038389.8411548-2309-86654139087204/AnsiballZ_copy.py'
Jan 21 23:33:10 compute-0 sudo[150402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:10 compute-0 python3.9[150404]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038389.8411548-2309-86654139087204/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:10 compute-0 sudo[150402]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:11 compute-0 sudo[150554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egxemqamfyvvwcewkhljvualdpforump ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038391.0130625-2309-150869970934479/AnsiballZ_stat.py'
Jan 21 23:33:11 compute-0 sudo[150554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:11 compute-0 python3.9[150556]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:11 compute-0 sudo[150554]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:11 compute-0 sudo[150677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywneefmhtergydycssnxhlsndwipatfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038391.0130625-2309-150869970934479/AnsiballZ_copy.py'
Jan 21 23:33:11 compute-0 sudo[150677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:12 compute-0 python3.9[150679]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038391.0130625-2309-150869970934479/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:12 compute-0 sudo[150677]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:12 compute-0 sudo[150829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptnvlnpsrqvhqatokytkzbhdgelbxzfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038392.2612865-2309-164737639869420/AnsiballZ_stat.py'
Jan 21 23:33:12 compute-0 sudo[150829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:12 compute-0 python3.9[150831]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:12 compute-0 sudo[150829]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:13 compute-0 sudo[150952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucbffyavzcikxsngjzzzeofytfvsvzlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038392.2612865-2309-164737639869420/AnsiballZ_copy.py'
Jan 21 23:33:13 compute-0 sudo[150952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:13 compute-0 python3.9[150954]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038392.2612865-2309-164737639869420/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:13 compute-0 sudo[150952]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:13 compute-0 sudo[151104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flpirdynalcxdtlpewerievrpxddpzak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038393.632748-2309-182992601762940/AnsiballZ_stat.py'
Jan 21 23:33:13 compute-0 sudo[151104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:14 compute-0 python3.9[151106]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:14 compute-0 sudo[151104]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:14 compute-0 sudo[151227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atptqlarmdmcildtncwzkpdzdinevqno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038393.632748-2309-182992601762940/AnsiballZ_copy.py'
Jan 21 23:33:14 compute-0 sudo[151227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:14 compute-0 python3.9[151229]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038393.632748-2309-182992601762940/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:14 compute-0 sudo[151227]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:15 compute-0 sudo[151379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdjgvfypfuolpeltbpgsxorkbzldyhoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038394.9083028-2309-31847837073380/AnsiballZ_stat.py'
Jan 21 23:33:15 compute-0 sudo[151379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:15 compute-0 python3.9[151381]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:15 compute-0 sudo[151379]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:15 compute-0 sudo[151502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izrejqhvnimyaercdruwydpnvulvtgzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038394.9083028-2309-31847837073380/AnsiballZ_copy.py'
Jan 21 23:33:15 compute-0 sudo[151502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:15 compute-0 python3.9[151504]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038394.9083028-2309-31847837073380/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:15 compute-0 sudo[151502]: pam_unix(sudo:session): session closed for user root
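All six socket drop-ins above (virtproxyd, virtqemud, virtsecretd and their -ro/-admin variants) are rendered from the same `libvirt-socket.unit.j2` template, as the identical checksum `0bad41f409b4ee7e780a2a59dc18f5c84ed99826` shows. The template body is not visible in the log; a drop-in of this shape conventionally overrides `[Socket]` directives, along these lines (directive values below are assumptions, not the actual rendered content):

```ini
# /etc/systemd/system/virtqemud.socket.d/override.conf
# Hypothetical sketch -- the real values come from libvirt-socket.unit.j2.
[Socket]
SocketMode=0660
SocketUser=root
SocketGroup=libvirt
```

systemd merges such a drop-in over the packaged `virtqemud.socket` unit after the `daemon_reload=True` that the later `ansible.builtin.systemd` tasks perform.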
Jan 21 23:33:18 compute-0 python3.9[151654]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
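The `ls -lRZ /run/libvirt | grep -E ':container_\S+_t'` command above scans SELinux labels under `/run/libvirt` for leftover container contexts. A minimal sketch of the matching logic against fabricated sample output (the file name and labels below are assumptions, not taken from this host):

```shell
# Fabricated lines in the style of `ls -lZ` output; a leftover container
# label carries a :container_*_t type, a native libvirt label does not.
sample='srw-rw----. 1 root root system_u:object_r:container_file_t:s0 0 Jan 21 23:33 virtqemud-sock'
native='srw-rw----. 1 root root system_u:object_r:virt_var_run_t:s0 0 Jan 21 23:33 virtqemud-sock'

# Same filter the Ansible task pipes through; prints only matching lines.
printf '%s\n' "$sample" "$native" | grep -E ':container_\S+_t'
```

With `pipefail` set as in the task, a non-zero grep exit (no stray container labels) fails the pipeline, which is how the playbook turns the check into a pass/fail signal.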
Jan 21 23:33:19 compute-0 sudo[151807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwigaodlvgwiyajtoninkleiyheacnfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038398.6969826-2927-151497231013728/AnsiballZ_seboolean.py'
Jan 21 23:33:19 compute-0 sudo[151807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:19 compute-0 python3.9[151809]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 21 23:33:20 compute-0 sudo[151807]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:21 compute-0 sudo[151963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcaokqkmniaancowsyqkqompskszgfqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038401.0905347-2951-183980098252227/AnsiballZ_copy.py'
Jan 21 23:33:21 compute-0 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 21 23:33:21 compute-0 sudo[151963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:21 compute-0 python3.9[151965]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:21 compute-0 sudo[151963]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:22 compute-0 sudo[152115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zicpcpziwevebmbnmfbatuuuryiicgcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038401.7329981-2951-245960650440767/AnsiballZ_copy.py'
Jan 21 23:33:22 compute-0 sudo[152115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:22 compute-0 python3.9[152117]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:22 compute-0 sudo[152115]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:22 compute-0 sudo[152267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkhharbmbfvdmemgpnkwhjmlcxveqdfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038402.35285-2951-126015580742081/AnsiballZ_copy.py'
Jan 21 23:33:22 compute-0 sudo[152267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:22 compute-0 python3.9[152269]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:22 compute-0 sudo[152267]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:23 compute-0 sudo[152419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgmqttghhgdmpastolzaxohltqibsobd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038402.9259255-2951-106936856035065/AnsiballZ_copy.py'
Jan 21 23:33:23 compute-0 sudo[152419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:23 compute-0 python3.9[152421]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:23 compute-0 sudo[152419]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:23 compute-0 sudo[152571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oduppqzgwniithaumaqvldhrtjexmbga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038403.5048833-2951-252669481337159/AnsiballZ_copy.py'
Jan 21 23:33:23 compute-0 sudo[152571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:23 compute-0 python3.9[152573]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:24 compute-0 sudo[152571]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:25 compute-0 sudo[152723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-angbkxnwsjvdcojiemrwypjbtjlpnsrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038404.8378477-3059-235649740556211/AnsiballZ_copy.py'
Jan 21 23:33:25 compute-0 sudo[152723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:25 compute-0 python3.9[152725]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:25 compute-0 sudo[152723]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:25 compute-0 sudo[152875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjbprxmhbofvswipnrsrnotmvvzfukpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038405.4582295-3059-69620772330135/AnsiballZ_copy.py'
Jan 21 23:33:25 compute-0 sudo[152875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:25 compute-0 python3.9[152877]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:25 compute-0 sudo[152875]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:26 compute-0 sudo[153027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuhvaaqyokuaogwcrraktqmrcuyuelhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038406.0663438-3059-239884735614419/AnsiballZ_copy.py'
Jan 21 23:33:26 compute-0 sudo[153027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:26 compute-0 python3.9[153029]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:26 compute-0 sudo[153027]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:26 compute-0 sudo[153179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naglddzcpkknigqiqtfiyynpzbivawve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038406.6884081-3059-97107420706864/AnsiballZ_copy.py'
Jan 21 23:33:26 compute-0 sudo[153179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:27 compute-0 python3.9[153181]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:27 compute-0 sudo[153179]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:27 compute-0 sudo[153331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwupofjquvbwdtqcesitzecabjpdjoai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038407.329947-3059-215661357738797/AnsiballZ_copy.py'
Jan 21 23:33:27 compute-0 sudo[153331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:27 compute-0 python3.9[153333]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:27 compute-0 sudo[153331]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:27 compute-0 podman[153334]: 2026-01-21 23:33:27.924946044 +0000 UTC m=+0.095134610 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 21 23:33:28 compute-0 sudo[153509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efyontvnavibsxyoigwpcwkfhchobfrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038408.4673693-3167-221268845742666/AnsiballZ_systemd.py'
Jan 21 23:33:28 compute-0 sudo[153509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:29 compute-0 python3.9[153511]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:33:29 compute-0 systemd[1]: Reloading.
Jan 21 23:33:29 compute-0 systemd-sysv-generator[153542]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:33:29 compute-0 systemd-rc-local-generator[153539]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:33:29 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Jan 21 23:33:29 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Jan 21 23:33:29 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 21 23:33:29 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 21 23:33:29 compute-0 systemd[1]: Starting libvirt logging daemon...
Jan 21 23:33:29 compute-0 systemd[1]: Started libvirt logging daemon.
Jan 21 23:33:29 compute-0 sudo[153509]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:29 compute-0 sudo[153702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjrozwlqymoeroywedgjcgrdeolzopbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038409.5952508-3167-258022423928558/AnsiballZ_systemd.py'
Jan 21 23:33:29 compute-0 sudo[153702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:30 compute-0 python3.9[153704]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:33:30 compute-0 systemd[1]: Reloading.
Jan 21 23:33:30 compute-0 systemd-sysv-generator[153731]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:33:30 compute-0 systemd-rc-local-generator[153727]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:33:30 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 21 23:33:30 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 21 23:33:30 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 21 23:33:30 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 21 23:33:30 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 21 23:33:30 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 21 23:33:30 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 21 23:33:30 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 21 23:33:30 compute-0 sudo[153702]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:31 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 21 23:33:31 compute-0 sudo[153918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyykrdimbnlsczevdtwqcjoyqioltnwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038410.7559345-3167-70788636098357/AnsiballZ_systemd.py'
Jan 21 23:33:31 compute-0 sudo[153918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:31 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 21 23:33:31 compute-0 python3.9[153920]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:33:31 compute-0 systemd[1]: Reloading.
Jan 21 23:33:31 compute-0 systemd-rc-local-generator[153943]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:33:31 compute-0 systemd-sysv-generator[153946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:33:31 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 21 23:33:31 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 21 23:33:31 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 21 23:33:31 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 21 23:33:31 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 21 23:33:31 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 21 23:33:31 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 21 23:33:31 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 21 23:33:31 compute-0 sudo[153918]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:32 compute-0 sudo[154138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzwpjfoxppckjyrztgwwhjjzmrxzepdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038411.9120464-3167-111044027975553/AnsiballZ_systemd.py'
Jan 21 23:33:32 compute-0 sudo[154138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:32 compute-0 python3.9[154140]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:33:32 compute-0 systemd[1]: Reloading.
Jan 21 23:33:32 compute-0 systemd-rc-local-generator[154166]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:33:32 compute-0 systemd-sysv-generator[154169]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:33:32 compute-0 setroubleshoot[153892]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a6ec1176-a097-48c6-b502-837dbe704c19
Jan 21 23:33:32 compute-0 setroubleshoot[153892]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
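The setroubleshoot hint above says to recreate the denial and inspect it with `ausearch -m avc -ts recent`. A sketch of pulling the process name and denied capability out of one such record; the AVC line below is fabricated sample data shaped like this host's virtlogd denial, not copied from its audit log:

```shell
# Fabricated AVC record in the format `ausearch -m avc` prints
# (pid, timestamp, and serial are invented for illustration).
avc='type=AVC msg=audit(1769038412.123:456): avc:  denied  { dac_read_search } for  pid=153700 comm="virtlogd" capability=2 scontext=system_u:system_r:virtlogd_t:s0 tcontext=system_u:system_r:virtlogd_t:s0 tclass=capability permissive=0'

# Extract the offending command and the denied permission.
comm=$(printf '%s\n' "$avc" | sed -n 's/.*comm="\([^"]*\)".*/\1/p')
perm=$(printf '%s\n' "$avc" | sed -n 's/.*denied *{ *\([^ }]*\).*/\1/p')
printf '%s was denied %s\n' "$comm" "$perm"
```

From there, the catchall plugin's `ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd` followed by `semodule -X 300 -i my-virtlogd.pp` would turn matching records into a local policy module, as the message itself suggests.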
Jan 21 23:33:32 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:33:32 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Jan 21 23:33:32 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 21 23:33:32 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 21 23:33:32 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 21 23:33:32 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 21 23:33:32 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 21 23:33:32 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 21 23:33:32 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 21 23:33:32 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 21 23:33:32 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 21 23:33:32 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 21 23:33:32 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 21 23:33:32 compute-0 sudo[154138]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:33 compute-0 sudo[154355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcaqyswvajatmnzcfjkcdvretpmtgrpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038413.0352316-3167-100928138838271/AnsiballZ_systemd.py'
Jan 21 23:33:33 compute-0 sudo[154355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:33 compute-0 python3.9[154357]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:33:33 compute-0 systemd[1]: Reloading.
Jan 21 23:33:33 compute-0 systemd-rc-local-generator[154384]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:33:33 compute-0 systemd-sysv-generator[154389]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:33:33 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Jan 21 23:33:33 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Jan 21 23:33:33 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 21 23:33:33 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 21 23:33:33 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 21 23:33:33 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 21 23:33:33 compute-0 systemd[1]: Starting libvirt secret daemon...
Jan 21 23:33:34 compute-0 systemd[1]: Started libvirt secret daemon.
Jan 21 23:33:34 compute-0 sudo[154355]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:35 compute-0 sudo[154567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwucfqyiatkqgjfaawepbwmsviejsqrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038415.1117156-3278-90180133852002/AnsiballZ_file.py'
Jan 21 23:33:35 compute-0 sudo[154567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:35 compute-0 python3.9[154569]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:35 compute-0 sudo[154567]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:36 compute-0 sudo[154719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxnekoodgbhortjssbcexpzcakkmlizf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038415.9788432-3302-93525736964169/AnsiballZ_find.py'
Jan 21 23:33:36 compute-0 sudo[154719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:36 compute-0 python3.9[154721]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:33:36 compute-0 sudo[154719]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:37 compute-0 sudo[154881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkxmvzkyvolhivxhnfnljimyrekdjbrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038417.2978485-3344-2839008794220/AnsiballZ_stat.py'
Jan 21 23:33:37 compute-0 sudo[154881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:37 compute-0 podman[154845]: 2026-01-21 23:33:37.632776 +0000 UTC m=+0.060755792 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:33:37 compute-0 python3.9[154888]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:37 compute-0 sudo[154881]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:38 compute-0 sudo[155011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezpzxxdsgtufajxknvzhgceedeoqvfwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038417.2978485-3344-2839008794220/AnsiballZ_copy.py'
Jan 21 23:33:38 compute-0 sudo[155011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:38 compute-0 python3.9[155013]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038417.2978485-3344-2839008794220/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:38 compute-0 sudo[155011]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:39 compute-0 sudo[155163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkvwiiebekcztzburikfzrkgfbjtevad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038419.023899-3392-150963967388369/AnsiballZ_file.py'
Jan 21 23:33:39 compute-0 sudo[155163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:39 compute-0 python3.9[155165]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:39 compute-0 sudo[155163]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:40 compute-0 sudo[155317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imzjkrtptnmppvijlkoroijngigdlugb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038419.8037443-3416-252364954694403/AnsiballZ_stat.py'
Jan 21 23:33:40 compute-0 sudo[155317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:40 compute-0 python3.9[155319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:40 compute-0 sshd-session[155189]: Invalid user webmaster from 188.166.69.60 port 53614
Jan 21 23:33:40 compute-0 sudo[155317]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:40 compute-0 sshd-session[155189]: Connection closed by invalid user webmaster 188.166.69.60 port 53614 [preauth]
Jan 21 23:33:40 compute-0 sudo[155395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjekzqzfevtempdkdallqtvgzpuujpqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038419.8037443-3416-252364954694403/AnsiballZ_file.py'
Jan 21 23:33:40 compute-0 sudo[155395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:40 compute-0 python3.9[155397]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:40 compute-0 sudo[155395]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:41 compute-0 sudo[155547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awfxuicbgvupkqrwmwowtasbukwxhjjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038421.0902183-3452-242300677246344/AnsiballZ_stat.py'
Jan 21 23:33:41 compute-0 sudo[155547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:41 compute-0 python3.9[155549]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:41 compute-0 sudo[155547]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:41 compute-0 sudo[155625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grggulzfaksfafooaksaclmktbcugssg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038421.0902183-3452-242300677246344/AnsiballZ_file.py'
Jan 21 23:33:41 compute-0 sudo[155625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:42 compute-0 python3.9[155627]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.j0ifdp5u recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:42 compute-0 sudo[155625]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:42 compute-0 sudo[155777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nahzfhhcbvbaqwdoauoczxmwrifrxhcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038422.3862944-3488-202941884639398/AnsiballZ_stat.py'
Jan 21 23:33:42 compute-0 sudo[155777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:42 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 21 23:33:42 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 21 23:33:42 compute-0 python3.9[155779]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:42 compute-0 sudo[155777]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:43 compute-0 sudo[155856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yepcoreegfvjflyctfmueljdkawobnhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038422.3862944-3488-202941884639398/AnsiballZ_file.py'
Jan 21 23:33:43 compute-0 sudo[155856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:43 compute-0 python3.9[155858]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:43 compute-0 sudo[155856]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:44 compute-0 sudo[156008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wooywhcmagjpnzewuvgfpzngviqymiof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038423.726781-3527-133734245036274/AnsiballZ_command.py'
Jan 21 23:33:44 compute-0 sudo[156008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:44 compute-0 python3.9[156010]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:33:44 compute-0 sudo[156008]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:45 compute-0 sudo[156161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpybtprnpqjndtdyxjnvwjgqenyjftdw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038424.5293407-3551-156958931539372/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 23:33:45 compute-0 sudo[156161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:45 compute-0 python3[156163]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 23:33:45 compute-0 sudo[156161]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:45 compute-0 sudo[156313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrlzwpahvcytvahpwzsucskfqcgckmvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038425.4325438-3575-252864715259101/AnsiballZ_stat.py'
Jan 21 23:33:45 compute-0 sudo[156313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:45 compute-0 python3.9[156315]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:46 compute-0 sudo[156313]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:46 compute-0 sudo[156391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gufuzxwwmdnvvsdkdaxlniayzbfsbwwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038425.4325438-3575-252864715259101/AnsiballZ_file.py'
Jan 21 23:33:46 compute-0 sudo[156391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:46 compute-0 python3.9[156393]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:46 compute-0 sudo[156391]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:47 compute-0 sudo[156543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vydgssboexthwpjlfpmhmauzoksacazu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038426.7570784-3611-86237125450752/AnsiballZ_stat.py'
Jan 21 23:33:47 compute-0 sudo[156543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:47 compute-0 python3.9[156545]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:47 compute-0 sudo[156543]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:47 compute-0 sudo[156668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nankwvlvrrnjzkrysehggqmttglexckp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038426.7570784-3611-86237125450752/AnsiballZ_copy.py'
Jan 21 23:33:47 compute-0 sudo[156668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:47 compute-0 python3.9[156670]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038426.7570784-3611-86237125450752/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:47 compute-0 sudo[156668]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:48 compute-0 sudo[156820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dukmbdgfypwdfozctysnamkutrfflljw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038428.2429287-3656-209450197231677/AnsiballZ_stat.py'
Jan 21 23:33:48 compute-0 sudo[156820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:48 compute-0 python3.9[156822]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:48 compute-0 sudo[156820]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:48 compute-0 sudo[156898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pehctqfvnrzzybosbefjphwjrbtkuioj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038428.2429287-3656-209450197231677/AnsiballZ_file.py'
Jan 21 23:33:48 compute-0 sudo[156898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:49 compute-0 python3.9[156900]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:49 compute-0 sudo[156898]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:49 compute-0 sudo[157050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmkdtuwjxhcrnsgqqrrfcptcnhxsbeci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038429.5991313-3692-56466022084406/AnsiballZ_stat.py'
Jan 21 23:33:49 compute-0 sudo[157050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:50 compute-0 python3.9[157052]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:50 compute-0 sudo[157050]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:50 compute-0 sudo[157128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shuklqawucqvzodtbpsqymbfpilpzvau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038429.5991313-3692-56466022084406/AnsiballZ_file.py'
Jan 21 23:33:50 compute-0 sudo[157128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:50 compute-0 python3.9[157130]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:50 compute-0 sudo[157128]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:51 compute-0 sudo[157280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdjyleyfocctpvvyqwzenbsbosmygrzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038430.9310074-3728-54041358634722/AnsiballZ_stat.py'
Jan 21 23:33:51 compute-0 sudo[157280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:51 compute-0 python3.9[157282]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:51 compute-0 sudo[157280]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:51 compute-0 sudo[157405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzeqecftesnfhninktrqntcuthoufhjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038430.9310074-3728-54041358634722/AnsiballZ_copy.py'
Jan 21 23:33:51 compute-0 sudo[157405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:52 compute-0 python3.9[157407]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038430.9310074-3728-54041358634722/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:52 compute-0 sudo[157405]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:52 compute-0 sudo[157557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpiogrvedotsdfbmfiidabdwjhtmdapt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038432.606681-3773-82635924831010/AnsiballZ_file.py'
Jan 21 23:33:52 compute-0 sudo[157557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:53 compute-0 python3.9[157559]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:53 compute-0 sudo[157557]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:53 compute-0 sudo[157709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocwafvotuxlqzxmoonwnzvlgzzvqwpki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038433.377826-3797-41008817539996/AnsiballZ_command.py'
Jan 21 23:33:53 compute-0 sudo[157709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:53 compute-0 python3.9[157711]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:33:53 compute-0 sudo[157709]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:54 compute-0 sudo[157864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpymtrqoglsterssxlllvwzclvoumudm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038434.2017052-3821-161183333155656/AnsiballZ_blockinfile.py'
Jan 21 23:33:54 compute-0 sudo[157864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:54 compute-0 python3.9[157866]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:54 compute-0 sudo[157864]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:55 compute-0 sudo[158016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvaybjbqverpsngjlimotyjtpapbvkvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038435.2643135-3848-208423280236799/AnsiballZ_command.py'
Jan 21 23:33:55 compute-0 sudo[158016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:55 compute-0 python3.9[158018]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:33:55 compute-0 sudo[158016]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:56 compute-0 sudo[158169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viumrgsmjzlfwppvpekfpysqpdhpsalj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038436.1084054-3872-11971900226523/AnsiballZ_stat.py'
Jan 21 23:33:56 compute-0 sudo[158169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:56 compute-0 python3.9[158171]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:33:56 compute-0 sudo[158169]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:57 compute-0 sudo[158323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwskeqzdzdgytivzzjzhxrfumoahbjup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038436.8743842-3896-15614379280207/AnsiballZ_command.py'
Jan 21 23:33:57 compute-0 sudo[158323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:57 compute-0 python3.9[158325]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:33:57 compute-0 sudo[158323]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:57 compute-0 sudo[158478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odqoiwgjhbsbebezmfcekffulkpoddtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038437.690703-3920-154260068726481/AnsiballZ_file.py'
Jan 21 23:33:57 compute-0 sudo[158478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:58 compute-0 podman[158480]: 2026-01-21 23:33:58.098862504 +0000 UTC m=+0.110269954 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 21 23:33:58 compute-0 python3.9[158481]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:58 compute-0 sudo[158478]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:58 compute-0 sudo[158657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqzozwozrrtfxttechyadtvsodrvdgni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038438.5507174-3944-138906178076033/AnsiballZ_stat.py'
Jan 21 23:33:58 compute-0 sudo[158657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:59 compute-0 python3.9[158659]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:33:59 compute-0 sudo[158657]: pam_unix(sudo:session): session closed for user root
Jan 21 23:33:59 compute-0 sudo[158780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtcxbqhqbmxmtlvpwqpqfiiufkwgxhvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038438.5507174-3944-138906178076033/AnsiballZ_copy.py'
Jan 21 23:33:59 compute-0 sudo[158780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:33:59 compute-0 python3.9[158782]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038438.5507174-3944-138906178076033/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:33:59 compute-0 sudo[158780]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:00 compute-0 sudo[158932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppiemxgoyetjxffhtqgtqljlhztkbhcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038440.086227-3989-143794237105350/AnsiballZ_stat.py'
Jan 21 23:34:00 compute-0 sudo[158932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:00 compute-0 python3.9[158934]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:34:00 compute-0 sudo[158932]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:00 compute-0 sudo[159055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvtnzlsviahkzfiasozlallauvtwldaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038440.086227-3989-143794237105350/AnsiballZ_copy.py'
Jan 21 23:34:00 compute-0 sudo[159055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:01 compute-0 python3.9[159057]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038440.086227-3989-143794237105350/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:01 compute-0 sudo[159055]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:01 compute-0 sudo[159207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejiwxgkbbjxikpwevokreerfmijqdqjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038441.489182-4034-60390015608259/AnsiballZ_stat.py'
Jan 21 23:34:01 compute-0 sudo[159207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:01 compute-0 python3.9[159209]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:34:01 compute-0 sudo[159207]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:02 compute-0 sudo[159330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulnxelmfotixlgedjxizionewynojjpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038441.489182-4034-60390015608259/AnsiballZ_copy.py'
Jan 21 23:34:02 compute-0 sudo[159330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:02 compute-0 python3.9[159332]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038441.489182-4034-60390015608259/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:02 compute-0 sudo[159330]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:34:03.164 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:34:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:34:03.165 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:34:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:34:03.165 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:34:03 compute-0 sudo[159482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcxxuhhdiljftzwkmplwosegseuydpnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038442.949403-4079-132871096777179/AnsiballZ_systemd.py'
Jan 21 23:34:03 compute-0 sudo[159482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:03 compute-0 python3.9[159484]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:34:03 compute-0 systemd[1]: Reloading.
Jan 21 23:34:03 compute-0 systemd-rc-local-generator[159510]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:34:03 compute-0 systemd-sysv-generator[159514]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:34:03 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Jan 21 23:34:03 compute-0 sudo[159482]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:04 compute-0 sudo[159673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jumlbqypfpykpyuptckyqxzpczactqwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038444.2864478-4103-6444042948583/AnsiballZ_systemd.py'
Jan 21 23:34:04 compute-0 sudo[159673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:04 compute-0 python3.9[159675]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 21 23:34:04 compute-0 systemd[1]: Reloading.
Jan 21 23:34:05 compute-0 systemd-sysv-generator[159705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:34:05 compute-0 systemd-rc-local-generator[159701]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:34:05 compute-0 systemd[1]: Reloading.
Jan 21 23:34:05 compute-0 systemd-rc-local-generator[159736]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:34:05 compute-0 systemd-sysv-generator[159743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:34:05 compute-0 sudo[159673]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:06 compute-0 sshd-session[104956]: Connection closed by 192.168.122.30 port 46194
Jan 21 23:34:06 compute-0 sshd-session[104953]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:34:06 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Jan 21 23:34:06 compute-0 systemd[1]: session-22.scope: Consumed 3min 35.223s CPU time.
Jan 21 23:34:06 compute-0 systemd-logind[784]: Session 22 logged out. Waiting for processes to exit.
Jan 21 23:34:06 compute-0 systemd-logind[784]: Removed session 22.
Jan 21 23:34:08 compute-0 podman[159771]: 2026-01-21 23:34:08.696897092 +0000 UTC m=+0.067301083 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:34:11 compute-0 sshd-session[159791]: Accepted publickey for zuul from 192.168.122.30 port 41928 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:34:11 compute-0 systemd-logind[784]: New session 23 of user zuul.
Jan 21 23:34:11 compute-0 systemd[1]: Started Session 23 of User zuul.
Jan 21 23:34:11 compute-0 sshd-session[159791]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:34:13 compute-0 python3.9[159944]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:34:14 compute-0 python3.9[160098]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:34:14 compute-0 network[160115]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:34:14 compute-0 network[160116]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:34:14 compute-0 network[160117]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:34:19 compute-0 sudo[160386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbuecvzgvtbunxladivgnrevopzjgfpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038459.2260985-101-199493828590936/AnsiballZ_setup.py'
Jan 21 23:34:19 compute-0 sudo[160386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:19 compute-0 python3.9[160388]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 21 23:34:20 compute-0 sudo[160386]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:20 compute-0 sudo[160470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvqhvxkoxkzxnuzfrtcytlfclthsfvbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038459.2260985-101-199493828590936/AnsiballZ_dnf.py'
Jan 21 23:34:20 compute-0 sudo[160470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:20 compute-0 python3.9[160472]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:34:23 compute-0 sshd-session[160474]: Invalid user webmaster from 188.166.69.60 port 53272
Jan 21 23:34:23 compute-0 sshd-session[160474]: Connection closed by invalid user webmaster 188.166.69.60 port 53272 [preauth]
Jan 21 23:34:26 compute-0 sudo[160470]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:27 compute-0 sudo[160625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oocnkefilhehidctdhnifbbptnaqpzrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038466.7962806-137-135925284530727/AnsiballZ_stat.py'
Jan 21 23:34:27 compute-0 sudo[160625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:27 compute-0 python3.9[160627]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:34:27 compute-0 sudo[160625]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:28 compute-0 sudo[160783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxcyrosrmkzrjivycyrbkxdxovqmfdju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038467.8475986-167-32519247247843/AnsiballZ_command.py'
Jan 21 23:34:28 compute-0 sudo[160783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:28 compute-0 podman[160751]: 2026-01-21 23:34:28.627718408 +0000 UTC m=+0.108633002 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 23:34:28 compute-0 python3.9[160791]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:34:28 compute-0 sudo[160783]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:29 compute-0 sudo[160958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znushrklgtkhgcxlurahctmvklbporat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038469.1753962-197-163048696602190/AnsiballZ_stat.py'
Jan 21 23:34:29 compute-0 sudo[160958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:29 compute-0 python3.9[160960]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:34:29 compute-0 sudo[160958]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:30 compute-0 sudo[161110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsarhkoqnrtdauhldqvsqnxemxdjfryc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038470.080804-221-9366977151990/AnsiballZ_command.py'
Jan 21 23:34:30 compute-0 sudo[161110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:30 compute-0 python3.9[161112]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:34:30 compute-0 sudo[161110]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:31 compute-0 sudo[161263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abzixlkkxbbxfbrefzcuiolulbolpvyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038470.895635-245-263536939650994/AnsiballZ_stat.py'
Jan 21 23:34:31 compute-0 sudo[161263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:31 compute-0 python3.9[161265]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:34:31 compute-0 sudo[161263]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:31 compute-0 sudo[161386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-winmwcwrwreionvxozqrnmztvnybobpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038470.895635-245-263536939650994/AnsiballZ_copy.py'
Jan 21 23:34:31 compute-0 sudo[161386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:32 compute-0 python3.9[161388]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038470.895635-245-263536939650994/.source.iscsi _original_basename=.501lfpfe follow=False checksum=2848bc1d743ceafff3b4715059e9b2255df59326 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:32 compute-0 sudo[161386]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:32 compute-0 sudo[161538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjhbbxratanxhpltnbfroxhvgybkylch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038472.4465165-290-231501501908218/AnsiballZ_file.py'
Jan 21 23:34:32 compute-0 sudo[161538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:33 compute-0 python3.9[161540]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:33 compute-0 sudo[161538]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:33 compute-0 sudo[161690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irexinqkpuinemfhhqwqnpphfadvpwjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038473.3922005-314-92885821963506/AnsiballZ_lineinfile.py'
Jan 21 23:34:33 compute-0 sudo[161690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:34 compute-0 python3.9[161692]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:34 compute-0 sudo[161690]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:35 compute-0 sudo[161842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgabfoifngwqoifhysvirrtdqeqjyxyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038474.4140308-341-31942881928225/AnsiballZ_systemd_service.py'
Jan 21 23:34:35 compute-0 sudo[161842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:35 compute-0 python3.9[161844]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:34:35 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 21 23:34:35 compute-0 sudo[161842]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:36 compute-0 sudo[161998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baztaljgtongwdmcaqhoneraxgzduowb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038475.850498-365-232948590048080/AnsiballZ_systemd_service.py'
Jan 21 23:34:36 compute-0 sudo[161998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:36 compute-0 python3.9[162000]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:34:36 compute-0 systemd[1]: Reloading.
Jan 21 23:34:36 compute-0 systemd-rc-local-generator[162030]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:34:36 compute-0 systemd-sysv-generator[162033]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:34:36 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 21 23:34:36 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 21 23:34:36 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Jan 21 23:34:36 compute-0 systemd[1]: Started Open-iSCSI.
Jan 21 23:34:36 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 21 23:34:36 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 21 23:34:36 compute-0 sudo[161998]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:38 compute-0 python3.9[162199]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:34:38 compute-0 network[162216]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:34:38 compute-0 network[162217]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:34:38 compute-0 network[162218]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:34:39 compute-0 podman[162225]: 2026-01-21 23:34:39.044210998 +0000 UTC m=+0.054978883 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 23:34:44 compute-0 sudo[162507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjyanqsuoiyuzclpnrzobahsszrevkkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038483.7710552-434-151877182955945/AnsiballZ_dnf.py'
Jan 21 23:34:44 compute-0 sudo[162507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:44 compute-0 python3.9[162509]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:34:46 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:34:46 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:34:47 compute-0 systemd[1]: Reloading.
Jan 21 23:34:47 compute-0 systemd-rc-local-generator[162555]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:34:47 compute-0 systemd-sysv-generator[162558]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:34:47 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:34:47 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:34:47 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:34:47 compute-0 systemd[1]: run-re6b0cb5b86ff44829fa5f18e4d215a6b.service: Deactivated successfully.
Jan 21 23:34:47 compute-0 sudo[162507]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:48 compute-0 sudo[162823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihvdnsmogjluvnfjmvadvkhdkwuoxrqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038488.3301206-461-106910442550308/AnsiballZ_file.py'
Jan 21 23:34:48 compute-0 sudo[162823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:48 compute-0 python3.9[162825]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 21 23:34:48 compute-0 sudo[162823]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:49 compute-0 sudo[162975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwggngeemgudnvbmjjpiguoevjrbnlpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038489.1755857-485-202725170385687/AnsiballZ_modprobe.py'
Jan 21 23:34:49 compute-0 sudo[162975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:49 compute-0 python3.9[162977]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 21 23:34:49 compute-0 sudo[162975]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:50 compute-0 sudo[163131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubrkuftkmftyjzzihrukujgsvhgeqpfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038490.1263103-509-278613908087372/AnsiballZ_stat.py'
Jan 21 23:34:50 compute-0 sudo[163131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:50 compute-0 python3.9[163133]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:34:50 compute-0 sudo[163131]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:51 compute-0 sudo[163254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffhkewornoimlzysjumqhlhuzwuwaggm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038490.1263103-509-278613908087372/AnsiballZ_copy.py'
Jan 21 23:34:51 compute-0 sudo[163254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:51 compute-0 python3.9[163256]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038490.1263103-509-278613908087372/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:51 compute-0 sudo[163254]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:51 compute-0 sudo[163406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqncbmsgsfbgzrgkywjsjfbnzfmqemyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038491.7034826-557-170190279405208/AnsiballZ_lineinfile.py'
Jan 21 23:34:51 compute-0 sudo[163406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:52 compute-0 python3.9[163408]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:52 compute-0 sudo[163406]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:53 compute-0 sudo[163558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwonobyyazmcepcyabgumwkyiusrnosj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038492.5514357-581-10843876308166/AnsiballZ_systemd.py'
Jan 21 23:34:53 compute-0 sudo[163558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:53 compute-0 python3.9[163560]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:34:53 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 21 23:34:53 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 21 23:34:53 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 21 23:34:53 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 21 23:34:53 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 21 23:34:53 compute-0 sudo[163558]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:55 compute-0 sudo[163715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snztfwmjntlnsurtaiibzuomnxkdpyku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038494.8062935-605-168657926690479/AnsiballZ_command.py'
Jan 21 23:34:55 compute-0 sudo[163715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:55 compute-0 python3.9[163717]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:34:55 compute-0 sudo[163715]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:56 compute-0 sudo[163868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnbgyijwvkdkdqayjqxditqmfppfprvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038495.8240361-635-179509679786915/AnsiballZ_stat.py'
Jan 21 23:34:56 compute-0 sudo[163868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:56 compute-0 python3.9[163870]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:34:56 compute-0 sudo[163868]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:56 compute-0 sudo[164020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zblomwirakltxlyyyyvubbafdmiybdgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038496.7042196-662-246520645660893/AnsiballZ_stat.py'
Jan 21 23:34:56 compute-0 sudo[164020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:57 compute-0 python3.9[164022]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:34:57 compute-0 sudo[164020]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:57 compute-0 sudo[164143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmrcypmhokcajlgezfipkeoqcrgfpcqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038496.7042196-662-246520645660893/AnsiballZ_copy.py'
Jan 21 23:34:57 compute-0 sudo[164143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:57 compute-0 python3.9[164145]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038496.7042196-662-246520645660893/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:57 compute-0 sudo[164143]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:58 compute-0 sudo[164295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umnyoqfwvtarpgexygdeamsfdcnsbnun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038498.3383112-707-244255926702063/AnsiballZ_command.py'
Jan 21 23:34:58 compute-0 sudo[164295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:58 compute-0 podman[164297]: 2026-01-21 23:34:58.785855901 +0000 UTC m=+0.111898574 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 21 23:34:58 compute-0 python3.9[164298]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:34:58 compute-0 sudo[164295]: pam_unix(sudo:session): session closed for user root
Jan 21 23:34:59 compute-0 sudo[164475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpidhavolsfbfgxlewuqwxepsivesjaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038499.44608-731-145411130982834/AnsiballZ_lineinfile.py'
Jan 21 23:34:59 compute-0 sudo[164475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:34:59 compute-0 python3.9[164477]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:34:59 compute-0 sudo[164475]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:00 compute-0 sudo[164627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdcbpzmyhyfhwmaztryugtsmmpaqmqem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038500.2906308-755-217113522956905/AnsiballZ_replace.py'
Jan 21 23:35:00 compute-0 sudo[164627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:00 compute-0 python3.9[164629]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:01 compute-0 sudo[164627]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:01 compute-0 sudo[164779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vugebdxugrqcgsbbckjmzsnyehpwxkql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038501.1998866-779-13987683112502/AnsiballZ_replace.py'
Jan 21 23:35:01 compute-0 sudo[164779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:01 compute-0 python3.9[164781]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:01 compute-0 sudo[164779]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:02 compute-0 sudo[164931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuthoefmhdblsksmscjzxdzxwebxcwod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038502.1354282-806-274115984930894/AnsiballZ_lineinfile.py'
Jan 21 23:35:02 compute-0 sudo[164931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:02 compute-0 python3.9[164933]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:02 compute-0 sudo[164931]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:03 compute-0 sudo[165083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akagtpdkgoiqyqyvghinelsyzmkwlpun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038502.7470255-806-105877120680630/AnsiballZ_lineinfile.py'
Jan 21 23:35:03 compute-0 sudo[165083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:35:03.166 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:35:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:35:03.167 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:35:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:35:03.167 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:35:03 compute-0 python3.9[165085]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:03 compute-0 sudo[165083]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:03 compute-0 sudo[165235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncheryevqkjhqjcwiotgccyubyqebpvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038503.333524-806-251442950495635/AnsiballZ_lineinfile.py'
Jan 21 23:35:03 compute-0 sudo[165235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:03 compute-0 python3.9[165237]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:03 compute-0 sudo[165235]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:04 compute-0 sudo[165387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmxkednpsalvjydhrckatihxfsfcposw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038503.9893699-806-220406146205456/AnsiballZ_lineinfile.py'
Jan 21 23:35:04 compute-0 sudo[165387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:04 compute-0 python3.9[165389]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:04 compute-0 sudo[165387]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:05 compute-0 sudo[165539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzqxyxsrfslckpledvcpzmrgdpqfpqom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038505.4462817-893-65822407411971/AnsiballZ_stat.py'
Jan 21 23:35:05 compute-0 sudo[165539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:05 compute-0 python3.9[165541]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:35:06 compute-0 sudo[165539]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:06 compute-0 sudo[165693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwvgwcaxgcdzotbnmjahpmngnluiyqaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038506.3166485-917-214300403658967/AnsiballZ_command.py'
Jan 21 23:35:06 compute-0 sudo[165693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:06 compute-0 python3.9[165695]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:35:06 compute-0 sudo[165693]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:07 compute-0 sshd-session[165720]: Invalid user webmaster from 188.166.69.60 port 43402
Jan 21 23:35:07 compute-0 sshd-session[165720]: Connection closed by invalid user webmaster 188.166.69.60 port 43402 [preauth]
Jan 21 23:35:07 compute-0 sudo[165848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsuhzalbwcnbqtepmuanmuqylzhnloqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038507.287442-944-52078710240006/AnsiballZ_systemd_service.py'
Jan 21 23:35:07 compute-0 sudo[165848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:08 compute-0 python3.9[165850]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:08 compute-0 systemd[1]: Listening on multipathd control socket.
Jan 21 23:35:09 compute-0 sudo[165848]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:09 compute-0 podman[165879]: 2026-01-21 23:35:09.711833992 +0000 UTC m=+0.070553788 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:35:10 compute-0 sudo[166023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvkvulkqejoaiqefutjfvfwhohhyujun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038509.7065957-968-240713817451703/AnsiballZ_systemd_service.py'
Jan 21 23:35:10 compute-0 sudo[166023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:10 compute-0 python3.9[166025]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:10 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 21 23:35:10 compute-0 udevadm[166030]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 21 23:35:10 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 21 23:35:10 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 21 23:35:10 compute-0 multipathd[166033]: --------start up--------
Jan 21 23:35:10 compute-0 multipathd[166033]: read /etc/multipath.conf
Jan 21 23:35:10 compute-0 multipathd[166033]: path checkers start up
Jan 21 23:35:10 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 21 23:35:10 compute-0 sudo[166023]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:11 compute-0 sudo[166190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seqtivgvnyopxvnixxcyxfechxndymqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038511.3087893-1004-270180515304567/AnsiballZ_file.py'
Jan 21 23:35:11 compute-0 sudo[166190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:12 compute-0 python3.9[166192]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 21 23:35:12 compute-0 sudo[166190]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:12 compute-0 sudo[166342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uljazgmneighjycuoxxkuqiocsltokmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038512.2639964-1028-18557624458664/AnsiballZ_modprobe.py'
Jan 21 23:35:12 compute-0 sudo[166342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:12 compute-0 python3.9[166344]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 21 23:35:12 compute-0 kernel: Key type psk registered
Jan 21 23:35:12 compute-0 sudo[166342]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:13 compute-0 sudo[166508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvvslkoflqwyglpymxsupnqloflzocbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038513.2380779-1052-250536115178070/AnsiballZ_stat.py'
Jan 21 23:35:13 compute-0 sudo[166508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:13 compute-0 python3.9[166510]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:35:13 compute-0 sudo[166508]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:14 compute-0 sudo[166631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdbugelrgrskuclqurfownccrjvczzfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038513.2380779-1052-250536115178070/AnsiballZ_copy.py'
Jan 21 23:35:14 compute-0 sudo[166631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:14 compute-0 python3.9[166633]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038513.2380779-1052-250536115178070/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:14 compute-0 sudo[166631]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:15 compute-0 sudo[166783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aksplecoolbulteczeavjplxipggbqhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038514.8667245-1100-136994697447096/AnsiballZ_lineinfile.py'
Jan 21 23:35:15 compute-0 sudo[166783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:15 compute-0 python3.9[166785]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:15 compute-0 sudo[166783]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:16 compute-0 sudo[166935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbhumknenwuifqkexbxkrvzlrfgerfty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038515.7409735-1124-266235503660346/AnsiballZ_systemd.py'
Jan 21 23:35:16 compute-0 sudo[166935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:16 compute-0 python3.9[166937]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:35:16 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 21 23:35:16 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 21 23:35:16 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 21 23:35:16 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 21 23:35:16 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 21 23:35:16 compute-0 sudo[166935]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:17 compute-0 sudo[167091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igyegynwkvatyoccnqmudsisjvcxkntv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038517.2070296-1148-180696401493912/AnsiballZ_dnf.py'
Jan 21 23:35:17 compute-0 sudo[167091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:17 compute-0 python3.9[167093]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 21 23:35:19 compute-0 systemd[1]: Reloading.
Jan 21 23:35:19 compute-0 systemd-rc-local-generator[167126]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:35:19 compute-0 systemd-sysv-generator[167129]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:35:20 compute-0 systemd[1]: Reloading.
Jan 21 23:35:20 compute-0 systemd-rc-local-generator[167159]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:35:20 compute-0 systemd-sysv-generator[167163]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:35:20 compute-0 systemd-logind[784]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 21 23:35:20 compute-0 systemd-logind[784]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 21 23:35:20 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 21 23:35:20 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 21 23:35:20 compute-0 systemd[1]: Reloading.
Jan 21 23:35:20 compute-0 systemd-rc-local-generator[167254]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:35:20 compute-0 systemd-sysv-generator[167257]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:35:21 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 21 23:35:21 compute-0 sudo[167091]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:22 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 21 23:35:22 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 21 23:35:22 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.745s CPU time.
Jan 21 23:35:22 compute-0 systemd[1]: run-re2dc098d85c94af6a11e68160c31e4a9.service: Deactivated successfully.
Jan 21 23:35:22 compute-0 sudo[168556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbndejoaiimcrnrswkvtjdhidlfhkvdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038522.5071814-1172-208418555832300/AnsiballZ_systemd_service.py'
Jan 21 23:35:22 compute-0 sudo[168556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:23 compute-0 python3.9[168558]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:35:23 compute-0 systemd[1]: Stopping Open-iSCSI...
Jan 21 23:35:23 compute-0 iscsid[162040]: iscsid shutting down.
Jan 21 23:35:23 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Jan 21 23:35:23 compute-0 systemd[1]: Stopped Open-iSCSI.
Jan 21 23:35:23 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 21 23:35:23 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 21 23:35:23 compute-0 systemd[1]: Started Open-iSCSI.
Jan 21 23:35:23 compute-0 sudo[168556]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:23 compute-0 sudo[168712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqsyktyxppoavyefatlqnlcjzajaonsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038523.5713856-1196-24762854864878/AnsiballZ_systemd_service.py'
Jan 21 23:35:23 compute-0 sudo[168712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:24 compute-0 python3.9[168714]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:35:24 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 21 23:35:24 compute-0 multipathd[166033]: exit (signal)
Jan 21 23:35:24 compute-0 multipathd[166033]: --------shut down-------
Jan 21 23:35:24 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Jan 21 23:35:24 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 21 23:35:24 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 21 23:35:24 compute-0 multipathd[168720]: --------start up--------
Jan 21 23:35:24 compute-0 multipathd[168720]: read /etc/multipath.conf
Jan 21 23:35:24 compute-0 multipathd[168720]: path checkers start up
Jan 21 23:35:24 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 21 23:35:24 compute-0 sudo[168712]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:25 compute-0 python3.9[168877]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:35:26 compute-0 sudo[169031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shqvfochubclynomkwzjncxjwmwzuhar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038526.1954591-1248-131365781423977/AnsiballZ_file.py'
Jan 21 23:35:26 compute-0 sudo[169031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:26 compute-0 python3.9[169033]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:26 compute-0 sudo[169031]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:27 compute-0 sudo[169183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvkadhmvvcwcfpouvgpopuolihojljvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038527.5635061-1281-137643978355710/AnsiballZ_systemd_service.py'
Jan 21 23:35:27 compute-0 sudo[169183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:28 compute-0 python3.9[169185]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:35:28 compute-0 systemd[1]: Reloading.
Jan 21 23:35:28 compute-0 systemd-sysv-generator[169213]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:35:28 compute-0 systemd-rc-local-generator[169210]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:35:28 compute-0 sudo[169183]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:29 compute-0 podman[169344]: 2026-01-21 23:35:29.19971184 +0000 UTC m=+0.093362740 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 21 23:35:29 compute-0 python3.9[169381]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:35:29 compute-0 network[169414]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:35:29 compute-0 network[169415]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:35:29 compute-0 network[169416]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:35:30 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 21 23:35:31 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 21 23:35:32 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 21 23:35:34 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 21 23:35:38 compute-0 sudo[169690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-expegapiaumjdrkthfbcjrodqofyngli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038537.6839771-1338-214349206288980/AnsiballZ_systemd_service.py'
Jan 21 23:35:38 compute-0 sudo[169690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:38 compute-0 python3.9[169692]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:38 compute-0 sudo[169690]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:38 compute-0 sudo[169843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrwjlvxpulyxxvgvyulptltzcbjbwwkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038538.593289-1338-7444443576262/AnsiballZ_systemd_service.py'
Jan 21 23:35:38 compute-0 sudo[169843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:39 compute-0 python3.9[169845]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:39 compute-0 sudo[169843]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:39 compute-0 sudo[170007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-somkawipijmvrwkqchlidmmoqqturxmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038539.4206393-1338-189370518631930/AnsiballZ_systemd_service.py'
Jan 21 23:35:39 compute-0 sudo[170007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:39 compute-0 podman[169970]: 2026-01-21 23:35:39.896825869 +0000 UTC m=+0.097598216 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 23:35:40 compute-0 python3.9[170009]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:40 compute-0 sudo[170007]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:40 compute-0 sudo[170166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzcdunqhgbvsyyxmzzgandtogoysgpgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038540.3553514-1338-98338060803631/AnsiballZ_systemd_service.py'
Jan 21 23:35:40 compute-0 sudo[170166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:41 compute-0 python3.9[170168]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:41 compute-0 sudo[170166]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:41 compute-0 sudo[170319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aappamhiqdnlriymqonpmrwmhtjdlocr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038541.25416-1338-158881627948960/AnsiballZ_systemd_service.py'
Jan 21 23:35:41 compute-0 sudo[170319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:41 compute-0 python3.9[170321]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:41 compute-0 sudo[170319]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:42 compute-0 sudo[170472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbqqcsibotcpilqtdawbvfqibryjdvte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038542.1930163-1338-79637463617327/AnsiballZ_systemd_service.py'
Jan 21 23:35:42 compute-0 sudo[170472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:42 compute-0 python3.9[170474]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:42 compute-0 sudo[170472]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:43 compute-0 sudo[170625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhvsrxfdobuieoplxedpvlppahsmugrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038543.0631158-1338-201226675771383/AnsiballZ_systemd_service.py'
Jan 21 23:35:43 compute-0 sudo[170625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:43 compute-0 python3.9[170627]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:43 compute-0 sudo[170625]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:44 compute-0 sudo[170778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyecmwghkdipbgyerbcpizbxcebdazcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038543.8951228-1338-145466015010464/AnsiballZ_systemd_service.py'
Jan 21 23:35:44 compute-0 sudo[170778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:44 compute-0 python3.9[170780]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:35:44 compute-0 sudo[170778]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:46 compute-0 sudo[170931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzgqyolpnyizldroasxlaewfwgizfiou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038546.0203836-1515-243204366661824/AnsiballZ_file.py'
Jan 21 23:35:46 compute-0 sudo[170931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:46 compute-0 python3.9[170933]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:46 compute-0 sudo[170931]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:47 compute-0 sudo[171083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlhcwqjwaehlehgeeiggknpydkxmkyyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038546.9933448-1515-131433862366685/AnsiballZ_file.py'
Jan 21 23:35:47 compute-0 sudo[171083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:47 compute-0 python3.9[171085]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:47 compute-0 sudo[171083]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:48 compute-0 sudo[171235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdkqzgrkwpkfexjvtqbsxfhmvrgwxhey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038547.8318033-1515-96478267564588/AnsiballZ_file.py'
Jan 21 23:35:48 compute-0 sudo[171235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:48 compute-0 python3.9[171237]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:48 compute-0 sudo[171235]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:48 compute-0 sudo[171387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kphvafxjnjvlexbdqarnhoykykvfscpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038548.586776-1515-40822375894793/AnsiballZ_file.py'
Jan 21 23:35:48 compute-0 sudo[171387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:49 compute-0 python3.9[171389]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:49 compute-0 sudo[171387]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:49 compute-0 sudo[171539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmhtbdsnwnhcnmywaxibhxmmmsezofmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038549.3281457-1515-264288276993999/AnsiballZ_file.py'
Jan 21 23:35:49 compute-0 sudo[171539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:49 compute-0 python3.9[171541]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:49 compute-0 sudo[171539]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:50 compute-0 sudo[171693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuhwifegtvgmyfhupaqldjsrlkhiexnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038549.9829493-1515-97607296327861/AnsiballZ_file.py'
Jan 21 23:35:50 compute-0 sudo[171693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:50 compute-0 python3.9[171695]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:50 compute-0 sudo[171693]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:50 compute-0 sshd-session[171680]: Invalid user webmaster from 188.166.69.60 port 45214
Jan 21 23:35:50 compute-0 sshd-session[171680]: Connection closed by invalid user webmaster 188.166.69.60 port 45214 [preauth]
Jan 21 23:35:50 compute-0 sudo[171845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzzddoivtkziejaohncmpczluhctwxmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038550.6491096-1515-216005452667144/AnsiballZ_file.py'
Jan 21 23:35:50 compute-0 sudo[171845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:51 compute-0 python3.9[171847]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:51 compute-0 sudo[171845]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:52 compute-0 sudo[171997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejqtjnhyphueayqieqwbkghfnlamyxwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038551.3535802-1515-44030537612894/AnsiballZ_file.py'
Jan 21 23:35:52 compute-0 sudo[171997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:52 compute-0 python3.9[171999]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:52 compute-0 sudo[171997]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:54 compute-0 sudo[172149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drrjqzcqsaynovgtbskvfnoucmryvgdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038553.8485847-1686-230398346149370/AnsiballZ_file.py'
Jan 21 23:35:54 compute-0 sudo[172149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:54 compute-0 python3.9[172151]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:54 compute-0 sudo[172149]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:55 compute-0 sudo[172301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrknsqweofjgqgrviozivgkemdesjxqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038554.680281-1686-281392900817487/AnsiballZ_file.py'
Jan 21 23:35:55 compute-0 sudo[172301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:55 compute-0 python3.9[172303]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:55 compute-0 sudo[172301]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:55 compute-0 sudo[172453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dupuzltbqekyfyyvarlllpvbqaykdhkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038555.374449-1686-38066979899459/AnsiballZ_file.py'
Jan 21 23:35:55 compute-0 sudo[172453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:55 compute-0 python3.9[172455]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:55 compute-0 sudo[172453]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:56 compute-0 sudo[172605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxzdtemrhuwomftkwhpobsmksjuudhrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038556.333924-1686-123568813163643/AnsiballZ_file.py'
Jan 21 23:35:56 compute-0 sudo[172605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:56 compute-0 python3.9[172607]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:56 compute-0 sudo[172605]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:57 compute-0 sudo[172757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flnxzrlluuiwjlrwiyobonoviolmyjjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038557.0735373-1686-196663208602690/AnsiballZ_file.py'
Jan 21 23:35:57 compute-0 sudo[172757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:57 compute-0 python3.9[172759]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:57 compute-0 sudo[172757]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:58 compute-0 sudo[172909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqkpfhnuedmdwbtddxbvtxiuefstbbxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038557.805043-1686-7445003406937/AnsiballZ_file.py'
Jan 21 23:35:58 compute-0 sudo[172909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:58 compute-0 python3.9[172911]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:58 compute-0 sudo[172909]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:58 compute-0 sudo[173061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxjkfhayjieoqfatxiuroefjrqtobfgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038558.4749157-1686-179055337156649/AnsiballZ_file.py'
Jan 21 23:35:58 compute-0 sudo[173061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:58 compute-0 python3.9[173063]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:58 compute-0 sudo[173061]: pam_unix(sudo:session): session closed for user root
Jan 21 23:35:59 compute-0 sudo[173226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meawcnxmbqabypdrqkdkmdqxncplrlbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038559.0847433-1686-161772696861497/AnsiballZ_file.py'
Jan 21 23:35:59 compute-0 sudo[173226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:35:59 compute-0 podman[173187]: 2026-01-21 23:35:59.438625928 +0000 UTC m=+0.081802321 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 21 23:35:59 compute-0 python3.9[173236]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:35:59 compute-0 sudo[173226]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:01 compute-0 sudo[173393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbfzivizlholazuwdbvbtqnlmpgjmckj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038560.7390893-1860-142486727791721/AnsiballZ_command.py'
Jan 21 23:36:01 compute-0 sudo[173393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:01 compute-0 python3.9[173395]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:01 compute-0 sudo[173393]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:02 compute-0 python3.9[173547]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:36:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:36:03.167 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:36:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:36:03.170 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:36:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:36:03.170 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:36:03 compute-0 sudo[173697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hogojvgjfbdfqerehaxszttucftswcaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038563.1965015-1914-24187669480455/AnsiballZ_systemd_service.py'
Jan 21 23:36:03 compute-0 sudo[173697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:04 compute-0 python3.9[173699]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:36:04 compute-0 systemd[1]: Reloading.
Jan 21 23:36:04 compute-0 systemd-rc-local-generator[173724]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:36:04 compute-0 systemd-sysv-generator[173728]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:36:04 compute-0 sudo[173697]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:05 compute-0 sudo[173883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlzamefolgilriwkqdezvnedsmmrlxbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038565.0869603-1938-182537260516850/AnsiballZ_command.py'
Jan 21 23:36:05 compute-0 sudo[173883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:05 compute-0 python3.9[173885]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:05 compute-0 sudo[173883]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:06 compute-0 sudo[174036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifhvmhovnmvhchwrzxakaptmslwyitqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038565.9464908-1938-125729467702381/AnsiballZ_command.py'
Jan 21 23:36:06 compute-0 sudo[174036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:06 compute-0 python3.9[174038]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:06 compute-0 sudo[174036]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:07 compute-0 sudo[174189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfjlxkohfjwkxtzltsbuclcxcscavrgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038566.6631122-1938-49289522211976/AnsiballZ_command.py'
Jan 21 23:36:07 compute-0 sudo[174189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:07 compute-0 python3.9[174191]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:07 compute-0 sudo[174189]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:07 compute-0 sudo[174342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gynwhssnkokzdpuvukvannnldobpvzjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038567.3840854-1938-149526058681779/AnsiballZ_command.py'
Jan 21 23:36:07 compute-0 sudo[174342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:07 compute-0 python3.9[174344]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:07 compute-0 sudo[174342]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:08 compute-0 sudo[174495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlapsbcnnperrkkcxuipabhsputvuvvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038568.1419024-1938-29908169107490/AnsiballZ_command.py'
Jan 21 23:36:08 compute-0 sudo[174495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:08 compute-0 python3.9[174497]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:08 compute-0 sudo[174495]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:09 compute-0 sudo[174648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srzgdqbkluqbsrjdezzojiuikxjajvqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038568.8035917-1938-60928510023063/AnsiballZ_command.py'
Jan 21 23:36:09 compute-0 sudo[174648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:09 compute-0 python3.9[174650]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:09 compute-0 sudo[174648]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:09 compute-0 sudo[174801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhlzofczpsjwetgtohhmzigwmhfyxnwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038569.4842012-1938-273116456897618/AnsiballZ_command.py'
Jan 21 23:36:09 compute-0 sudo[174801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:09 compute-0 python3.9[174803]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:10 compute-0 sudo[174801]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:10 compute-0 podman[174805]: 2026-01-21 23:36:10.081134452 +0000 UTC m=+0.076813984 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:36:10 compute-0 sudo[174972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsdizdeirmzlmpduxsehulmfjnnbmksl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038570.1751406-1938-56731108566353/AnsiballZ_command.py'
Jan 21 23:36:10 compute-0 sudo[174972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:10 compute-0 python3.9[174974]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:36:10 compute-0 sudo[174972]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:13 compute-0 sudo[175125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fttipabtmzbpfuoleszmyrpbmdfxntfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038572.8898242-2145-97106659069473/AnsiballZ_file.py'
Jan 21 23:36:13 compute-0 sudo[175125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:13 compute-0 python3.9[175127]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:13 compute-0 sudo[175125]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:13 compute-0 sudo[175277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obkxupzcewjbzjggtemicttgjaodyfjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038573.605806-2145-233911661942471/AnsiballZ_file.py'
Jan 21 23:36:13 compute-0 sudo[175277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:14 compute-0 python3.9[175279]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:14 compute-0 sudo[175277]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:14 compute-0 sudo[175429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwlqterjebpwhyczpooqekgizzfgvgrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038574.2672403-2145-105011013693474/AnsiballZ_file.py'
Jan 21 23:36:14 compute-0 sudo[175429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:14 compute-0 python3.9[175431]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:14 compute-0 sudo[175429]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:15 compute-0 sudo[175581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcsqgvjfwjkdfjlcgvuxhscfkjlmeagp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038575.3959262-2211-175409577753987/AnsiballZ_file.py'
Jan 21 23:36:15 compute-0 sudo[175581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:15 compute-0 python3.9[175583]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:15 compute-0 sudo[175581]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:16 compute-0 sudo[175733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbuxsvcvrzwkcwlsfagcnnscjumarhlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038576.107877-2211-69241265747965/AnsiballZ_file.py'
Jan 21 23:36:16 compute-0 sudo[175733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:16 compute-0 python3.9[175735]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:16 compute-0 sudo[175733]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:17 compute-0 sudo[175885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnjdpdesqadnfjksohxkbresdlzfgwct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038576.8799438-2211-98068889911208/AnsiballZ_file.py'
Jan 21 23:36:17 compute-0 sudo[175885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:17 compute-0 python3.9[175887]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:17 compute-0 sudo[175885]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:17 compute-0 sudo[176037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyovnaiqgwgnmbhwarlgnczzsibhnjrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038577.6424184-2211-4278814879370/AnsiballZ_file.py'
Jan 21 23:36:17 compute-0 sudo[176037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:18 compute-0 python3.9[176039]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:18 compute-0 sudo[176037]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:18 compute-0 sudo[176189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oonafokojxluzmrardxqextfbbkepvjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038578.346045-2211-161861322882731/AnsiballZ_file.py'
Jan 21 23:36:18 compute-0 sudo[176189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:18 compute-0 python3.9[176191]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:18 compute-0 sudo[176189]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:19 compute-0 sudo[176341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enoxwxtmojnmxydeeurmdlmdcajzznid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038579.023127-2211-197262009849114/AnsiballZ_file.py'
Jan 21 23:36:19 compute-0 sudo[176341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:19 compute-0 python3.9[176343]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:19 compute-0 sudo[176341]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:19 compute-0 sudo[176493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xihjwdfjgooygpovgnyhrevfmqtslpwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038579.6328418-2211-94357720065882/AnsiballZ_file.py'
Jan 21 23:36:19 compute-0 sudo[176493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:20 compute-0 python3.9[176495]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:20 compute-0 sudo[176493]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:26 compute-0 sudo[176645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdzwqksiunrawyptmzfrfuhqpfxtllvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038585.4163373-2516-196485557417386/AnsiballZ_getent.py'
Jan 21 23:36:26 compute-0 sudo[176645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:26 compute-0 python3.9[176647]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 21 23:36:26 compute-0 sudo[176645]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:27 compute-0 sudo[176798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etlayzxajudkrsvyabcqppszawshufdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038586.5679214-2540-168746350866763/AnsiballZ_group.py'
Jan 21 23:36:27 compute-0 sudo[176798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:27 compute-0 python3.9[176800]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:36:27 compute-0 groupadd[176801]: group added to /etc/group: name=nova, GID=42436
Jan 21 23:36:27 compute-0 groupadd[176801]: group added to /etc/gshadow: name=nova
Jan 21 23:36:27 compute-0 groupadd[176801]: new group: name=nova, GID=42436
Jan 21 23:36:27 compute-0 sudo[176798]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:28 compute-0 sudo[176956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbnkvseesljppajyzpyprbeidtbvhttn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038587.7569618-2564-71160526669904/AnsiballZ_user.py'
Jan 21 23:36:28 compute-0 sudo[176956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:28 compute-0 python3.9[176958]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 23:36:28 compute-0 useradd[176960]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 21 23:36:28 compute-0 useradd[176960]: add 'nova' to group 'libvirt'
Jan 21 23:36:28 compute-0 useradd[176960]: add 'nova' to shadow group 'libvirt'
Jan 21 23:36:28 compute-0 sudo[176956]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:29 compute-0 podman[176991]: 2026-01-21 23:36:29.761253341 +0000 UTC m=+0.130062732 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:36:29 compute-0 sshd-session[177017]: Accepted publickey for zuul from 192.168.122.30 port 50878 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:36:29 compute-0 systemd-logind[784]: New session 24 of user zuul.
Jan 21 23:36:29 compute-0 systemd[1]: Started Session 24 of User zuul.
Jan 21 23:36:29 compute-0 sshd-session[177017]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:36:30 compute-0 sshd-session[177020]: Received disconnect from 192.168.122.30 port 50878:11: disconnected by user
Jan 21 23:36:30 compute-0 sshd-session[177020]: Disconnected from user zuul 192.168.122.30 port 50878
Jan 21 23:36:30 compute-0 sshd-session[177017]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:36:30 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Jan 21 23:36:30 compute-0 systemd-logind[784]: Session 24 logged out. Waiting for processes to exit.
Jan 21 23:36:30 compute-0 systemd-logind[784]: Removed session 24.
Jan 21 23:36:30 compute-0 python3.9[177170]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:31 compute-0 python3.9[177291]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038590.297005-2639-2411075105911/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:32 compute-0 python3.9[177441]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:32 compute-0 python3.9[177517]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:33 compute-0 python3.9[177669]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:33 compute-0 sshd-session[177617]: Invalid user webmaster from 188.166.69.60 port 43380
Jan 21 23:36:33 compute-0 sshd-session[177617]: Connection closed by invalid user webmaster 188.166.69.60 port 43380 [preauth]
Jan 21 23:36:33 compute-0 python3.9[177790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038592.693326-2639-55392702006369/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:34 compute-0 python3.9[177940]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:34 compute-0 python3.9[178061]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038593.9058821-2639-120868483540870/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:35 compute-0 python3.9[178211]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:36 compute-0 python3.9[178332]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038595.1211338-2639-217216935370768/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:36 compute-0 python3.9[178482]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:37 compute-0 python3.9[178603]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038596.3129854-2639-173187536683530/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:39 compute-0 sudo[178753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljexanfetuhesssypfbmgznbdgrclulw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038599.3615992-2888-195470247723455/AnsiballZ_file.py'
Jan 21 23:36:39 compute-0 sudo[178753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:39 compute-0 python3.9[178755]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:36:39 compute-0 sudo[178753]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:40 compute-0 sudo[178911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qenzaxuugiqcjlapmaklwwmeftqqupyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038600.3231213-2912-269508014160019/AnsiballZ_copy.py'
Jan 21 23:36:40 compute-0 sudo[178911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:40 compute-0 podman[178879]: 2026-01-21 23:36:40.735897349 +0000 UTC m=+0.090902823 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 21 23:36:40 compute-0 python3.9[178918]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:36:40 compute-0 sudo[178911]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:41 compute-0 sudo[179076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txpvxzwpypxclwaxbwxowoveovbauwlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038601.1660368-2936-31088238580578/AnsiballZ_stat.py'
Jan 21 23:36:41 compute-0 sudo[179076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:41 compute-0 python3.9[179078]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:36:41 compute-0 sudo[179076]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:42 compute-0 sudo[179228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leciyxalwbfnkrdsuxbnywuzmejuxfeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038602.0840852-2960-582299458169/AnsiballZ_stat.py'
Jan 21 23:36:42 compute-0 sudo[179228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:42 compute-0 python3.9[179230]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:42 compute-0 sudo[179228]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:42 compute-0 sudo[179351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wygubqkprrogfhkbrhwpzmdbmmnccpyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038602.0840852-2960-582299458169/AnsiballZ_copy.py'
Jan 21 23:36:42 compute-0 sudo[179351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:43 compute-0 python3.9[179353]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769038602.0840852-2960-582299458169/.source _original_basename=.r3ums8_p follow=False checksum=580023fa8e7277a349afcdb331e326e9dd49d7d9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 21 23:36:43 compute-0 sudo[179351]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:44 compute-0 python3.9[179505]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:36:45 compute-0 python3.9[179657]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:46 compute-0 python3.9[179778]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038604.8928165-3038-156559721121357/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:46 compute-0 python3.9[179928]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:36:47 compute-0 python3.9[180049]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038606.3081431-3083-88257318408653/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:36:48 compute-0 sudo[180199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqdeklbnfdvkeqqmuuocfekrjcjtsigc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038607.9506466-3134-151521722432773/AnsiballZ_container_config_data.py'
Jan 21 23:36:48 compute-0 sudo[180199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:48 compute-0 python3.9[180201]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 21 23:36:48 compute-0 sudo[180199]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:49 compute-0 sudo[180351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgnlkwmrefwrgsldfuifuwpyidtfxmgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038609.1721473-3167-120598022398803/AnsiballZ_container_config_hash.py'
Jan 21 23:36:49 compute-0 sudo[180351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:50 compute-0 python3.9[180353]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:36:50 compute-0 sudo[180351]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:51 compute-0 sudo[180503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctuuuilljdjvdjexwwvkdtkpdzoankwm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038610.4720547-3197-157884269198450/AnsiballZ_edpm_container_manage.py'
Jan 21 23:36:51 compute-0 sudo[180503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:51 compute-0 python3[180505]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:36:51 compute-0 podman[180541]: 2026-01-21 23:36:51.645186396 +0000 UTC m=+0.071529942 container create 57a2793b9507afbbae2f0fa7609bf86cfb9c78449891b23b28ed40274e2afb2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:36:51 compute-0 podman[180541]: 2026-01-21 23:36:51.606136719 +0000 UTC m=+0.032480355 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 23:36:51 compute-0 python3[180505]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 21 23:36:51 compute-0 sudo[180503]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:52 compute-0 sudo[180729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkswjorvyfflqegkjjfkvekvinfrxsvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038612.0824893-3221-187120595969327/AnsiballZ_stat.py'
Jan 21 23:36:52 compute-0 sudo[180729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:52 compute-0 python3.9[180731]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:36:52 compute-0 sudo[180729]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:53 compute-0 sudo[180883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmyweaywqwpspglotkdxmpfjdchfhnzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038613.445242-3257-253457312127341/AnsiballZ_container_config_data.py'
Jan 21 23:36:53 compute-0 sudo[180883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:54 compute-0 python3.9[180885]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 21 23:36:54 compute-0 sudo[180883]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:54 compute-0 sudo[181035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpxvuzfqyesbxtjzljpakdsddcqteine ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038614.5535629-3290-87955756450461/AnsiballZ_container_config_hash.py'
Jan 21 23:36:54 compute-0 sudo[181035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:55 compute-0 python3.9[181037]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:36:55 compute-0 sudo[181035]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:56 compute-0 sudo[181187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykttqywugwirgqahldpzohmxelkwonym ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038615.6877275-3320-152406208087851/AnsiballZ_edpm_container_manage.py'
Jan 21 23:36:56 compute-0 sudo[181187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:56 compute-0 python3[181189]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:36:56 compute-0 podman[181227]: 2026-01-21 23:36:56.534569321 +0000 UTC m=+0.062184050 container create 7993ee99205093b4f1302d55d9d75ff9bf0d4afb00cc9b357f8dc805265d5033 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:36:56 compute-0 podman[181227]: 2026-01-21 23:36:56.502867357 +0000 UTC m=+0.030482156 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 21 23:36:56 compute-0 python3[181189]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 21 23:36:56 compute-0 sudo[181187]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:57 compute-0 sudo[181415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqzwukintvfamvolqqabnunhbosjirui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038617.0714564-3344-187391933058543/AnsiballZ_stat.py'
Jan 21 23:36:57 compute-0 sudo[181415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:57 compute-0 python3.9[181417]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:36:57 compute-0 sudo[181415]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:58 compute-0 sudo[181569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwpevsnqwjxljqrjyylkgqqhoegektfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038618.0382907-3371-174016120847235/AnsiballZ_file.py'
Jan 21 23:36:58 compute-0 sudo[181569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:58 compute-0 python3.9[181571]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:36:58 compute-0 sudo[181569]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:59 compute-0 sudo[181720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baeferosauasqudklijixnqnyudteqys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038618.5941317-3371-246664137985275/AnsiballZ_copy.py'
Jan 21 23:36:59 compute-0 sudo[181720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:59 compute-0 python3.9[181722]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038618.5941317-3371-246664137985275/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:36:59 compute-0 sudo[181720]: pam_unix(sudo:session): session closed for user root
Jan 21 23:36:59 compute-0 sudo[181796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sglkgmcedzwpmwdbtecwqplemyrjemvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038618.5941317-3371-246664137985275/AnsiballZ_systemd.py'
Jan 21 23:36:59 compute-0 sudo[181796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:36:59 compute-0 python3.9[181798]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:36:59 compute-0 systemd[1]: Reloading.
Jan 21 23:36:59 compute-0 systemd-rc-local-generator[181841]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:36:59 compute-0 systemd-sysv-generator[181846]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:37:00 compute-0 podman[181800]: 2026-01-21 23:36:59.99949856 +0000 UTC m=+0.098055109 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:37:00 compute-0 sudo[181796]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:00 compute-0 sudo[181933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnrrgfzeauzvlebgfyiyzewtlkdvuzvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038618.5941317-3371-246664137985275/AnsiballZ_systemd.py'
Jan 21 23:37:00 compute-0 sudo[181933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:01 compute-0 python3.9[181935]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:37:01 compute-0 systemd[1]: Reloading.
Jan 21 23:37:01 compute-0 systemd-sysv-generator[181964]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:37:01 compute-0 systemd-rc-local-generator[181959]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:37:01 compute-0 systemd[1]: Starting nova_compute container...
Jan 21 23:37:01 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:37:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fb6dd22f2e05b868bb41c5affbb188ce414cd0960e3efd1f66221689553a8c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fb6dd22f2e05b868bb41c5affbb188ce414cd0960e3efd1f66221689553a8c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fb6dd22f2e05b868bb41c5affbb188ce414cd0960e3efd1f66221689553a8c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fb6dd22f2e05b868bb41c5affbb188ce414cd0960e3efd1f66221689553a8c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fb6dd22f2e05b868bb41c5affbb188ce414cd0960e3efd1f66221689553a8c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:01 compute-0 podman[181974]: 2026-01-21 23:37:01.533273865 +0000 UTC m=+0.110238032 container init 7993ee99205093b4f1302d55d9d75ff9bf0d4afb00cc9b357f8dc805265d5033 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:37:01 compute-0 podman[181974]: 2026-01-21 23:37:01.545288692 +0000 UTC m=+0.122252809 container start 7993ee99205093b4f1302d55d9d75ff9bf0d4afb00cc9b357f8dc805265d5033 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 21 23:37:01 compute-0 podman[181974]: nova_compute
Jan 21 23:37:01 compute-0 nova_compute[181990]: + sudo -E kolla_set_configs
Jan 21 23:37:01 compute-0 systemd[1]: Started nova_compute container.
Jan 21 23:37:01 compute-0 sudo[181933]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Validating config file
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Copying service configuration files
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Deleting /etc/ceph
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Creating directory /etc/ceph
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /etc/ceph
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Writing out command to execute
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:01 compute-0 nova_compute[181990]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 23:37:01 compute-0 nova_compute[181990]: ++ cat /run_command
Jan 21 23:37:01 compute-0 nova_compute[181990]: + CMD=nova-compute
Jan 21 23:37:01 compute-0 nova_compute[181990]: + ARGS=
Jan 21 23:37:01 compute-0 nova_compute[181990]: + sudo kolla_copy_cacerts
Jan 21 23:37:01 compute-0 nova_compute[181990]: + [[ ! -n '' ]]
Jan 21 23:37:01 compute-0 nova_compute[181990]: + . kolla_extend_start
Jan 21 23:37:01 compute-0 nova_compute[181990]: Running command: 'nova-compute'
Jan 21 23:37:01 compute-0 nova_compute[181990]: + echo 'Running command: '\''nova-compute'\'''
Jan 21 23:37:01 compute-0 nova_compute[181990]: + umask 0022
Jan 21 23:37:01 compute-0 nova_compute[181990]: + exec nova-compute
Jan 21 23:37:03 compute-0 python3.9[182152]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:37:03.168 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:37:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:37:03.168 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:37:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:37:03.169 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:37:03 compute-0 nova_compute[181990]: 2026-01-21 23:37:03.895 181994 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:03 compute-0 nova_compute[181990]: 2026-01-21 23:37:03.896 181994 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:03 compute-0 nova_compute[181990]: 2026-01-21 23:37:03.896 181994 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:03 compute-0 nova_compute[181990]: 2026-01-21 23:37:03.897 181994 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 21 23:37:04 compute-0 nova_compute[181990]: 2026-01-21 23:37:04.082 181994 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:37:04 compute-0 nova_compute[181990]: 2026-01-21 23:37:04.109 181994 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:37:04 compute-0 nova_compute[181990]: 2026-01-21 23:37:04.110 181994 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 23:37:04 compute-0 python3.9[182306]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:04 compute-0 nova_compute[181990]: 2026-01-21 23:37:04.866 181994 INFO nova.virt.driver [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 21 23:37:04 compute-0 nova_compute[181990]: 2026-01-21 23:37:04.979 181994 INFO nova.compute.provider_config [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 21 23:37:04 compute-0 nova_compute[181990]: 2026-01-21 23:37:04.999 181994 DEBUG oslo_concurrency.lockutils [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:37:04 compute-0 nova_compute[181990]: 2026-01-21 23:37:04.999 181994 DEBUG oslo_concurrency.lockutils [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:04.999 181994 DEBUG oslo_concurrency.lockutils [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.000 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.000 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.000 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.000 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.000 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.000 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.001 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.001 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.001 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.001 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.001 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.001 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.001 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.002 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.002 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.002 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.002 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.002 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.002 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.002 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.003 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.003 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.003 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.003 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.003 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.003 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.004 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.004 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.004 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.004 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.004 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.004 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.004 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.005 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.005 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.005 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.005 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.005 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.005 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.006 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.006 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.006 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.006 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.006 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.006 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.007 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.007 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.007 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.007 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.007 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.007 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.008 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.008 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.008 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.008 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.008 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.008 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.009 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.009 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.009 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.009 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.009 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.009 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.009 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.010 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.010 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.010 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.010 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.010 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.010 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.010 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.011 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.011 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.011 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.011 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.011 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.011 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.011 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.012 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.012 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.012 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.012 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.012 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.012 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.012 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.013 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.013 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.013 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.013 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.013 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.013 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.013 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.014 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.014 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.014 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.014 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.014 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.015 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.015 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.015 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.015 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.016 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.016 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.016 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.016 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.016 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.016 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.016 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.017 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.017 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.017 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.017 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.017 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.017 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.017 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.018 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.018 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.018 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.018 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.018 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.018 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.019 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.019 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.019 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.019 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.019 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.019 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.019 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.020 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.020 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.020 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.020 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.020 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.020 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.020 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.021 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.021 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.021 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.021 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.021 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.021 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.021 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.022 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.022 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.022 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.022 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.022 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.022 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.022 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.023 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.023 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.023 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.023 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.023 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.023 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.023 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.024 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.024 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.024 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.024 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.024 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.024 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.025 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.025 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.025 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.025 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.025 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.025 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.025 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.026 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.026 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.026 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.026 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.026 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.026 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.026 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.027 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.027 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.027 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.027 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.027 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.027 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.028 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.028 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.028 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.028 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.028 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.028 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.028 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.029 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.029 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.029 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.029 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.029 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.029 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.029 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.030 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.030 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.030 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.030 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.030 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.030 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.031 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.031 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.031 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.031 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.031 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.031 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.031 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.032 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.032 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.032 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.032 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.032 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.032 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.032 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.033 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.033 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.033 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.033 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.033 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.033 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.033 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.034 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.034 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.034 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.034 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.034 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.034 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.034 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.035 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.035 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.035 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.035 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.035 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.035 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.036 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.036 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.036 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.036 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.036 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.037 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.037 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.037 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.037 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.037 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.038 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.038 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.038 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.038 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.038 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.038 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.038 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.039 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.039 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.039 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.039 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.039 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.039 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.040 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.040 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.040 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.040 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.040 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.040 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.041 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.041 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.041 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.041 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.041 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.042 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.042 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.042 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.042 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.042 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.042 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.043 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.043 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.043 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.043 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.043 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.043 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.044 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.044 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.044 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.044 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.044 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.045 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.045 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.045 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.045 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.045 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.045 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.046 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.046 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.046 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.046 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.046 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.046 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.046 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.047 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.047 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.047 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.047 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.047 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.047 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.048 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.048 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.048 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.048 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.048 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.048 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.049 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.049 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.049 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.049 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.049 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.049 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.049 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.049 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.050 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.050 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.050 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.050 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.050 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.051 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.051 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.051 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.051 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.051 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.051 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.052 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.052 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.052 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.052 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.052 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.053 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.053 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.053 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.053 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.053 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.054 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.054 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.054 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.054 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.054 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.054 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.055 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.055 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.055 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.055 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.055 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.056 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.056 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.056 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.057 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.057 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.057 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.057 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.057 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.058 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.058 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.058 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.058 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.058 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.059 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.059 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.059 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.059 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.059 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.060 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.060 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.060 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.060 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.060 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.060 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.061 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.061 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.061 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.061 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.061 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.062 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.062 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.062 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.062 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.062 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.062 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.063 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.063 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.063 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.063 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.064 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.064 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.064 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.064 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.064 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.065 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.065 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.065 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.065 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.065 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.065 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.066 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.066 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.066 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.066 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.066 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.066 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.067 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.067 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.067 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.067 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.067 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.067 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.068 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.068 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.068 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.068 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.069 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.069 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.069 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.069 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.069 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.069 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.070 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.070 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.070 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.070 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.070 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.071 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.071 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.071 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.071 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.071 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.072 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.072 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.072 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.072 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.072 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.072 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.073 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.073 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.073 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.073 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.073 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.073 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.073 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.074 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.074 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.074 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.074 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.074 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.075 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.075 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.075 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.075 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.075 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.076 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.076 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.076 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.076 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.076 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.077 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.077 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.077 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.077 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.077 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.078 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.078 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.078 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.078 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.078 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.079 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.079 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.079 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.079 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.079 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.080 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.080 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.080 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.080 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.080 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.081 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.081 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.081 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.081 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.082 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.082 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.082 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.082 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.082 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.083 181994 WARNING oslo_config.cfg [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 21 23:37:05 compute-0 nova_compute[181990]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 21 23:37:05 compute-0 nova_compute[181990]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 21 23:37:05 compute-0 nova_compute[181990]: and ``live_migration_inbound_addr`` respectively.
Jan 21 23:37:05 compute-0 nova_compute[181990]: ).  Its value may be silently ignored in the future.
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.083 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.083 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.083 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.083 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.084 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.084 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.084 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.084 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.085 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.085 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.085 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.085 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.085 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.086 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.086 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.086 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.086 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.086 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.087 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.087 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.087 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.087 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.088 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.088 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.088 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.088 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.088 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.089 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.089 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.089 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.089 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.090 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.090 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.090 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.090 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.090 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.090 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.090 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.091 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.091 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.091 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.091 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.091 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.092 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.092 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.092 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.092 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.092 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.092 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.093 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.093 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.093 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.093 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.093 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.093 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.093 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.094 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.094 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.094 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.094 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.094 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.094 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.095 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.095 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.095 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.095 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.095 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.095 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.096 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.096 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.096 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.096 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.096 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.096 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.096 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.097 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.097 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.097 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.097 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.097 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.097 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.097 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.098 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.098 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.098 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.098 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.098 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.098 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.099 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.099 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.099 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.099 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.099 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.099 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.099 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.100 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.100 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.100 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.100 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.100 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.100 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.100 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.101 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.101 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.101 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.101 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.101 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.101 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.101 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.102 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.102 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.102 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.102 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.102 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.102 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.102 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.103 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.103 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.103 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.103 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.103 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.103 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.103 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.104 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.104 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.104 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.104 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.104 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.104 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.105 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.105 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.105 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.105 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.105 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.105 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.106 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.106 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.106 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.106 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.106 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.107 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.107 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.107 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.107 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.107 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.107 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.108 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.108 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.108 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.108 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.108 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.109 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.109 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.109 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.109 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.109 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.110 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.110 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.110 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.110 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.110 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.111 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.111 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.111 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.111 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.111 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.112 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.112 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.112 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.112 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.112 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.112 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.112 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.113 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.113 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.113 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.113 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.113 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.113 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.114 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.114 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.114 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.114 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.114 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.114 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.114 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.115 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.115 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.115 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.115 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.115 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.115 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.115 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.116 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.116 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.116 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.116 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.116 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.116 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.117 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.117 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.117 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.117 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.117 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.117 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.117 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.117 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.118 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.118 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.118 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.118 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.118 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.118 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.118 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.119 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.119 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.119 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.119 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.119 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.119 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.119 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.120 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.120 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.120 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.120 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.120 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.120 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.120 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.121 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.121 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.121 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.121 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.121 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.121 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.121 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.121 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.122 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.122 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.122 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.122 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.122 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.122 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.122 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.123 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.123 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.123 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.123 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.123 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.123 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.124 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.124 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.124 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.124 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.124 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.124 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.125 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.125 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.125 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.125 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.125 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.126 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.126 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.126 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.126 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.126 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.126 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.126 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.127 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.127 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.127 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.127 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.127 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.127 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.128 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.128 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.128 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.128 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.128 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.128 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.128 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.129 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.129 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.129 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.129 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.129 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.129 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.129 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.130 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.130 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.130 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.130 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.130 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.130 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.131 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.131 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.131 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.131 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.131 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.131 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.131 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.132 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.132 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.132 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.132 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.132 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.132 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.133 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.133 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.133 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.133 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.133 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.133 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.133 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.133 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.134 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.134 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.134 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.134 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.134 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.134 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.134 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.135 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.135 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.135 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.135 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.135 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.135 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.135 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.136 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.136 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.136 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.136 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.136 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.136 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.137 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.137 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.137 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.137 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.137 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.137 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.137 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.138 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.138 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.138 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.138 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.138 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.139 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.139 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.139 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.139 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.139 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.139 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.139 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.139 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.140 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.140 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.140 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.140 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.140 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.140 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.140 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.141 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.141 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.141 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.141 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.141 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.141 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.142 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.142 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.142 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.142 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.142 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.142 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.142 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.142 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.143 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.143 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.143 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.143 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.143 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.143 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.143 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.144 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.144 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.144 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.144 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.144 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.145 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.145 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.145 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.145 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.145 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.146 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.146 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.146 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.146 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.146 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.146 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.147 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.147 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.147 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.147 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.147 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.147 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.147 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.148 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.148 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.148 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.148 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.148 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.148 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.148 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.148 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.149 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.149 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.149 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.149 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.149 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.149 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.149 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.150 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.150 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.150 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.150 181994 DEBUG oslo_service.service [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.151 181994 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.212 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.214 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.214 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.214 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 21 23:37:05 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 21 23:37:05 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.287 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f1ecc4bdd30> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.290 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f1ecc4bdd30> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.291 181994 INFO nova.virt.libvirt.driver [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Connection event '1' reason 'None'
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.364 181994 WARNING nova.virt.libvirt.driver [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 21 23:37:05 compute-0 nova_compute[181990]: 2026-01-21 23:37:05.365 181994 DEBUG nova.virt.libvirt.volume.mount [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 21 23:37:05 compute-0 python3.9[182478]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.110 181994 INFO nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Libvirt host capabilities <capabilities>
Jan 21 23:37:06 compute-0 nova_compute[181990]: 
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <host>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <uuid>cf9153dc-a08f-47ce-9ee5-55ef01f58da9</uuid>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <cpu>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <arch>x86_64</arch>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model>EPYC-Rome-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <vendor>AMD</vendor>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <microcode version='16777317'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <signature family='23' model='49' stepping='0'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='x2apic'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='tsc-deadline'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='osxsave'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='hypervisor'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='tsc_adjust'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='spec-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='stibp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='arch-capabilities'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='cmp_legacy'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='topoext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='virt-ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='lbrv'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='tsc-scale'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='vmcb-clean'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='pause-filter'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='pfthreshold'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='svme-addr-chk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='rdctl-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='skip-l1dfl-vmentry'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='mds-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature name='pschange-mc-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <pages unit='KiB' size='4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <pages unit='KiB' size='2048'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <pages unit='KiB' size='1048576'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </cpu>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <power_management>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <suspend_mem/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <suspend_disk/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <suspend_hybrid/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </power_management>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <iommu support='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <migration_features>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <live/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <uri_transports>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <uri_transport>tcp</uri_transport>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <uri_transport>rdma</uri_transport>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </uri_transports>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </migration_features>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <topology>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <cells num='1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <cell id='0'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:           <memory unit='KiB'>7864316</memory>
Jan 21 23:37:06 compute-0 nova_compute[181990]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 21 23:37:06 compute-0 nova_compute[181990]:           <pages unit='KiB' size='2048'>0</pages>
Jan 21 23:37:06 compute-0 nova_compute[181990]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 21 23:37:06 compute-0 nova_compute[181990]:           <distances>
Jan 21 23:37:06 compute-0 nova_compute[181990]:             <sibling id='0' value='10'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:           </distances>
Jan 21 23:37:06 compute-0 nova_compute[181990]:           <cpus num='8'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:           </cpus>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         </cell>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </cells>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </topology>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <cache>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </cache>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <secmodel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model>selinux</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <doi>0</doi>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </secmodel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <secmodel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model>dac</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <doi>0</doi>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </secmodel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </host>
Jan 21 23:37:06 compute-0 nova_compute[181990]: 
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <guest>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <os_type>hvm</os_type>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <arch name='i686'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <wordsize>32</wordsize>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <domain type='qemu'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <domain type='kvm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </arch>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <features>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <pae/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <nonpae/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <acpi default='on' toggle='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <apic default='on' toggle='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <cpuselection/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <deviceboot/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <disksnapshot default='on' toggle='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <externalSnapshot/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </features>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </guest>
Jan 21 23:37:06 compute-0 nova_compute[181990]: 
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <guest>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <os_type>hvm</os_type>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <arch name='x86_64'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <wordsize>64</wordsize>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <domain type='qemu'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <domain type='kvm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </arch>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <features>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <acpi default='on' toggle='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <apic default='on' toggle='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <cpuselection/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <deviceboot/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <disksnapshot default='on' toggle='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <externalSnapshot/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </features>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </guest>
Jan 21 23:37:06 compute-0 nova_compute[181990]: 
Jan 21 23:37:06 compute-0 nova_compute[181990]: </capabilities>
Jan 21 23:37:06 compute-0 nova_compute[181990]: 
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.118 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.136 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 21 23:37:06 compute-0 nova_compute[181990]: <domainCapabilities>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <domain>kvm</domain>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <arch>i686</arch>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <vcpu max='4096'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <iothreads supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <os supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <enum name='firmware'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <loader supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>rom</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pflash</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='readonly'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>yes</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>no</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='secure'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>no</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </loader>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </os>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <cpu>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>on</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>off</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='maximumMigratable'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>on</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>off</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <vendor>AMD</vendor>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='succor'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='custom' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='KnightsMill'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='athlon'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='athlon-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='core2duo'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='core2duo-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='coreduo'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='coreduo-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='n270'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='n270-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='phenom'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='phenom-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </cpu>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <memoryBacking supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <enum name='sourceType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>file</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>anonymous</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>memfd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </memoryBacking>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <devices>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <disk supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='diskDevice'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>disk</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>cdrom</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>floppy</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>lun</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='bus'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>fdc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>scsi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>sata</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </disk>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <graphics supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vnc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>egl-headless</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dbus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </graphics>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <video supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='modelType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vga</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>cirrus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>none</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>bochs</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>ramfb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </video>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <hostdev supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='mode'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>subsystem</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='startupPolicy'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>default</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>mandatory</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>requisite</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>optional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='subsysType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pci</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>scsi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='capsType'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='pciBackend'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </hostdev>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <rng supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>random</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>egd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>builtin</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </rng>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <filesystem supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='driverType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>path</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>handle</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtiofs</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </filesystem>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <tpm supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tpm-tis</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tpm-crb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>emulator</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>external</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendVersion'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>2.0</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </tpm>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <redirdev supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='bus'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </redirdev>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <channel supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pty</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>unix</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </channel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <crypto supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>qemu</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>builtin</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </crypto>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <interface supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>default</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>passt</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </interface>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <panic supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>isa</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>hyperv</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </panic>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <console supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>null</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pty</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dev</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>file</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pipe</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>stdio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>udp</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tcp</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>unix</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>qemu-vdagent</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dbus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </console>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </devices>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <features>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <gic supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <genid supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <backup supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <async-teardown supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <s390-pv supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <ps2 supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <tdx supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <sev supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <sgx supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <hyperv supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='features'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>relaxed</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vapic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>spinlocks</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vpindex</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>runtime</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>synic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>stimer</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>reset</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vendor_id</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>frequencies</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>reenlightenment</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tlbflush</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>ipi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>avic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>emsr_bitmap</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>xmm_input</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <defaults>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </defaults>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </hyperv>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <launchSecurity supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </features>
Jan 21 23:37:06 compute-0 nova_compute[181990]: </domainCapabilities>
Jan 21 23:37:06 compute-0 nova_compute[181990]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.144 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 21 23:37:06 compute-0 nova_compute[181990]: <domainCapabilities>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <domain>kvm</domain>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <arch>i686</arch>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <vcpu max='240'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <iothreads supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <os supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <enum name='firmware'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <loader supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>rom</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pflash</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='readonly'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>yes</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>no</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='secure'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>no</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </loader>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </os>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <cpu>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>on</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>off</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='maximumMigratable'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>on</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>off</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <vendor>AMD</vendor>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='succor'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='custom' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='KnightsMill'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='athlon'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='athlon-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='core2duo'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='core2duo-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='coreduo'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='coreduo-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='n270'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='n270-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='phenom'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='phenom-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </cpu>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <memoryBacking supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <enum name='sourceType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>file</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>anonymous</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>memfd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </memoryBacking>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <devices>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <disk supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='diskDevice'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>disk</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>cdrom</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>floppy</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>lun</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='bus'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>ide</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>fdc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>scsi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>sata</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </disk>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <graphics supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vnc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>egl-headless</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dbus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </graphics>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <video supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='modelType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vga</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>cirrus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>none</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>bochs</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>ramfb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </video>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <hostdev supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='mode'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>subsystem</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='startupPolicy'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>default</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>mandatory</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>requisite</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>optional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='subsysType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pci</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>scsi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='capsType'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='pciBackend'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </hostdev>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <rng supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>random</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>egd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>builtin</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </rng>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <filesystem supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='driverType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>path</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>handle</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtiofs</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </filesystem>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <tpm supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tpm-tis</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tpm-crb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>emulator</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>external</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendVersion'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>2.0</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </tpm>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <redirdev supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='bus'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </redirdev>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <channel supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pty</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>unix</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </channel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <crypto supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>qemu</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>builtin</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </crypto>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <interface supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>default</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>passt</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </interface>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <panic supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>isa</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>hyperv</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </panic>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <console supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>null</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pty</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dev</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>file</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pipe</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>stdio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>udp</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tcp</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>unix</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>qemu-vdagent</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dbus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </console>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </devices>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <features>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <gic supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <genid supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <backup supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <async-teardown supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <s390-pv supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <ps2 supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <tdx supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <sev supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <sgx supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <hyperv supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='features'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>relaxed</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vapic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>spinlocks</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vpindex</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>runtime</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>synic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>stimer</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>reset</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vendor_id</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>frequencies</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>reenlightenment</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tlbflush</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>ipi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>avic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>emsr_bitmap</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>xmm_input</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <defaults>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </defaults>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </hyperv>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <launchSecurity supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </features>
Jan 21 23:37:06 compute-0 nova_compute[181990]: </domainCapabilities>
Jan 21 23:37:06 compute-0 nova_compute[181990]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.197 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.202 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 21 23:37:06 compute-0 nova_compute[181990]: <domainCapabilities>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <domain>kvm</domain>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <arch>x86_64</arch>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <vcpu max='4096'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <iothreads supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <os supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <enum name='firmware'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>efi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <loader supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>rom</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pflash</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='readonly'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>yes</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>no</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='secure'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>yes</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>no</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </loader>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </os>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <cpu>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>on</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>off</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='maximumMigratable'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>on</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>off</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <vendor>AMD</vendor>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='succor'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='custom' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='KnightsMill'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='athlon'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='athlon-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='core2duo'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='core2duo-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='coreduo'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='coreduo-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='n270'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='n270-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='phenom'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='phenom-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </cpu>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <memoryBacking supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <enum name='sourceType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>file</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>anonymous</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>memfd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </memoryBacking>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <devices>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <disk supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='diskDevice'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>disk</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>cdrom</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>floppy</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>lun</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='bus'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>fdc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>scsi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>sata</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </disk>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <graphics supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vnc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>egl-headless</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dbus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </graphics>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <video supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='modelType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vga</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>cirrus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>none</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>bochs</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>ramfb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </video>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <hostdev supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='mode'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>subsystem</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='startupPolicy'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>default</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>mandatory</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>requisite</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>optional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='subsysType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pci</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>scsi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='capsType'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='pciBackend'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </hostdev>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <rng supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>random</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>egd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>builtin</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </rng>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <filesystem supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='driverType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>path</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>handle</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtiofs</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </filesystem>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <tpm supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tpm-tis</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tpm-crb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>emulator</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>external</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendVersion'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>2.0</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </tpm>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <redirdev supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='bus'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </redirdev>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <channel supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pty</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>unix</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </channel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <crypto supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>qemu</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>builtin</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </crypto>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <interface supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>default</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>passt</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </interface>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <panic supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>isa</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>hyperv</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </panic>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <console supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>null</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pty</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dev</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>file</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pipe</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>stdio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>udp</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tcp</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>unix</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>qemu-vdagent</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dbus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </console>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </devices>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <features>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <gic supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <genid supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <backup supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <async-teardown supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <s390-pv supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <ps2 supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <tdx supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <sev supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <sgx supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <hyperv supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='features'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>relaxed</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vapic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>spinlocks</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vpindex</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>runtime</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>synic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>stimer</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>reset</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vendor_id</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>frequencies</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>reenlightenment</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tlbflush</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>ipi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>avic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>emsr_bitmap</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>xmm_input</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <defaults>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </defaults>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </hyperv>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <launchSecurity supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </features>
Jan 21 23:37:06 compute-0 nova_compute[181990]: </domainCapabilities>
Jan 21 23:37:06 compute-0 nova_compute[181990]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.281 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 21 23:37:06 compute-0 nova_compute[181990]: <domainCapabilities>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <domain>kvm</domain>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <arch>x86_64</arch>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <vcpu max='240'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <iothreads supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <os supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <enum name='firmware'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <loader supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>rom</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pflash</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='readonly'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>yes</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>no</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='secure'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>no</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </loader>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </os>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <cpu>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>on</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>off</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='maximumMigratable'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>on</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>off</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <vendor>AMD</vendor>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='succor'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <mode name='custom' supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ddpd-u'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sha512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm3'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sm4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Denverton-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amd-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='auto-ibrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='perfmon-v2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbpb'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='stibp-always-on'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='EPYC-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-128'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-256'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx10-512'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='prefetchiti'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Haswell-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='KnightsMill'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512er'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512pf'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 sudo[182670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwokttztwnznamapuxyvbwgsqaohjcjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038625.95311-3551-7968881365833/AnsiballZ_podman_container.py'
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fma4'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tbm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xop'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 sudo[182670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='amx-tile'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-bf16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-fp16'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bitalg'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrc'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fzrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='la57'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='taa-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ifma'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cmpccxadd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fbsdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='fsrs'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ibrs-all'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='intel-psfd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='lam'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mcdt-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pbrsb-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='psdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='serialize'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vaes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='hle'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='rtm'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512bw'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512cd'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512dq'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512f'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='avx512vl'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='invpcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pcid'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='pku'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='mpx'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='core-capability'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='split-lock-detect'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='cldemote'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='erms'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='gfni'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdir64b'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='movdiri'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='xsaves'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='athlon'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='athlon-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='core2duo'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='core2duo-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='coreduo'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='coreduo-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='n270'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='n270-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='ss'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='phenom'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <blockers model='phenom-v1'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnow'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <feature name='3dnowext'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </blockers>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </mode>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </cpu>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <memoryBacking supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <enum name='sourceType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>file</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>anonymous</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <value>memfd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </memoryBacking>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <devices>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <disk supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='diskDevice'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>disk</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>cdrom</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>floppy</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>lun</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='bus'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>ide</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>fdc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>scsi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>sata</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </disk>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <graphics supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vnc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>egl-headless</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dbus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </graphics>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <video supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='modelType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vga</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>cirrus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>none</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>bochs</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>ramfb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </video>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <hostdev supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='mode'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>subsystem</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='startupPolicy'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>default</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>mandatory</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>requisite</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>optional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='subsysType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pci</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>scsi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='capsType'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='pciBackend'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </hostdev>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <rng supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtio-non-transitional</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>random</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>egd</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>builtin</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </rng>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <filesystem supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='driverType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>path</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>handle</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>virtiofs</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </filesystem>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <tpm supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tpm-tis</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tpm-crb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>emulator</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>external</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendVersion'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>2.0</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </tpm>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <redirdev supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='bus'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>usb</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </redirdev>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <channel supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pty</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>unix</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </channel>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <crypto supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>qemu</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendModel'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>builtin</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </crypto>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <interface supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='backendType'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>default</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>passt</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </interface>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <panic supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='model'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>isa</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>hyperv</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </panic>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <console supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='type'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>null</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vc</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pty</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dev</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>file</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>pipe</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>stdio</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>udp</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tcp</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>unix</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>qemu-vdagent</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>dbus</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </console>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </devices>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <features>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <gic supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <genid supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <backup supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <async-teardown supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <s390-pv supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <ps2 supported='yes'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <tdx supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <sev supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <sgx supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <hyperv supported='yes'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <enum name='features'>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>relaxed</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vapic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>spinlocks</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vpindex</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>runtime</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>synic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>stimer</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>reset</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>vendor_id</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>frequencies</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>reenlightenment</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>tlbflush</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>ipi</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>avic</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>emsr_bitmap</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <value>xmm_input</value>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </enum>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       <defaults>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:06 compute-0 nova_compute[181990]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:06 compute-0 nova_compute[181990]:       </defaults>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     </hyperv>
Jan 21 23:37:06 compute-0 nova_compute[181990]:     <launchSecurity supported='no'/>
Jan 21 23:37:06 compute-0 nova_compute[181990]:   </features>
Jan 21 23:37:06 compute-0 nova_compute[181990]: </domainCapabilities>
Jan 21 23:37:06 compute-0 nova_compute[181990]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.358 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.358 181994 INFO nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Secure Boot support detected
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.360 181994 INFO nova.virt.libvirt.driver [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.360 181994 INFO nova.virt.libvirt.driver [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.376 181994 DEBUG nova.virt.libvirt.driver [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] cpu compare xml: <cpu match="exact">
Jan 21 23:37:06 compute-0 nova_compute[181990]:   <model>Nehalem</model>
Jan 21 23:37:06 compute-0 nova_compute[181990]: </cpu>
Jan 21 23:37:06 compute-0 nova_compute[181990]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.380 181994 DEBUG nova.virt.libvirt.driver [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.463 181994 INFO nova.virt.node [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Determined node identity 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from /var/lib/nova/compute_id
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.507 181994 WARNING nova.compute.manager [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Compute nodes ['5f09a77c-505f-4bd3-ac26-41f43ebdf535'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.563 181994 INFO nova.compute.manager [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 21 23:37:06 compute-0 python3.9[182672]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.697 181994 WARNING nova.compute.manager [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.697 181994 DEBUG oslo_concurrency.lockutils [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.698 181994 DEBUG oslo_concurrency.lockutils [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.698 181994 DEBUG oslo_concurrency.lockutils [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.698 181994 DEBUG nova.compute.resource_tracker [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:37:06 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 21 23:37:06 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 21 23:37:06 compute-0 sudo[182670]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.977 181994 WARNING nova.virt.libvirt.driver [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.978 181994 DEBUG nova.compute.resource_tracker [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6169MB free_disk=73.5879135131836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.978 181994 DEBUG oslo_concurrency.lockutils [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:37:06 compute-0 nova_compute[181990]: 2026-01-21 23:37:06.978 181994 DEBUG oslo_concurrency.lockutils [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.001 181994 WARNING nova.compute.resource_tracker [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] No compute node record for compute-0.ctlplane.example.com:5f09a77c-505f-4bd3-ac26-41f43ebdf535: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 5f09a77c-505f-4bd3-ac26-41f43ebdf535 could not be found.
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.025 181994 INFO nova.compute.resource_tracker [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 5f09a77c-505f-4bd3-ac26-41f43ebdf535
Jan 21 23:37:07 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.098 181994 DEBUG nova.compute.resource_tracker [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.099 181994 DEBUG nova.compute.resource_tracker [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:37:07 compute-0 sudo[182869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crxhjnxxvuyzbcslkyudxuqgetmyvrvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038627.2912693-3575-166274939618584/AnsiballZ_systemd.py'
Jan 21 23:37:07 compute-0 sudo[182869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.665 181994 INFO nova.scheduler.client.report [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] [req-6b96f8a9-c873-453f-a2f0-e47594e4d546] Created resource provider record via placement API for resource provider with UUID 5f09a77c-505f-4bd3-ac26-41f43ebdf535 and name compute-0.ctlplane.example.com.
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.692 181994 DEBUG nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 21 23:37:07 compute-0 nova_compute[181990]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.692 181994 INFO nova.virt.libvirt.host [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] kernel doesn't support AMD SEV
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.693 181994 DEBUG nova.compute.provider_tree [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.693 181994 DEBUG nova.virt.libvirt.driver [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.695 181994 DEBUG nova.virt.libvirt.driver [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Libvirt baseline CPU <cpu>
Jan 21 23:37:07 compute-0 nova_compute[181990]:   <arch>x86_64</arch>
Jan 21 23:37:07 compute-0 nova_compute[181990]:   <model>Nehalem</model>
Jan 21 23:37:07 compute-0 nova_compute[181990]:   <vendor>AMD</vendor>
Jan 21 23:37:07 compute-0 nova_compute[181990]:   <topology sockets="8" cores="1" threads="1"/>
Jan 21 23:37:07 compute-0 nova_compute[181990]: </cpu>
Jan 21 23:37:07 compute-0 nova_compute[181990]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.760 181994 DEBUG nova.scheduler.client.report [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Updated inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.760 181994 DEBUG nova.compute.provider_tree [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Updating resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 23:37:07 compute-0 nova_compute[181990]: 2026-01-21 23:37:07.760 181994 DEBUG nova.compute.provider_tree [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:37:07 compute-0 python3.9[182871]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 21 23:37:08 compute-0 nova_compute[181990]: 2026-01-21 23:37:08.355 181994 DEBUG nova.compute.provider_tree [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Updating resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 23:37:08 compute-0 systemd[1]: Stopping nova_compute container...
Jan 21 23:37:08 compute-0 nova_compute[181990]: 2026-01-21 23:37:08.389 181994 DEBUG nova.compute.resource_tracker [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:37:08 compute-0 nova_compute[181990]: 2026-01-21 23:37:08.389 181994 DEBUG oslo_concurrency.lockutils [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:37:08 compute-0 nova_compute[181990]: 2026-01-21 23:37:08.389 181994 DEBUG nova.service [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 21 23:37:08 compute-0 nova_compute[181990]: 2026-01-21 23:37:08.522 181994 DEBUG nova.service [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 21 23:37:08 compute-0 nova_compute[181990]: 2026-01-21 23:37:08.523 181994 DEBUG nova.servicegroup.drivers.db [None req-36c5b623-1d7e-4d6a-b97f-8eddc1051f2d - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 21 23:37:08 compute-0 nova_compute[181990]: 2026-01-21 23:37:08.535 181994 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Jan 21 23:37:08 compute-0 nova_compute[181990]: 2026-01-21 23:37:08.537 181994 DEBUG oslo_concurrency.lockutils [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:37:08 compute-0 nova_compute[181990]: 2026-01-21 23:37:08.538 181994 DEBUG oslo_concurrency.lockutils [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:37:08 compute-0 nova_compute[181990]: 2026-01-21 23:37:08.538 181994 DEBUG oslo_concurrency.lockutils [None req-e72e8314-353f-4afc-99f8-cb44cd7b9ef3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:37:09 compute-0 virtqemud[182477]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 21 23:37:09 compute-0 virtqemud[182477]: hostname: compute-0
Jan 21 23:37:09 compute-0 virtqemud[182477]: End of file while reading data: Input/output error
Jan 21 23:37:09 compute-0 systemd[1]: libpod-7993ee99205093b4f1302d55d9d75ff9bf0d4afb00cc9b357f8dc805265d5033.scope: Deactivated successfully.
Jan 21 23:37:09 compute-0 systemd[1]: libpod-7993ee99205093b4f1302d55d9d75ff9bf0d4afb00cc9b357f8dc805265d5033.scope: Consumed 3.747s CPU time.
Jan 21 23:37:09 compute-0 podman[182875]: 2026-01-21 23:37:09.109185816 +0000 UTC m=+0.728566845 container died 7993ee99205093b4f1302d55d9d75ff9bf0d4afb00cc9b357f8dc805265d5033 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:37:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7993ee99205093b4f1302d55d9d75ff9bf0d4afb00cc9b357f8dc805265d5033-userdata-shm.mount: Deactivated successfully.
Jan 21 23:37:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-b7fb6dd22f2e05b868bb41c5affbb188ce414cd0960e3efd1f66221689553a8c-merged.mount: Deactivated successfully.
Jan 21 23:37:09 compute-0 podman[182875]: 2026-01-21 23:37:09.16936379 +0000 UTC m=+0.788744819 container cleanup 7993ee99205093b4f1302d55d9d75ff9bf0d4afb00cc9b357f8dc805265d5033 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:37:09 compute-0 podman[182875]: nova_compute
Jan 21 23:37:09 compute-0 podman[182906]: nova_compute
Jan 21 23:37:09 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 21 23:37:09 compute-0 systemd[1]: Stopped nova_compute container.
Jan 21 23:37:09 compute-0 systemd[1]: Starting nova_compute container...
Jan 21 23:37:09 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:37:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fb6dd22f2e05b868bb41c5affbb188ce414cd0960e3efd1f66221689553a8c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fb6dd22f2e05b868bb41c5affbb188ce414cd0960e3efd1f66221689553a8c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fb6dd22f2e05b868bb41c5affbb188ce414cd0960e3efd1f66221689553a8c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fb6dd22f2e05b868bb41c5affbb188ce414cd0960e3efd1f66221689553a8c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7fb6dd22f2e05b868bb41c5affbb188ce414cd0960e3efd1f66221689553a8c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:09 compute-0 podman[182919]: 2026-01-21 23:37:09.352971231 +0000 UTC m=+0.085349558 container init 7993ee99205093b4f1302d55d9d75ff9bf0d4afb00cc9b357f8dc805265d5033 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:37:09 compute-0 podman[182919]: 2026-01-21 23:37:09.362265255 +0000 UTC m=+0.094643572 container start 7993ee99205093b4f1302d55d9d75ff9bf0d4afb00cc9b357f8dc805265d5033 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 23:37:09 compute-0 podman[182919]: nova_compute
Jan 21 23:37:09 compute-0 nova_compute[182935]: + sudo -E kolla_set_configs
Jan 21 23:37:09 compute-0 systemd[1]: Started nova_compute container.
Jan 21 23:37:09 compute-0 sudo[182869]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Validating config file
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Copying service configuration files
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Deleting /etc/ceph
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Creating directory /etc/ceph
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /etc/ceph
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Writing out command to execute
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:09 compute-0 nova_compute[182935]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 21 23:37:09 compute-0 nova_compute[182935]: ++ cat /run_command
Jan 21 23:37:09 compute-0 nova_compute[182935]: + CMD=nova-compute
Jan 21 23:37:09 compute-0 nova_compute[182935]: + ARGS=
Jan 21 23:37:09 compute-0 nova_compute[182935]: + sudo kolla_copy_cacerts
Jan 21 23:37:09 compute-0 nova_compute[182935]: + [[ ! -n '' ]]
Jan 21 23:37:09 compute-0 nova_compute[182935]: + . kolla_extend_start
Jan 21 23:37:09 compute-0 nova_compute[182935]: + echo 'Running command: '\''nova-compute'\'''
Jan 21 23:37:09 compute-0 nova_compute[182935]: + umask 0022
Jan 21 23:37:09 compute-0 nova_compute[182935]: + exec nova-compute
Jan 21 23:37:09 compute-0 nova_compute[182935]: Running command: 'nova-compute'
Jan 21 23:37:10 compute-0 sudo[183096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgxnqldwawmpftatocvzyurazhyhtcem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038629.7845628-3602-280981294839707/AnsiballZ_podman_container.py'
Jan 21 23:37:10 compute-0 sudo[183096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:10 compute-0 python3.9[183098]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 21 23:37:10 compute-0 systemd[1]: Started libpod-conmon-57a2793b9507afbbae2f0fa7609bf86cfb9c78449891b23b28ed40274e2afb2d.scope.
Jan 21 23:37:10 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:37:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15eb8038dd3a5928db46305c6e12779f9cad7a47a58ad4b7a6a0884ad564a1fb/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15eb8038dd3a5928db46305c6e12779f9cad7a47a58ad4b7a6a0884ad564a1fb/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15eb8038dd3a5928db46305c6e12779f9cad7a47a58ad4b7a6a0884ad564a1fb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 21 23:37:10 compute-0 podman[183121]: 2026-01-21 23:37:10.574689537 +0000 UTC m=+0.144625000 container init 57a2793b9507afbbae2f0fa7609bf86cfb9c78449891b23b28ed40274e2afb2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.build-date=20251202, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:37:10 compute-0 podman[183121]: 2026-01-21 23:37:10.582422002 +0000 UTC m=+0.152357445 container start 57a2793b9507afbbae2f0fa7609bf86cfb9c78449891b23b28ed40274e2afb2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:37:10 compute-0 python3.9[183098]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Applying nova statedir ownership
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 21 23:37:10 compute-0 nova_compute_init[183142]: INFO:nova_statedir:Nova statedir ownership complete
Jan 21 23:37:10 compute-0 systemd[1]: libpod-57a2793b9507afbbae2f0fa7609bf86cfb9c78449891b23b28ed40274e2afb2d.scope: Deactivated successfully.
Jan 21 23:37:10 compute-0 podman[183143]: 2026-01-21 23:37:10.637709153 +0000 UTC m=+0.029080063 container died 57a2793b9507afbbae2f0fa7609bf86cfb9c78449891b23b28ed40274e2afb2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute_init, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 21 23:37:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57a2793b9507afbbae2f0fa7609bf86cfb9c78449891b23b28ed40274e2afb2d-userdata-shm.mount: Deactivated successfully.
Jan 21 23:37:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-15eb8038dd3a5928db46305c6e12779f9cad7a47a58ad4b7a6a0884ad564a1fb-merged.mount: Deactivated successfully.
Jan 21 23:37:10 compute-0 podman[183153]: 2026-01-21 23:37:10.703039567 +0000 UTC m=+0.053185959 container cleanup 57a2793b9507afbbae2f0fa7609bf86cfb9c78449891b23b28ed40274e2afb2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Jan 21 23:37:10 compute-0 systemd[1]: libpod-conmon-57a2793b9507afbbae2f0fa7609bf86cfb9c78449891b23b28ed40274e2afb2d.scope: Deactivated successfully.
Jan 21 23:37:10 compute-0 sudo[183096]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:11 compute-0 nova_compute[182935]: 2026-01-21 23:37:11.559 182939 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:11 compute-0 nova_compute[182935]: 2026-01-21 23:37:11.559 182939 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:11 compute-0 nova_compute[182935]: 2026-01-21 23:37:11.560 182939 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 21 23:37:11 compute-0 nova_compute[182935]: 2026-01-21 23:37:11.560 182939 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 21 23:37:11 compute-0 podman[183208]: 2026-01-21 23:37:11.728874382 +0000 UTC m=+0.092734874 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:37:11 compute-0 nova_compute[182935]: 2026-01-21 23:37:11.746 182939 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:37:11 compute-0 sshd-session[159794]: Connection closed by 192.168.122.30 port 41928
Jan 21 23:37:11 compute-0 sshd-session[159791]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:37:11 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Jan 21 23:37:11 compute-0 systemd[1]: session-23.scope: Consumed 1min 40.569s CPU time.
Jan 21 23:37:11 compute-0 systemd-logind[784]: Session 23 logged out. Waiting for processes to exit.
Jan 21 23:37:11 compute-0 nova_compute[182935]: 2026-01-21 23:37:11.780 182939 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:37:11 compute-0 nova_compute[182935]: 2026-01-21 23:37:11.781 182939 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 23:37:11 compute-0 systemd-logind[784]: Removed session 23.
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.283 182939 INFO nova.virt.driver [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.386 182939 INFO nova.compute.provider_config [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.406 182939 DEBUG oslo_concurrency.lockutils [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.407 182939 DEBUG oslo_concurrency.lockutils [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.407 182939 DEBUG oslo_concurrency.lockutils [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.408 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.408 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.408 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.409 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.409 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.409 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.409 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.410 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.410 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.410 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.410 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.411 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.411 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.411 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.412 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.412 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.412 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.413 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.413 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.413 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.414 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.414 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.414 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.415 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.415 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.415 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.416 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.416 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.416 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.416 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.417 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.417 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.417 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.417 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.418 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.418 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.418 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.418 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.419 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.419 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.419 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.420 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.420 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.420 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.420 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.421 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.421 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.421 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.421 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.422 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.422 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.422 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.422 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.423 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.423 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.423 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.424 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.424 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.424 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.424 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.425 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.425 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.425 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.425 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.426 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.426 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.426 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.426 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.427 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.427 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.427 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.427 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.428 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.428 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.428 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.429 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.429 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.429 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.430 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.430 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.430 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.431 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.431 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.431 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.431 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.432 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.432 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.432 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.432 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.433 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.433 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.433 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.433 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.434 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.434 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.434 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.434 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.434 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.435 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.435 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.435 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.435 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.436 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.436 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.436 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.436 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.437 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.437 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.437 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.438 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.438 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.438 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.438 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.439 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.439 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.439 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.439 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.440 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.440 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.440 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.440 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.441 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.441 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.441 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.442 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.442 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.442 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.443 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.444 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.444 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.444 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.444 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.445 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.445 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.445 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.445 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.445 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.446 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.446 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.446 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.446 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.446 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.446 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.447 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.447 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.447 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.447 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.447 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.447 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.448 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.448 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.448 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.448 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.448 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.449 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.449 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.449 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.449 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.449 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.449 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.449 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.450 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.450 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.450 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.450 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.450 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.450 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.450 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.451 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.451 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.451 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.451 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.451 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.451 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.452 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.452 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.452 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.452 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.452 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.452 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.452 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.453 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.453 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.453 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.453 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.453 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.453 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.453 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.454 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.454 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.454 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.454 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.454 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.454 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.455 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.455 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.455 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.455 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.455 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.455 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.455 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.456 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.456 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.456 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.456 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.456 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.456 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.456 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.456 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.457 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.457 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.457 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.457 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.457 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.457 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.458 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.458 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.458 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.458 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.458 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.458 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.458 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.459 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.459 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.459 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.459 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.459 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.459 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.460 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.460 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.460 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.460 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.460 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.460 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.460 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.461 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.461 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.461 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.461 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.461 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.461 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.461 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.462 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.462 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.462 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.462 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.462 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.462 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.462 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.463 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.463 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.463 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.463 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.463 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.463 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.463 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.464 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.464 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.464 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.464 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.464 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.464 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.464 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.465 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.465 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.465 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.465 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.465 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.465 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.466 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.466 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.466 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.466 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.466 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.466 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.466 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.467 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.467 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.467 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.467 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.467 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.467 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.467 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.468 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.468 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.468 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.468 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.468 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.468 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.468 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.469 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.469 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.469 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.469 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.469 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.469 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.469 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.470 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.470 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.470 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.470 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.470 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.470 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.470 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.471 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.471 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.471 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.471 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.471 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.471 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.471 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.472 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.472 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.472 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.472 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.472 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.472 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.472 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.473 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.473 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.473 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.473 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.473 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.473 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.473 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.474 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.474 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.474 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.474 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.474 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.474 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.474 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.475 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.475 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.475 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.475 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.475 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.475 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.476 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.476 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.476 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.476 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.476 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.476 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.476 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.476 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.477 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.477 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.477 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.477 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.477 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.477 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.477 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.478 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.478 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.478 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.478 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.478 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.479 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.479 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.479 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.479 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.479 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.479 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.479 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.480 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.480 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.480 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.480 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.480 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.480 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.480 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.481 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.481 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.481 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.481 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.481 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.481 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.481 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.482 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.482 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.482 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.482 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.482 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.482 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.482 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.483 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.483 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.483 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.483 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.483 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.483 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.484 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.484 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.484 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.484 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.484 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.484 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.484 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.485 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.485 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.485 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.485 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.485 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.485 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.485 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.486 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.486 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.486 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.486 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.486 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.486 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.487 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.487 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.487 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.487 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.487 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.487 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.487 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.488 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.488 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.488 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.488 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.488 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.488 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.489 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.489 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.489 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.489 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.489 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.489 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.490 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.490 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.490 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.490 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.490 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.490 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.490 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.491 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.491 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.491 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.491 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.491 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.491 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.491 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.492 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.492 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.492 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.492 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.492 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.492 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.493 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.493 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.493 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.493 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.493 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.493 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.494 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.494 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.494 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.494 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.494 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.494 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.495 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.495 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.495 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.495 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.495 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.495 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.496 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.496 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.496 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.496 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.496 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.496 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.497 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.497 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.497 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.497 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.497 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.497 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.498 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.498 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.498 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.498 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.498 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.498 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.499 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.499 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.499 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.499 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.500 182939 WARNING oslo_config.cfg [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 21 23:37:12 compute-0 nova_compute[182935]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 21 23:37:12 compute-0 nova_compute[182935]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 21 23:37:12 compute-0 nova_compute[182935]: and ``live_migration_inbound_addr`` respectively.
Jan 21 23:37:12 compute-0 nova_compute[182935]: ).  Its value may be silently ignored in the future.
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.500 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.500 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.500 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.500 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.501 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.501 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.501 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.501 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.501 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.501 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.502 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.502 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.502 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.502 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.502 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.502 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.503 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.503 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.503 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.503 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.503 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.503 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.504 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.504 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.504 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.504 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.504 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.504 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.505 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.505 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.505 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.505 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.505 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.505 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.506 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.506 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.506 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.506 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.506 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.506 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.507 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.507 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.507 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.507 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.507 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.507 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.508 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.508 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.508 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.508 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.508 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.509 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.509 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.509 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.509 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.509 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.509 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.510 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.510 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.510 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.510 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.510 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.510 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.511 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.511 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.511 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.511 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.511 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.512 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.512 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.512 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.512 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.512 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.512 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.513 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.513 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.513 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.513 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.513 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.514 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.514 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.514 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.514 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.514 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.514 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.515 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.515 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.515 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.515 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.515 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.516 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.516 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.516 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.516 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.516 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.517 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.517 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.517 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.517 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.517 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.517 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.518 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.518 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.518 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.518 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.518 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.518 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.518 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.519 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.519 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.519 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.519 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.519 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.519 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.519 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.520 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.520 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.520 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.520 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.520 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.520 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.520 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.521 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.521 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.521 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.521 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.521 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.521 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.521 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.522 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.522 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.522 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.522 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.522 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.522 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.522 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.523 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.523 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.523 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.523 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.523 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.523 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.524 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.524 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.524 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.524 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.524 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.524 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.524 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.525 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.525 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.525 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.525 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.525 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.525 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.525 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.526 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.526 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.526 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.526 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.526 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.526 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.526 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.527 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.527 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.527 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.527 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.527 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.527 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.527 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.528 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.528 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.528 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.528 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.528 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.528 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.528 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.529 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.529 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.529 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.529 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.529 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.529 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.529 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.530 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.530 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.530 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.530 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.530 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.530 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.530 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.531 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.531 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.531 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.531 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.531 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.531 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.532 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.532 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.532 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.532 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.532 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.532 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.532 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.533 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.533 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.533 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.533 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.533 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.533 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.533 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.534 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.534 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.534 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.538 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.538 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.538 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.538 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.538 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.538 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.540 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.540 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.541 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.541 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.541 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.541 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.541 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.542 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.542 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.542 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.542 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.542 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.542 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.543 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.543 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.543 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.543 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.543 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.543 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.544 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.544 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.544 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.544 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.544 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.544 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.545 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.545 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.545 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.545 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.545 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.546 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.546 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.546 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.546 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.546 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.546 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.547 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.547 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.547 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.547 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.547 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.547 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.548 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.548 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.548 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.548 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.548 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.548 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.548 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.549 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.549 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.549 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.549 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.549 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.549 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.550 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.550 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.550 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.550 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.550 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.551 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.551 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.551 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.551 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.551 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.551 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.552 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.552 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.552 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.552 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.552 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.552 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.553 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.553 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.553 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.553 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.553 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.554 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.554 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.554 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.554 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.554 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.554 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.555 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.555 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.555 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.555 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.555 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.555 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.556 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.556 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.556 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.556 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.557 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.557 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.557 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.557 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.557 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.558 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.558 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.558 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.558 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.558 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.558 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.559 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.559 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.559 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.559 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.559 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.559 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.559 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.560 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.560 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.560 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.560 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.560 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.560 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.561 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.561 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.561 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.561 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.561 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.561 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.562 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.562 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.562 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.562 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.562 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.563 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.563 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.563 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.563 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.563 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.564 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.564 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.564 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.564 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.564 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.564 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.564 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.565 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.565 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.565 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.565 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.565 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.565 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.565 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.566 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.566 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.566 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.566 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.567 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.567 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.567 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.567 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.567 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.568 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.568 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.568 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.568 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.568 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.569 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.569 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.569 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.569 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.569 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.569 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.570 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.570 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.570 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.570 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.570 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.571 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.571 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.571 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.571 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.571 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.571 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.572 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.572 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.572 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.572 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.572 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.572 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.572 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.573 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.573 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.573 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.573 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.573 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.573 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.573 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.573 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.574 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.574 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.574 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.574 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.574 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.574 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.574 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.575 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.575 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.575 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.575 182939 DEBUG oslo_service.service [None req-7793d959-5569-4d3c-8ca9-b80a4f3fbd55 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.577 182939 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.617 182939 INFO nova.virt.node [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Determined node identity 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from /var/lib/nova/compute_id
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.618 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.619 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.619 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.620 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.638 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f56f30fb3d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.643 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f56f30fb3d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.644 182939 INFO nova.virt.libvirt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Connection event '1' reason 'None'
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.655 182939 INFO nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Libvirt host capabilities <capabilities>
Jan 21 23:37:12 compute-0 nova_compute[182935]: 
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <host>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <uuid>cf9153dc-a08f-47ce-9ee5-55ef01f58da9</uuid>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <cpu>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <arch>x86_64</arch>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model>EPYC-Rome-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <vendor>AMD</vendor>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <microcode version='16777317'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <signature family='23' model='49' stepping='0'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='x2apic'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='tsc-deadline'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='osxsave'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='hypervisor'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='tsc_adjust'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='spec-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='stibp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='arch-capabilities'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='cmp_legacy'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='topoext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='virt-ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='lbrv'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='tsc-scale'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='vmcb-clean'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='pause-filter'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='pfthreshold'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='svme-addr-chk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='rdctl-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='skip-l1dfl-vmentry'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='mds-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature name='pschange-mc-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <pages unit='KiB' size='4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <pages unit='KiB' size='2048'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <pages unit='KiB' size='1048576'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </cpu>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <power_management>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <suspend_mem/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <suspend_disk/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <suspend_hybrid/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </power_management>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <iommu support='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <migration_features>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <live/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <uri_transports>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <uri_transport>tcp</uri_transport>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <uri_transport>rdma</uri_transport>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </uri_transports>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </migration_features>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <topology>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <cells num='1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <cell id='0'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:           <memory unit='KiB'>7864316</memory>
Jan 21 23:37:12 compute-0 nova_compute[182935]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 21 23:37:12 compute-0 nova_compute[182935]:           <pages unit='KiB' size='2048'>0</pages>
Jan 21 23:37:12 compute-0 nova_compute[182935]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 21 23:37:12 compute-0 nova_compute[182935]:           <distances>
Jan 21 23:37:12 compute-0 nova_compute[182935]:             <sibling id='0' value='10'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:           </distances>
Jan 21 23:37:12 compute-0 nova_compute[182935]:           <cpus num='8'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:           </cpus>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         </cell>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </cells>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </topology>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <cache>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </cache>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <secmodel>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model>selinux</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <doi>0</doi>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </secmodel>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <secmodel>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model>dac</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <doi>0</doi>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </secmodel>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </host>
Jan 21 23:37:12 compute-0 nova_compute[182935]: 
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <guest>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <os_type>hvm</os_type>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <arch name='i686'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <wordsize>32</wordsize>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <domain type='qemu'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <domain type='kvm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </arch>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <features>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <pae/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <nonpae/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <acpi default='on' toggle='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <apic default='on' toggle='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <cpuselection/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <deviceboot/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <disksnapshot default='on' toggle='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <externalSnapshot/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </features>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </guest>
Jan 21 23:37:12 compute-0 nova_compute[182935]: 
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <guest>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <os_type>hvm</os_type>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <arch name='x86_64'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <wordsize>64</wordsize>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <domain type='qemu'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <domain type='kvm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </arch>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <features>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <acpi default='on' toggle='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <apic default='on' toggle='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <cpuselection/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <deviceboot/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <disksnapshot default='on' toggle='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <externalSnapshot/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </features>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </guest>
Jan 21 23:37:12 compute-0 nova_compute[182935]: 
Jan 21 23:37:12 compute-0 nova_compute[182935]: </capabilities>
Jan 21 23:37:12 compute-0 nova_compute[182935]: 
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.669 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.673 182939 DEBUG nova.virt.libvirt.volume.mount [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.675 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 21 23:37:12 compute-0 nova_compute[182935]: <domainCapabilities>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <domain>kvm</domain>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <arch>i686</arch>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <vcpu max='240'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <iothreads supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <os supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <enum name='firmware'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <loader supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>rom</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pflash</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='readonly'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>yes</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>no</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='secure'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>no</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </loader>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </os>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <cpu>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>on</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>off</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='maximumMigratable'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>on</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>off</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <vendor>AMD</vendor>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='succor'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='custom' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ddpd-u'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sha512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm3'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ddpd-u'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sha512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm3'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbpb'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbpb'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-128'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-256'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-128'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-256'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='KnightsMill'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512er'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512pf'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512er'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512pf'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tbm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tbm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='athlon'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='athlon-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='core2duo'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='core2duo-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='coreduo'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='coreduo-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='n270'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='n270-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='phenom'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='phenom-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <memoryBacking supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <enum name='sourceType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>file</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>anonymous</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>memfd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </memoryBacking>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <disk supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='diskDevice'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>disk</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>cdrom</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>floppy</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>lun</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='bus'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>ide</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>fdc</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>scsi</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>sata</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-non-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <graphics supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vnc</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>egl-headless</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>dbus</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </graphics>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <video supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='modelType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vga</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>cirrus</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>none</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>bochs</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>ramfb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </video>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <hostdev supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='mode'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>subsystem</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='startupPolicy'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>default</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>mandatory</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>requisite</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>optional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='subsysType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pci</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>scsi</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='capsType'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='pciBackend'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </hostdev>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <rng supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-non-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>random</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>egd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>builtin</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <filesystem supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='driverType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>path</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>handle</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtiofs</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </filesystem>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <tpm supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tpm-tis</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tpm-crb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>emulator</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>external</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendVersion'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>2.0</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </tpm>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <redirdev supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='bus'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </redirdev>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <channel supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pty</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>unix</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </channel>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <crypto supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>qemu</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>builtin</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </crypto>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <interface supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>default</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>passt</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <panic supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>isa</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>hyperv</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </panic>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <console supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>null</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vc</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pty</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>dev</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>file</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pipe</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>stdio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>udp</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tcp</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>unix</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>qemu-vdagent</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>dbus</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </console>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <features>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <gic supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <genid supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <backup supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <async-teardown supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <s390-pv supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <ps2 supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <tdx supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <sev supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <sgx supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <hyperv supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='features'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>relaxed</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vapic</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>spinlocks</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vpindex</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>runtime</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>synic</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>stimer</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>reset</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vendor_id</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>frequencies</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>reenlightenment</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tlbflush</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>ipi</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>avic</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>emsr_bitmap</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>xmm_input</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <defaults>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </defaults>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </hyperv>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <launchSecurity supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </features>
Jan 21 23:37:12 compute-0 nova_compute[182935]: </domainCapabilities>
Jan 21 23:37:12 compute-0 nova_compute[182935]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.686 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 21 23:37:12 compute-0 nova_compute[182935]: <domainCapabilities>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <domain>kvm</domain>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <arch>i686</arch>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <vcpu max='4096'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <iothreads supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <os supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <enum name='firmware'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <loader supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>rom</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pflash</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='readonly'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>yes</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>no</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='secure'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>no</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </loader>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </os>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <cpu>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>on</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>off</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='maximumMigratable'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>on</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>off</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <vendor>AMD</vendor>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='succor'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='custom' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ddpd-u'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sha512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm3'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ddpd-u'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sha512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm3'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbpb'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbpb'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-128'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-256'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-128'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-256'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='KnightsMill'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512er'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512pf'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512er'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512pf'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tbm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tbm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='athlon'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='athlon-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='core2duo'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='core2duo-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='coreduo'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='coreduo-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='n270'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='n270-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='phenom'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='phenom-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <memoryBacking supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <enum name='sourceType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>file</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>anonymous</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>memfd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </memoryBacking>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <disk supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='diskDevice'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>disk</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>cdrom</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>floppy</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>lun</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='bus'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>fdc</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>scsi</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>sata</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-non-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <graphics supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vnc</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>egl-headless</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>dbus</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </graphics>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <video supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='modelType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vga</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>cirrus</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>none</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>bochs</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>ramfb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </video>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <hostdev supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='mode'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>subsystem</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='startupPolicy'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>default</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>mandatory</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>requisite</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>optional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='subsysType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pci</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>scsi</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='capsType'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='pciBackend'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </hostdev>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <rng supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-non-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>random</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>egd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>builtin</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <filesystem supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='driverType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>path</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>handle</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtiofs</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </filesystem>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <tpm supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tpm-tis</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tpm-crb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>emulator</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>external</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendVersion'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>2.0</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </tpm>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <redirdev supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='bus'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </redirdev>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <channel supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pty</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>unix</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </channel>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <crypto supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>qemu</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>builtin</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </crypto>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <interface supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>default</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>passt</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <panic supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>isa</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>hyperv</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </panic>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <console supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>null</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vc</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pty</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>dev</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>file</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pipe</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>stdio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>udp</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tcp</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>unix</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>qemu-vdagent</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>dbus</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </console>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <features>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <gic supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <genid supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <backup supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <async-teardown supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <s390-pv supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <ps2 supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <tdx supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <sev supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <sgx supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <hyperv supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='features'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>relaxed</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vapic</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>spinlocks</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vpindex</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>runtime</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>synic</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>stimer</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>reset</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vendor_id</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>frequencies</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>reenlightenment</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tlbflush</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>ipi</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>avic</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>emsr_bitmap</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>xmm_input</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <defaults>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </defaults>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </hyperv>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <launchSecurity supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </features>
Jan 21 23:37:12 compute-0 nova_compute[182935]: </domainCapabilities>
Jan 21 23:37:12 compute-0 nova_compute[182935]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.780 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.787 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 21 23:37:12 compute-0 nova_compute[182935]: <domainCapabilities>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <domain>kvm</domain>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <arch>x86_64</arch>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <vcpu max='4096'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <iothreads supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <os supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <enum name='firmware'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>efi</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <loader supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>rom</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pflash</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='readonly'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>yes</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>no</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='secure'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>yes</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>no</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </loader>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </os>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <cpu>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>on</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>off</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='maximumMigratable'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>on</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>off</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <vendor>AMD</vendor>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='succor'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='custom' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ddpd-u'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sha512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm3'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ddpd-u'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sha512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm3'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbpb'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbpb'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-128'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-256'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-128'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-256'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Haswell-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='KnightsMill'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512er'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512pf'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512er'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512pf'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tbm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tbm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='athlon'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='athlon-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='core2duo'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='core2duo-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='coreduo'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='coreduo-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='n270'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='n270-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='phenom'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='phenom-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <memoryBacking supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <enum name='sourceType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>file</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>anonymous</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>memfd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </memoryBacking>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <disk supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='diskDevice'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>disk</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>cdrom</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>floppy</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>lun</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='bus'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>fdc</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>scsi</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>sata</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-non-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <graphics supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vnc</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>egl-headless</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>dbus</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </graphics>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <video supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='modelType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vga</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>cirrus</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>none</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>bochs</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>ramfb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </video>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <hostdev supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='mode'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>subsystem</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='startupPolicy'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>default</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>mandatory</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>requisite</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>optional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='subsysType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pci</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>scsi</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='capsType'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='pciBackend'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </hostdev>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <rng supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtio-non-transitional</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>random</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>egd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>builtin</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <filesystem supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='driverType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>path</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>handle</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>virtiofs</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </filesystem>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <tpm supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tpm-tis</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tpm-crb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>emulator</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>external</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendVersion'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>2.0</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </tpm>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <redirdev supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='bus'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </redirdev>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <channel supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pty</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>unix</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </channel>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <crypto supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>qemu</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>builtin</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </crypto>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <interface supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='backendType'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>default</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>passt</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <panic supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>isa</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>hyperv</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </panic>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <console supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>null</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vc</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pty</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>dev</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>file</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pipe</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>stdio</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>udp</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tcp</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>unix</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>qemu-vdagent</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>dbus</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </console>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <features>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <gic supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <genid supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <backup supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <async-teardown supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <s390-pv supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <ps2 supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <tdx supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <sev supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <sgx supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <hyperv supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='features'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>relaxed</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vapic</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>spinlocks</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vpindex</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>runtime</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>synic</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>stimer</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>reset</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>vendor_id</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>frequencies</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>reenlightenment</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>tlbflush</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>ipi</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>avic</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>emsr_bitmap</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>xmm_input</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <defaults>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </defaults>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </hyperv>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <launchSecurity supported='no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </features>
Jan 21 23:37:12 compute-0 nova_compute[182935]: </domainCapabilities>
Jan 21 23:37:12 compute-0 nova_compute[182935]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:12 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.877 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 21 23:37:12 compute-0 nova_compute[182935]: <domainCapabilities>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <path>/usr/libexec/qemu-kvm</path>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <domain>kvm</domain>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <arch>x86_64</arch>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <vcpu max='240'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <iothreads supported='yes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <os supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <enum name='firmware'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <loader supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>rom</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>pflash</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='readonly'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>yes</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>no</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='secure'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>no</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </loader>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   </os>
Jan 21 23:37:12 compute-0 nova_compute[182935]:   <cpu>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='host-passthrough' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='hostPassthroughMigratable'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>on</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>off</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='maximum' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <enum name='maximumMigratable'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>on</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <value>off</value>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='host-model' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <vendor>AMD</vendor>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='x2apic'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc-deadline'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='hypervisor'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc_adjust'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='spec-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='stibp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='cmp_legacy'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='overflow-recov'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='succor'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='amd-ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='virt-ssbd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='lbrv'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='tsc-scale'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='vmcb-clean'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='flushbyasid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='pause-filter'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='pfthreshold'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='svme-addr-chk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <feature policy='disable' name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:12 compute-0 nova_compute[182935]:     <mode name='custom' supported='yes'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Broadwell-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cascadelake-Server-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='ClearwaterForest'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ddpd-u'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sha512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm3'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='ClearwaterForest-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bhi-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ddpd-u'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sha512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm3'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sm4'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Cooperlake-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Denverton-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='Dhyana-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Genoa-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Milan-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Rome-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Turin'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbpb'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-Turin-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amd-psfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='auto-ibrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vp2intersect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fs-gs-base-ns'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibpb-brtype'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='no-nested-data-bp'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='null-sel-clr-base'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='perfmon-v2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbpb'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='srso-user-kernel-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='stibp-always-on'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v4'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='EPYC-v5'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v1'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v2'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-128'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-256'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10-512'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 21 23:37:12 compute-0 nova_compute[182935]:       <blockers model='GraniteRapids-v3'>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-fp16'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:12 compute-0 nova_compute[182935]:         <feature name='avx10'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx10-128'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx10-256'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx10-512'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='prefetchiti'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Haswell'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Haswell-IBRS'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Haswell-noTSX'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Haswell-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Haswell-v2'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Haswell-v3'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Haswell-v4'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-noTSX'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v2'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v3'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v4'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v5'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v6'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Icelake-Server-v7'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='IvyBridge'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-IBRS'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='IvyBridge-v2'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='KnightsMill'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512er'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512pf'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='KnightsMill-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-4fmaps'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-4vnniw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512er'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512pf'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Opteron_G4'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Opteron_G4-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Opteron_G5'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='tbm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Opteron_G5-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fma4'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='tbm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xop'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v2'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v3'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='SapphireRapids-v4'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-bf16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-int8'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='amx-tile'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-bf16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-fp16'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512-vpopcntdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bitalg'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vbmi2'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrc'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fzrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='la57'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='taa-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='tsx-ldtrk'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xfd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='SierraForest'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v2'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='SierraForest-v3'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-ifma'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-ne-convert'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx-vnni-int8'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bhi-ctrl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='bus-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cmpccxadd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fbsdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='fsrs'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ibrs-all'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='intel-psfd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ipred-ctrl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='lam'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='mcdt-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pbrsb-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='psdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rrsba-ctrl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='sbdr-ssdp-no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='serialize'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vaes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='vpclmulqdq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-IBRS'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v2'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v3'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Client-v4'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-IBRS'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v2'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='hle'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='rtm'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v3'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v4'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Skylake-Server-v5'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512bw'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512cd'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512dq'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512f'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='avx512vl'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='invpcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pcid'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='pku'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Snowridge'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='mpx'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v2'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v3'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='core-capability'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='split-lock-detect'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='Snowridge-v4'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='cldemote'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='erms'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='gfni'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdir64b'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='movdiri'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='xsaves'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='athlon'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='athlon-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='core2duo'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='core2duo-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='coreduo'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='coreduo-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='n270'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='n270-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='ss'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='phenom'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <blockers model='phenom-v1'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='3dnow'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <feature name='3dnowext'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </blockers>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </mode>
Jan 21 23:37:13 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:37:13 compute-0 nova_compute[182935]:   <memoryBacking supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <enum name='sourceType'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <value>file</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <value>anonymous</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <value>memfd</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:   </memoryBacking>
Jan 21 23:37:13 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <disk supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='diskDevice'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>disk</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>cdrom</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>floppy</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>lun</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='bus'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>ide</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>fdc</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>scsi</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>sata</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>virtio-transitional</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>virtio-non-transitional</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <graphics supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>vnc</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>egl-headless</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>dbus</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </graphics>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <video supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='modelType'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>vga</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>cirrus</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>none</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>bochs</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>ramfb</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </video>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <hostdev supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='mode'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>subsystem</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='startupPolicy'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>default</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>mandatory</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>requisite</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>optional</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='subsysType'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>pci</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>scsi</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='capsType'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='pciBackend'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </hostdev>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <rng supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>virtio</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>virtio-transitional</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>virtio-non-transitional</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>random</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>egd</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>builtin</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <filesystem supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='driverType'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>path</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>handle</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>virtiofs</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </filesystem>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <tpm supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>tpm-tis</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>tpm-crb</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>emulator</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>external</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='backendVersion'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>2.0</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </tpm>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <redirdev supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='bus'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>usb</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </redirdev>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <channel supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>pty</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>unix</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </channel>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <crypto supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='model'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>qemu</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='backendModel'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>builtin</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </crypto>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <interface supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='backendType'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>default</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>passt</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <panic supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='model'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>isa</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>hyperv</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </panic>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <console supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='type'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>null</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>vc</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>pty</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>dev</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>file</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>pipe</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>stdio</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>udp</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>tcp</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>unix</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>qemu-vdagent</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>dbus</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </console>
Jan 21 23:37:13 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:37:13 compute-0 nova_compute[182935]:   <features>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <gic supported='no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <vmcoreinfo supported='yes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <genid supported='yes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <backingStoreInput supported='yes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <backup supported='yes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <async-teardown supported='yes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <s390-pv supported='no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <ps2 supported='yes'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <tdx supported='no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <sev supported='no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <sgx supported='no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <hyperv supported='yes'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <enum name='features'>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>relaxed</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>vapic</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>spinlocks</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>vpindex</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>runtime</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>synic</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>stimer</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>reset</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>vendor_id</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>frequencies</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>reenlightenment</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>tlbflush</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>ipi</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>avic</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>emsr_bitmap</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <value>xmm_input</value>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </enum>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       <defaults>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <spinlocks>4095</spinlocks>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <stimer_direct>on</stimer_direct>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <tlbflush_direct>on</tlbflush_direct>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <tlbflush_extended>on</tlbflush_extended>
Jan 21 23:37:13 compute-0 nova_compute[182935]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 21 23:37:13 compute-0 nova_compute[182935]:       </defaults>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     </hyperv>
Jan 21 23:37:13 compute-0 nova_compute[182935]:     <launchSecurity supported='no'/>
Jan 21 23:37:13 compute-0 nova_compute[182935]:   </features>
Jan 21 23:37:13 compute-0 nova_compute[182935]: </domainCapabilities>
Jan 21 23:37:13 compute-0 nova_compute[182935]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.963 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.963 182939 INFO nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Secure Boot support detected
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.967 182939 INFO nova.virt.libvirt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.980 182939 DEBUG nova.virt.libvirt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 21 23:37:13 compute-0 nova_compute[182935]:   <model>Nehalem</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]: </cpu>
Jan 21 23:37:13 compute-0 nova_compute[182935]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:12.983 182939 DEBUG nova.virt.libvirt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.019 182939 INFO nova.virt.node [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Determined node identity 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from /var/lib/nova/compute_id
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.052 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Verified node 5f09a77c-505f-4bd3-ac26-41f43ebdf535 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.129 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.265 182939 DEBUG oslo_concurrency.lockutils [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.266 182939 DEBUG oslo_concurrency.lockutils [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.266 182939 DEBUG oslo_concurrency.lockutils [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.266 182939 DEBUG nova.compute.resource_tracker [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.479 182939 WARNING nova.virt.libvirt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.481 182939 DEBUG nova.compute.resource_tracker [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6178MB free_disk=73.58822250366211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.482 182939 DEBUG oslo_concurrency.lockutils [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.482 182939 DEBUG oslo_concurrency.lockutils [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.686 182939 DEBUG nova.compute.resource_tracker [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.686 182939 DEBUG nova.compute.resource_tracker [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.708 182939 DEBUG nova.scheduler.client.report [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.788 182939 DEBUG nova.scheduler.client.report [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.788 182939 DEBUG nova.compute.provider_tree [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.816 182939 DEBUG nova.scheduler.client.report [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.839 182939 DEBUG nova.scheduler.client.report [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.874 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 21 23:37:13 compute-0 nova_compute[182935]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.875 182939 INFO nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] kernel doesn't support AMD SEV
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.876 182939 DEBUG nova.compute.provider_tree [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.876 182939 DEBUG nova.virt.libvirt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.879 182939 DEBUG nova.virt.libvirt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Libvirt baseline CPU <cpu>
Jan 21 23:37:13 compute-0 nova_compute[182935]:   <arch>x86_64</arch>
Jan 21 23:37:13 compute-0 nova_compute[182935]:   <model>Nehalem</model>
Jan 21 23:37:13 compute-0 nova_compute[182935]:   <vendor>AMD</vendor>
Jan 21 23:37:13 compute-0 nova_compute[182935]:   <topology sockets="8" cores="1" threads="1"/>
Jan 21 23:37:13 compute-0 nova_compute[182935]: </cpu>
Jan 21 23:37:13 compute-0 nova_compute[182935]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.932 182939 DEBUG nova.scheduler.client.report [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.994 182939 DEBUG nova.compute.resource_tracker [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.994 182939 DEBUG oslo_concurrency.lockutils [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.512s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:37:13 compute-0 nova_compute[182935]: 2026-01-21 23:37:13.995 182939 DEBUG nova.service [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 21 23:37:14 compute-0 nova_compute[182935]: 2026-01-21 23:37:14.059 182939 DEBUG nova.service [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 21 23:37:14 compute-0 nova_compute[182935]: 2026-01-21 23:37:14.060 182939 DEBUG nova.servicegroup.drivers.db [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 21 23:37:15 compute-0 sshd-session[183250]: Invalid user webmaster from 188.166.69.60 port 47334
Jan 21 23:37:16 compute-0 sshd-session[183250]: Connection closed by invalid user webmaster 188.166.69.60 port 47334 [preauth]
Jan 21 23:37:17 compute-0 sshd-session[183252]: Accepted publickey for zuul from 192.168.122.30 port 53020 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 21 23:37:17 compute-0 systemd-logind[784]: New session 25 of user zuul.
Jan 21 23:37:17 compute-0 systemd[1]: Started Session 25 of User zuul.
Jan 21 23:37:17 compute-0 sshd-session[183252]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 21 23:37:18 compute-0 python3.9[183405]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 21 23:37:19 compute-0 sudo[183559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjygzhnephheclmkxuueglxlnnhdkdew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038639.234361-68-132108813486702/AnsiballZ_systemd_service.py'
Jan 21 23:37:19 compute-0 sudo[183559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:20 compute-0 python3.9[183561]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:37:20 compute-0 systemd[1]: Reloading.
Jan 21 23:37:20 compute-0 systemd-rc-local-generator[183588]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:37:20 compute-0 systemd-sysv-generator[183593]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:37:20 compute-0 sudo[183559]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:21 compute-0 python3.9[183747]: ansible-ansible.builtin.service_facts Invoked
Jan 21 23:37:21 compute-0 network[183764]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 21 23:37:21 compute-0 network[183765]: 'network-scripts' will be removed from distribution in near future.
Jan 21 23:37:21 compute-0 network[183766]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 21 23:37:27 compute-0 sudo[184036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhgawfmviypkrzvyrrjgvmjjtglnkptr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038647.33631-125-173681180086381/AnsiballZ_systemd_service.py'
Jan 21 23:37:27 compute-0 sudo[184036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:27 compute-0 python3.9[184038]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:37:28 compute-0 sudo[184036]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:29 compute-0 sudo[184189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzojdrxdltmhnrtgfgfrneykztkmevbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038648.7356393-155-226079945359841/AnsiballZ_file.py'
Jan 21 23:37:29 compute-0 sudo[184189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:29 compute-0 python3.9[184191]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:29 compute-0 sudo[184189]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:29 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:37:29 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:37:30 compute-0 sudo[184342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypfendfxifxqgeclouekvdyspouthbzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038649.7580059-179-49616449183818/AnsiballZ_file.py'
Jan 21 23:37:30 compute-0 sudo[184342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:30 compute-0 python3.9[184344]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:30 compute-0 sudo[184342]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:30 compute-0 podman[184369]: 2026-01-21 23:37:30.743737054 +0000 UTC m=+0.111313361 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:37:31 compute-0 sudo[184520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfklsdrtghblvhyyrheunkpddxgtsqrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038650.8294983-206-22716992939894/AnsiballZ_command.py'
Jan 21 23:37:31 compute-0 sudo[184520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:31 compute-0 python3.9[184522]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:37:31 compute-0 sudo[184520]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:32 compute-0 python3.9[184674]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:37:33 compute-0 sudo[184824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpkmvazsdsrrgljzkicgdnblixvdcdfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038652.9552183-260-68079810921672/AnsiballZ_systemd_service.py'
Jan 21 23:37:33 compute-0 sudo[184824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:33 compute-0 python3.9[184826]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:37:33 compute-0 systemd[1]: Reloading.
Jan 21 23:37:33 compute-0 systemd-rc-local-generator[184854]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:37:33 compute-0 systemd-sysv-generator[184857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:37:33 compute-0 sudo[184824]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:34 compute-0 sudo[185011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxvbbqwjyfecbbriuzlpxszieckfmvgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038654.2531812-284-277761873352797/AnsiballZ_command.py'
Jan 21 23:37:34 compute-0 sudo[185011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:34 compute-0 python3.9[185013]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:37:34 compute-0 sudo[185011]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:35 compute-0 sudo[185164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khwryrbdlckcipeethzxyoaqqvjwzihb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038655.2415864-311-35310483662879/AnsiballZ_file.py'
Jan 21 23:37:35 compute-0 sudo[185164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:35 compute-0 python3.9[185166]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:37:35 compute-0 sudo[185164]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:36 compute-0 python3.9[185316]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:37 compute-0 sudo[185468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlspvkkpkucnhbaiqcfdlfnqogfqifkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038657.0270646-359-185853554885286/AnsiballZ_group.py'
Jan 21 23:37:37 compute-0 sudo[185468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:37 compute-0 python3.9[185470]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 21 23:37:37 compute-0 sudo[185468]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:38 compute-0 sudo[185620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghaynvfxisnaqaklerzhobsvjaaelkrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038658.3323512-392-72381715734191/AnsiballZ_getent.py'
Jan 21 23:37:38 compute-0 sudo[185620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:38 compute-0 python3.9[185622]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 21 23:37:38 compute-0 sudo[185620]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:39 compute-0 sudo[185773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afvuzigbiznmewadylsoibvxcnkgqnyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038659.3066998-416-197774219119194/AnsiballZ_group.py'
Jan 21 23:37:39 compute-0 sudo[185773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:39 compute-0 python3.9[185775]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 21 23:37:39 compute-0 groupadd[185776]: group added to /etc/group: name=ceilometer, GID=42405
Jan 21 23:37:39 compute-0 groupadd[185776]: group added to /etc/gshadow: name=ceilometer
Jan 21 23:37:39 compute-0 groupadd[185776]: new group: name=ceilometer, GID=42405
Jan 21 23:37:39 compute-0 sudo[185773]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:40 compute-0 nova_compute[182935]: 2026-01-21 23:37:40.061 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:37:40 compute-0 nova_compute[182935]: 2026-01-21 23:37:40.124 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:37:40 compute-0 sudo[185931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldjbihyuhvomnblhpgjvirylwfiewnvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038660.1984766-440-277199504535324/AnsiballZ_user.py'
Jan 21 23:37:40 compute-0 sudo[185931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:40 compute-0 python3.9[185933]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 21 23:37:40 compute-0 useradd[185935]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 21 23:37:40 compute-0 useradd[185935]: add 'ceilometer' to group 'libvirt'
Jan 21 23:37:40 compute-0 useradd[185935]: add 'ceilometer' to shadow group 'libvirt'
Jan 21 23:37:41 compute-0 sudo[185931]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:42 compute-0 podman[186065]: 2026-01-21 23:37:42.410989439 +0000 UTC m=+0.101283530 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:37:42 compute-0 python3.9[186104]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:43 compute-0 python3.9[186232]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769038662.0686378-518-50613816868742/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:43 compute-0 python3.9[186382]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:44 compute-0 python3.9[186503]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769038663.3825235-518-244322032207100/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:44 compute-0 python3.9[186653]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:45 compute-0 python3.9[186774]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769038664.4674222-518-197235755908308/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:46 compute-0 python3.9[186924]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:47 compute-0 python3.9[187076]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:48 compute-0 python3.9[187228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:48 compute-0 python3.9[187349]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038667.675706-695-152076575422544/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:37:49 compute-0 python3.9[187499]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:49 compute-0 python3.9[187620]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038668.936851-695-33073092068723/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:37:50 compute-0 python3.9[187770]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:51 compute-0 python3.9[187891]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038670.3373876-782-144350431915173/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:37:52 compute-0 python3.9[188041]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:52 compute-0 python3.9[188162]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038671.8744583-830-105476753926522/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:53 compute-0 python3.9[188312]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:54 compute-0 python3.9[188433]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038673.1397223-875-202969300476062/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:55 compute-0 python3.9[188583]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:37:55 compute-0 python3.9[188704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038674.634501-920-255619439651534/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:56 compute-0 sudo[188854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrdwcvwjwmciwemgyodwbhvgsvthmxan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038676.0388656-965-275304734266189/AnsiballZ_file.py'
Jan 21 23:37:56 compute-0 sudo[188854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:56 compute-0 python3.9[188856]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:56 compute-0 sudo[188854]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:57 compute-0 sudo[189006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxfihhvsvvsrxiorwzdchddpaqtrquwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038676.8479793-989-95807425291367/AnsiballZ_file.py'
Jan 21 23:37:57 compute-0 sudo[189006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:37:57 compute-0 python3.9[189008]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:37:57 compute-0 sudo[189006]: pam_unix(sudo:session): session closed for user root
Jan 21 23:37:58 compute-0 python3.9[189158]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:58 compute-0 sshd-session[189159]: Invalid user webmaster from 188.166.69.60 port 43836
Jan 21 23:37:58 compute-0 sshd-session[189159]: Connection closed by invalid user webmaster 188.166.69.60 port 43836 [preauth]
Jan 21 23:37:58 compute-0 python3.9[189312]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:37:59 compute-0 python3.9[189464]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:00 compute-0 sudo[189616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grdtnewwosvdkjksmqisepadrrzyvjoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038680.1707644-1085-215534577804526/AnsiballZ_file.py'
Jan 21 23:38:00 compute-0 sudo[189616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:00 compute-0 python3.9[189618]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:00 compute-0 sudo[189616]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:01 compute-0 sudo[189780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnmmujxooqpouyyzmhyvpdrjgynnysqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038681.0159233-1109-76289026441063/AnsiballZ_systemd_service.py'
Jan 21 23:38:01 compute-0 sudo[189780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:01 compute-0 podman[189742]: 2026-01-21 23:38:01.406089493 +0000 UTC m=+0.111312182 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:38:01 compute-0 python3.9[189787]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:38:01 compute-0 systemd[1]: Reloading.
Jan 21 23:38:01 compute-0 systemd-sysv-generator[189828]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:38:01 compute-0 systemd-rc-local-generator[189825]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:38:01 compute-0 systemd[1]: Listening on Podman API Socket.
Jan 21 23:38:02 compute-0 sudo[189780]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:02 compute-0 sudo[189984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szbsevnmqxpoglppmbieirsmeyqyckiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038682.6535268-1136-36915948508639/AnsiballZ_stat.py'
Jan 21 23:38:02 compute-0 sudo[189984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:03 compute-0 python3.9[189986]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:03 compute-0 sudo[189984]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:38:03.169 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:38:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:38:03.171 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:38:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:38:03.171 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:38:03 compute-0 sudo[190107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-magvueyyafnnnfxpukpijqrmkqirufvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038682.6535268-1136-36915948508639/AnsiballZ_copy.py'
Jan 21 23:38:03 compute-0 sudo[190107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:03 compute-0 python3.9[190109]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038682.6535268-1136-36915948508639/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:03 compute-0 sudo[190107]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:04 compute-0 sudo[190183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzuvvtvmwtoesobzmtxcqbozmbbltupi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038682.6535268-1136-36915948508639/AnsiballZ_stat.py'
Jan 21 23:38:04 compute-0 sudo[190183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:04 compute-0 python3.9[190185]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:04 compute-0 sudo[190183]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:04 compute-0 sudo[190306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grrbdovszlqndymdwueimexwjwzykdlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038682.6535268-1136-36915948508639/AnsiballZ_copy.py'
Jan 21 23:38:04 compute-0 sudo[190306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:04 compute-0 python3.9[190308]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038682.6535268-1136-36915948508639/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:05 compute-0 sudo[190306]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:06 compute-0 sudo[190458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsvzoejcnwbqnwqsqhvvtqygnpcrydpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038686.3732536-1232-30198832776894/AnsiballZ_file.py'
Jan 21 23:38:06 compute-0 sudo[190458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:06 compute-0 python3.9[190460]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:06 compute-0 sudo[190458]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:07 compute-0 sudo[190610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cumhxcgjwwjpiamnnvqdzqjibqmjcxwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038687.214243-1256-3760225762130/AnsiballZ_file.py'
Jan 21 23:38:07 compute-0 sudo[190610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:07 compute-0 python3.9[190612]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:07 compute-0 sudo[190610]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:08 compute-0 sudo[190762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcsfeguxwdacujuhigjfviuetwixwqin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038688.0158021-1280-246034443360867/AnsiballZ_stat.py'
Jan 21 23:38:08 compute-0 sudo[190762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:08 compute-0 python3.9[190764]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:08 compute-0 sudo[190762]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:09 compute-0 sudo[190885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcvmksacbmystwocwnmhxkekxftcwbwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038688.0158021-1280-246034443360867/AnsiballZ_copy.py'
Jan 21 23:38:09 compute-0 sudo[190885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:09 compute-0 python3.9[190887]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038688.0158021-1280-246034443360867/.source.json _original_basename=.7vqr546h follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:09 compute-0 sudo[190885]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:10 compute-0 python3.9[191037]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.813 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.814 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.815 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.815 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.815 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.816 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.816 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.817 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.818 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.851 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.852 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.852 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:38:11 compute-0 nova_compute[182935]: 2026-01-21 23:38:11.852 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:38:12 compute-0 nova_compute[182935]: 2026-01-21 23:38:12.059 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:38:12 compute-0 nova_compute[182935]: 2026-01-21 23:38:12.060 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6118MB free_disk=73.58622741699219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:38:12 compute-0 nova_compute[182935]: 2026-01-21 23:38:12.061 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:38:12 compute-0 nova_compute[182935]: 2026-01-21 23:38:12.061 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:38:12 compute-0 nova_compute[182935]: 2026-01-21 23:38:12.171 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:38:12 compute-0 nova_compute[182935]: 2026-01-21 23:38:12.171 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:38:12 compute-0 nova_compute[182935]: 2026-01-21 23:38:12.202 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:38:12 compute-0 nova_compute[182935]: 2026-01-21 23:38:12.221 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:38:12 compute-0 nova_compute[182935]: 2026-01-21 23:38:12.223 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:38:12 compute-0 nova_compute[182935]: 2026-01-21 23:38:12.223 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:38:12 compute-0 podman[191393]: 2026-01-21 23:38:12.737340059 +0000 UTC m=+0.095514459 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Jan 21 23:38:12 compute-0 sudo[191478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwjsncicmdjvmoaxzllsgygbqdccrypr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038692.2989204-1400-226605437134843/AnsiballZ_container_config_data.py'
Jan 21 23:38:12 compute-0 sudo[191478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:13 compute-0 python3.9[191480]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 21 23:38:13 compute-0 sudo[191478]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:14 compute-0 sudo[191630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnaswsxajdaiancgwbhhjdvnpsnhxjlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038693.571165-1433-25111366260918/AnsiballZ_container_config_hash.py'
Jan 21 23:38:14 compute-0 sudo[191630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:14 compute-0 python3.9[191632]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:38:14 compute-0 sudo[191630]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:15 compute-0 sudo[191782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzrlugpkgrlfhfamerrsjahtvarvbbzp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038694.7606668-1463-46073931098399/AnsiballZ_edpm_container_manage.py'
Jan 21 23:38:15 compute-0 sudo[191782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:15 compute-0 python3[191784]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:38:15 compute-0 podman[191822]: 2026-01-21 23:38:15.714360959 +0000 UTC m=+0.055921073 container create 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 21 23:38:15 compute-0 podman[191822]: 2026-01-21 23:38:15.682315391 +0000 UTC m=+0.023875485 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 21 23:38:15 compute-0 python3[191784]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Jan 21 23:38:15 compute-0 sudo[191782]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:16 compute-0 sshd-session[191938]: error: kex_exchange_identification: read: Connection reset by peer
Jan 21 23:38:16 compute-0 sshd-session[191938]: Connection reset by 176.120.22.52 port 32025
Jan 21 23:38:16 compute-0 sudo[192012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spfffoaxhtwqokrndwlbtfyffxxbjhnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038696.3379877-1487-127939971814292/AnsiballZ_stat.py'
Jan 21 23:38:16 compute-0 sudo[192012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:16 compute-0 python3.9[192014]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:16 compute-0 sudo[192012]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:17 compute-0 sudo[192166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwyamjyhkguywqbgdbhwfmkcbvdiofsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038697.319377-1514-93429644004822/AnsiballZ_file.py'
Jan 21 23:38:17 compute-0 sudo[192166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:17 compute-0 python3.9[192168]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:17 compute-0 sudo[192166]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:18 compute-0 sudo[192242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxxgtnglrlcfiltpqcndxuhlgwbtmwqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038697.319377-1514-93429644004822/AnsiballZ_stat.py'
Jan 21 23:38:18 compute-0 sudo[192242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:18 compute-0 python3.9[192244]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:18 compute-0 sudo[192242]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:18 compute-0 sudo[192393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtdbyivvaejbgbasjbsjptrxknyldypx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038698.4012117-1514-242864309176796/AnsiballZ_copy.py'
Jan 21 23:38:18 compute-0 sudo[192393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:19 compute-0 python3.9[192395]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038698.4012117-1514-242864309176796/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:19 compute-0 sudo[192393]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:19 compute-0 sudo[192469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvtdiuwbgwdpqtserdrlnxyfrrtiupvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038698.4012117-1514-242864309176796/AnsiballZ_systemd.py'
Jan 21 23:38:19 compute-0 sudo[192469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:20 compute-0 python3.9[192471]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:38:20 compute-0 systemd[1]: Reloading.
Jan 21 23:38:20 compute-0 systemd-rc-local-generator[192495]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:38:20 compute-0 systemd-sysv-generator[192498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:38:20 compute-0 sudo[192469]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:20 compute-0 sudo[192581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttjqorufdzpmcgrfevjzqmxxylxfeirs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038698.4012117-1514-242864309176796/AnsiballZ_systemd.py'
Jan 21 23:38:20 compute-0 sudo[192581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:21 compute-0 python3.9[192583]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:38:21 compute-0 systemd[1]: Reloading.
Jan 21 23:38:21 compute-0 systemd-rc-local-generator[192611]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:38:21 compute-0 systemd-sysv-generator[192614]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:38:21 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Jan 21 23:38:21 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:38:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7abeb03ce24354a4c2cdca0c4e1da72331ad534248572bdb3fb0925cf2025d61/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7abeb03ce24354a4c2cdca0c4e1da72331ad534248572bdb3fb0925cf2025d61/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7abeb03ce24354a4c2cdca0c4e1da72331ad534248572bdb3fb0925cf2025d61/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7abeb03ce24354a4c2cdca0c4e1da72331ad534248572bdb3fb0925cf2025d61/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:21 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d.
Jan 21 23:38:21 compute-0 podman[192623]: 2026-01-21 23:38:21.872353928 +0000 UTC m=+0.146350115 container init 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: + sudo -E kolla_set_configs
Jan 21 23:38:21 compute-0 podman[192623]: 2026-01-21 23:38:21.900529809 +0000 UTC m=+0.174525986 container start 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 23:38:21 compute-0 podman[192623]: ceilometer_agent_compute
Jan 21 23:38:21 compute-0 sudo[192644]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: sudo: unable to send audit message: Operation not permitted
Jan 21 23:38:21 compute-0 sudo[192644]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 21 23:38:21 compute-0 sudo[192644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 21 23:38:21 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Jan 21 23:38:21 compute-0 sudo[192581]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Validating config file
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Copying service configuration files
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: INFO:__main__:Writing out command to execute
Jan 21 23:38:21 compute-0 sudo[192644]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: ++ cat /run_command
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: + ARGS=
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: + sudo kolla_copy_cacerts
Jan 21 23:38:21 compute-0 sudo[192665]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Jan 21 23:38:21 compute-0 ceilometer_agent_compute[192638]: sudo: unable to send audit message: Operation not permitted
Jan 21 23:38:21 compute-0 sudo[192665]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 21 23:38:21 compute-0 sudo[192665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 21 23:38:22 compute-0 sudo[192665]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:22 compute-0 podman[192645]: 2026-01-21 23:38:22.005483602 +0000 UTC m=+0.082011393 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: + [[ ! -n '' ]]
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: + . kolla_extend_start
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: + umask 0022
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Jan 21 23:38:22 compute-0 systemd[1]: 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d-155ce864f9758a7f.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 23:38:22 compute-0 systemd[1]: 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d-155ce864f9758a7f.service: Failed with result 'exit-code'.
Jan 21 23:38:22 compute-0 python3.9[192819]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.991 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.991 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.991 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.991 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.991 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.992 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.992 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.992 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.992 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.992 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.992 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.992 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.992 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.992 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.992 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.992 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.993 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.994 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.994 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.994 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.994 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.994 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.994 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.994 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.994 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.994 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.994 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.994 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.995 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.995 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.995 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.995 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.995 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.995 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.995 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.995 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.995 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.995 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.996 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.996 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.996 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.996 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.996 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.996 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.996 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.996 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.996 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.996 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.997 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.997 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.997 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.997 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.997 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.997 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.997 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.997 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.997 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.997 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.998 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.998 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.998 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.998 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.998 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.998 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.998 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.998 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.998 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.998 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.999 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.999 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.999 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.999 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.999 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.999 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.999 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.999 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.999 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:22 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.999 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:22.999 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.000 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.000 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.000 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.000 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.000 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.000 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.000 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.000 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.000 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.000 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.001 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.001 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.001 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.001 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.001 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.001 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.001 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.001 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.001 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.002 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.002 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.002 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.002 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.002 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.002 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.002 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.002 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.002 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.002 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.002 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.003 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.003 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.003 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.003 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.003 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.003 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.003 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.003 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.003 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.003 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.004 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.005 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.005 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.005 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.005 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.005 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.005 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.005 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.005 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.005 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.005 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.006 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.007 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.028 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.030 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.031 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.130 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.210 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.211 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.211 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.211 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.212 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.212 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.212 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.212 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.213 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.213 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.213 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.213 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.213 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.214 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.214 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.214 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.215 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.215 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.215 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.215 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.215 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.216 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.216 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.216 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.216 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.216 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.217 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.217 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.217 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.217 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.217 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.218 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.218 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.218 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.218 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.219 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.219 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.219 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.219 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.220 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.220 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.220 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.220 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.221 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.221 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.221 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.221 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.221 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.222 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.222 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.222 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.222 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.222 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.223 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.223 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.223 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.223 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.223 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.224 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.224 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.224 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.224 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.224 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.225 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.225 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.225 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.225 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.225 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.226 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.226 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.226 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.226 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.227 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.227 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.227 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.227 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.228 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.228 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.228 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.228 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.228 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.229 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.229 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.229 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.229 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.229 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.230 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.230 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.230 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.230 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.230 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.231 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.231 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.231 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.231 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.231 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.232 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.232 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.232 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.232 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.232 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.232 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.233 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.233 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.233 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.233 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.234 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.234 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.234 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.234 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.234 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.235 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.235 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.235 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.235 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.235 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.236 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.236 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.236 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.236 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.236 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.237 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.237 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.237 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.237 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.237 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.238 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.238 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.238 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.238 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.239 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.239 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.239 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.239 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.239 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.240 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.240 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.240 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.240 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.240 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.240 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.241 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.241 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.241 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.241 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.241 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.242 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.242 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.242 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.242 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.242 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.242 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.244 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.244 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.246 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.246 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.246 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.246 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.247 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.247 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.247 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.247 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.247 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.247 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.248 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.248 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.248 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.248 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.250 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.250 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.250 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.250 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.250 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.254 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.254 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.254 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.254 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.254 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.254 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.255 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.255 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.255 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.259 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.268 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:38:23.279 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:38:24 compute-0 sudo[192975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwejxyfapaqwcfohfhjucrjigzbbcqtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038703.7423003-1649-252186319852743/AnsiballZ_stat.py'
Jan 21 23:38:24 compute-0 sudo[192975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:24 compute-0 python3.9[192977]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:24 compute-0 sudo[192975]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:24 compute-0 sudo[193100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxuevsknozdsakwjqndbegjrnsffqgdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038703.7423003-1649-252186319852743/AnsiballZ_copy.py'
Jan 21 23:38:24 compute-0 sudo[193100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:24 compute-0 python3.9[193102]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038703.7423003-1649-252186319852743/.source.yaml _original_basename=.g0dajw8t follow=False checksum=9afa6966295a519b2180701325b87786c9fac371 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:24 compute-0 sudo[193100]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:25 compute-0 sudo[193252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwcsfzjtaegbaeewxxlkwqqbyjvvcqpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038705.3225784-1694-215509225318163/AnsiballZ_stat.py'
Jan 21 23:38:25 compute-0 sudo[193252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:25 compute-0 python3.9[193254]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:25 compute-0 sudo[193252]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:26 compute-0 sudo[193375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alwytlokbnddkozfwlxxfllvpuqfnzoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038705.3225784-1694-215509225318163/AnsiballZ_copy.py'
Jan 21 23:38:26 compute-0 sudo[193375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:26 compute-0 python3.9[193377]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038705.3225784-1694-215509225318163/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:26 compute-0 sudo[193375]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:27 compute-0 sudo[193527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jooyndfpkzukgrnmssbzzcfxaibfzlag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038707.3777966-1757-36277962529025/AnsiballZ_file.py'
Jan 21 23:38:27 compute-0 sudo[193527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:27 compute-0 python3.9[193529]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:27 compute-0 sudo[193527]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:28 compute-0 sudo[193679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqisoharsbkihatxezhzurgwmwkekvav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038708.2233932-1781-176376589993186/AnsiballZ_file.py'
Jan 21 23:38:28 compute-0 sudo[193679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:28 compute-0 python3.9[193681]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:28 compute-0 sudo[193679]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:29 compute-0 sudo[193831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxllmwwefxyayozvddnzphxooqkuenpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038709.1105683-1805-3048013431259/AnsiballZ_stat.py'
Jan 21 23:38:29 compute-0 sudo[193831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:29 compute-0 python3.9[193833]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:29 compute-0 sudo[193831]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:29 compute-0 sudo[193909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akcihjuyqefrcdfvirfqhdfhrprxmahm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038709.1105683-1805-3048013431259/AnsiballZ_file.py'
Jan 21 23:38:29 compute-0 sudo[193909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:30 compute-0 python3.9[193911]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.qynz7ao9 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:30 compute-0 sudo[193909]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:30 compute-0 python3.9[194061]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:31 compute-0 podman[194086]: 2026-01-21 23:38:31.736211701 +0000 UTC m=+0.103560530 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:38:34 compute-0 sudo[194509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiulpwoiknbwbpqhiiuqgqrdixpdmjza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038713.7601135-1916-255834603867077/AnsiballZ_container_config_data.py'
Jan 21 23:38:34 compute-0 sudo[194509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:34 compute-0 python3.9[194511]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 21 23:38:34 compute-0 sudo[194509]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:35 compute-0 sudo[194661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stwyqqgtocqakgkxjewivtzsdtesttjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038714.9111793-1949-19276765184430/AnsiballZ_container_config_hash.py'
Jan 21 23:38:35 compute-0 sudo[194661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:35 compute-0 python3.9[194663]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:38:35 compute-0 sudo[194661]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:36 compute-0 sudo[194813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lofoztxraanjzcaqjptvlefvzdjvsqvm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038715.946319-1979-252657974966741/AnsiballZ_edpm_container_manage.py'
Jan 21 23:38:36 compute-0 sudo[194813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:36 compute-0 python3[194815]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:38:36 compute-0 podman[194852]: 2026-01-21 23:38:36.668687431 +0000 UTC m=+0.046191511 container create 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter)
Jan 21 23:38:36 compute-0 podman[194852]: 2026-01-21 23:38:36.645259098 +0000 UTC m=+0.022763208 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 21 23:38:36 compute-0 python3[194815]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 21 23:38:36 compute-0 sudo[194813]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:37 compute-0 sudo[195042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydbxdrazritergqqclvclmofqnyqskuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038717.2974396-2003-87581697851580/AnsiballZ_stat.py'
Jan 21 23:38:37 compute-0 sudo[195042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:37 compute-0 python3.9[195044]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:37 compute-0 sudo[195042]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:37 compute-0 sshd-session[194916]: Received disconnect from 91.224.92.190 port 45514:11:  [preauth]
Jan 21 23:38:37 compute-0 sshd-session[194916]: Disconnected from authenticating user root 91.224.92.190 port 45514 [preauth]
Jan 21 23:38:38 compute-0 sudo[195196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twelzdaqjrwzejxfcjcqchaqcxvvecpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038718.1444666-2030-258817331757594/AnsiballZ_file.py'
Jan 21 23:38:38 compute-0 sudo[195196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:38 compute-0 python3.9[195198]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:38 compute-0 sudo[195196]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:38 compute-0 sudo[195272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-limxyvwswenrpjtdhisuhwnnjeewfrjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038718.1444666-2030-258817331757594/AnsiballZ_stat.py'
Jan 21 23:38:38 compute-0 sudo[195272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:39 compute-0 python3.9[195274]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:39 compute-0 sudo[195272]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:39 compute-0 sudo[195423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxfxvxezdtjnwlmgwumljnkwgbuusuyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038719.1306984-2030-166271207113899/AnsiballZ_copy.py'
Jan 21 23:38:39 compute-0 sudo[195423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:39 compute-0 python3.9[195425]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038719.1306984-2030-166271207113899/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:39 compute-0 sudo[195423]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:39 compute-0 sudo[195499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fssyeuyqajdhjjmefpaoepyenhhfwmxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038719.1306984-2030-166271207113899/AnsiballZ_systemd.py'
Jan 21 23:38:39 compute-0 sudo[195499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:40 compute-0 python3.9[195501]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:38:40 compute-0 systemd[1]: Reloading.
Jan 21 23:38:40 compute-0 systemd-sysv-generator[195531]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:38:40 compute-0 systemd-rc-local-generator[195527]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:38:40 compute-0 sudo[195499]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:40 compute-0 sudo[195611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yivqxlqeokwcanhakurolrzyfccbwlee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038719.1306984-2030-166271207113899/AnsiballZ_systemd.py'
Jan 21 23:38:40 compute-0 sudo[195611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:41 compute-0 python3.9[195613]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:38:41 compute-0 systemd[1]: Reloading.
Jan 21 23:38:41 compute-0 systemd-rc-local-generator[195644]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:38:41 compute-0 systemd-sysv-generator[195647]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:38:41 compute-0 systemd[1]: Starting node_exporter container...
Jan 21 23:38:41 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:38:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d75ab12c1b3be660741a2ce8c42276aff2ded2d9ad8eb1dade58c86506aec0e/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d75ab12c1b3be660741a2ce8c42276aff2ded2d9ad8eb1dade58c86506aec0e/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 23:38:41 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5.
Jan 21 23:38:41 compute-0 podman[195653]: 2026-01-21 23:38:41.929562016 +0000 UTC m=+0.344720322 container init 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.943Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.943Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.944Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.944Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.944Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.944Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.944Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.944Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.944Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=arp
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=bcache
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=bonding
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=cpu
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=edac
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=filefd
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=netclass
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=netdev
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=netstat
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=nfs
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=nvme
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=softnet
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=systemd
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=xfs
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.945Z caller=node_exporter.go:117 level=info collector=zfs
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.946Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 21 23:38:41 compute-0 node_exporter[195668]: ts=2026-01-21T23:38:41.946Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 21 23:38:41 compute-0 podman[195653]: 2026-01-21 23:38:41.957391029 +0000 UTC m=+0.372549305 container start 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 23:38:41 compute-0 podman[195653]: node_exporter
Jan 21 23:38:41 compute-0 systemd[1]: Started node_exporter container.
Jan 21 23:38:42 compute-0 sudo[195611]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:42 compute-0 podman[195679]: 2026-01-21 23:38:42.030584051 +0000 UTC m=+0.060266991 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:38:42 compute-0 sshd-session[195671]: Invalid user nagios from 188.166.69.60 port 49116
Jan 21 23:38:42 compute-0 sshd-session[195671]: Connection closed by invalid user nagios 188.166.69.60 port 49116 [preauth]
Jan 21 23:38:42 compute-0 podman[195827]: 2026-01-21 23:38:42.846836731 +0000 UTC m=+0.057375740 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:38:43 compute-0 python3.9[195872]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:38:44 compute-0 sudo[196022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdggesaxdwkqatlezysnaelzeuujhwag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038723.976331-2165-223786694460264/AnsiballZ_stat.py'
Jan 21 23:38:44 compute-0 sudo[196022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:44 compute-0 python3.9[196024]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:44 compute-0 sudo[196022]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:44 compute-0 sudo[196147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqkvetxslpqcewzsppqzpcrykefvvdol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038723.976331-2165-223786694460264/AnsiballZ_copy.py'
Jan 21 23:38:44 compute-0 sudo[196147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:45 compute-0 python3.9[196149]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038723.976331-2165-223786694460264/.source.yaml _original_basename=.5_kevatw follow=False checksum=f7f2a63c4b6d9ab32b6599b5ceeeebe015d9558b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:45 compute-0 sudo[196147]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:45 compute-0 sudo[196299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsjhzpzujibdhiztuvzqaoberlshhbml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038725.4501193-2210-86097577567677/AnsiballZ_stat.py'
Jan 21 23:38:45 compute-0 sudo[196299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:45 compute-0 python3.9[196301]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:45 compute-0 sudo[196299]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:46 compute-0 sudo[196422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwjlukxvnmtqexpvpfhagaiivrgkdivi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038725.4501193-2210-86097577567677/AnsiballZ_copy.py'
Jan 21 23:38:46 compute-0 sudo[196422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:46 compute-0 python3.9[196424]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038725.4501193-2210-86097577567677/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:46 compute-0 sudo[196422]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:47 compute-0 sudo[196574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffgjgqdugunrfvdvvbymqzewrxkirbxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038727.5664973-2273-183746895940813/AnsiballZ_file.py'
Jan 21 23:38:47 compute-0 sudo[196574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:48 compute-0 python3.9[196576]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:48 compute-0 sudo[196574]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:48 compute-0 sudo[196726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiskowtqwaejjfcabvcvocqvwdnltuab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038728.3436167-2297-221077994248801/AnsiballZ_file.py'
Jan 21 23:38:48 compute-0 sudo[196726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:48 compute-0 python3.9[196728]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:38:48 compute-0 sudo[196726]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:49 compute-0 sudo[196878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvdwmwhttmgwckfdyyruwvxxpefhlrmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038729.1009703-2321-83193089708082/AnsiballZ_stat.py'
Jan 21 23:38:49 compute-0 sudo[196878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:49 compute-0 python3.9[196880]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:38:49 compute-0 sudo[196878]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:49 compute-0 sudo[196956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpbxeamwbqtetsscdjjpwnashjhuqpoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038729.1009703-2321-83193089708082/AnsiballZ_file.py'
Jan 21 23:38:49 compute-0 sudo[196956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:50 compute-0 python3.9[196958]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.sflpjde4 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:50 compute-0 sudo[196956]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:50 compute-0 python3.9[197108]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:52 compute-0 podman[197353]: 2026-01-21 23:38:52.248599051 +0000 UTC m=+0.071015800 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 23:38:52 compute-0 systemd[1]: 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d-155ce864f9758a7f.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 23:38:52 compute-0 systemd[1]: 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d-155ce864f9758a7f.service: Failed with result 'exit-code'.
Jan 21 23:38:53 compute-0 sudo[197548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hubobtnioopihfyjjpioclkrpssbsrgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038732.91919-2432-135868580943130/AnsiballZ_container_config_data.py'
Jan 21 23:38:53 compute-0 sudo[197548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:53 compute-0 python3.9[197550]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 21 23:38:53 compute-0 sudo[197548]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:54 compute-0 sudo[197700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnaxerxmpjchytljptgtawcgtfivypab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038734.1191728-2465-68124119151413/AnsiballZ_container_config_hash.py'
Jan 21 23:38:54 compute-0 sudo[197700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:54 compute-0 python3.9[197702]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:38:54 compute-0 sudo[197700]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:55 compute-0 sudo[197852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxltepgbrrgvbowusbbchylsgzoxgjtv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038735.216026-2495-79492386807664/AnsiballZ_edpm_container_manage.py'
Jan 21 23:38:55 compute-0 sudo[197852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:55 compute-0 python3[197854]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:38:57 compute-0 podman[197866]: 2026-01-21 23:38:57.232048619 +0000 UTC m=+1.355083356 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 21 23:38:57 compute-0 podman[197964]: 2026-01-21 23:38:57.383012147 +0000 UTC m=+0.057948303 container create ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:38:57 compute-0 podman[197964]: 2026-01-21 23:38:57.358100767 +0000 UTC m=+0.033036943 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 21 23:38:57 compute-0 python3[197854]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 21 23:38:57 compute-0 sudo[197852]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:58 compute-0 sudo[198153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usosfdsfohjrtioaeqfyvluwlsfqegqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038737.916247-2519-78887751422390/AnsiballZ_stat.py'
Jan 21 23:38:58 compute-0 sudo[198153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:58 compute-0 python3.9[198155]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:58 compute-0 sudo[198153]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:59 compute-0 sudo[198307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzthxtkowoojlzdkdrfmwsxqxzszyzlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038738.820374-2546-77097713386403/AnsiballZ_file.py'
Jan 21 23:38:59 compute-0 sudo[198307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:59 compute-0 python3.9[198309]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:38:59 compute-0 sudo[198307]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:59 compute-0 sudo[198383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxegyifarvwjszwgxtkasnowfxdxwsiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038738.820374-2546-77097713386403/AnsiballZ_stat.py'
Jan 21 23:38:59 compute-0 sudo[198383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:38:59 compute-0 python3.9[198385]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:38:59 compute-0 sudo[198383]: pam_unix(sudo:session): session closed for user root
Jan 21 23:38:59 compute-0 auditd[702]: Audit daemon rotating log files
Jan 21 23:39:00 compute-0 sudo[198534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgawdxstfyldztiiqiqmtbissucybhyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038739.9570773-2546-163157608478645/AnsiballZ_copy.py'
Jan 21 23:39:00 compute-0 sudo[198534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:00 compute-0 python3.9[198536]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038739.9570773-2546-163157608478645/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:00 compute-0 sudo[198534]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:00 compute-0 sudo[198610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsvxqjalecpuzezcsqamitjnpijtatng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038739.9570773-2546-163157608478645/AnsiballZ_systemd.py'
Jan 21 23:39:00 compute-0 sudo[198610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:01 compute-0 python3.9[198612]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:39:01 compute-0 systemd[1]: Reloading.
Jan 21 23:39:01 compute-0 systemd-rc-local-generator[198639]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:39:01 compute-0 systemd-sysv-generator[198643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:39:01 compute-0 sudo[198610]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:02 compute-0 sudo[198732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmftsaoyjselqammtsgxmwpkfsdrtgso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038739.9570773-2546-163157608478645/AnsiballZ_systemd.py'
Jan 21 23:39:02 compute-0 sudo[198732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:02 compute-0 podman[198696]: 2026-01-21 23:39:02.222098922 +0000 UTC m=+0.287869637 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:39:02 compute-0 python3.9[198734]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:39:02 compute-0 systemd[1]: Reloading.
Jan 21 23:39:02 compute-0 systemd-rc-local-generator[198774]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:39:02 compute-0 systemd-sysv-generator[198777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:39:02 compute-0 systemd[1]: Starting podman_exporter container...
Jan 21 23:39:02 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:39:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/21aa1e635fce7196d16b4dea75b1061131d4ee5d36818b5d22abb2af1cc750b3/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 23:39:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/21aa1e635fce7196d16b4dea75b1061131d4ee5d36818b5d22abb2af1cc750b3/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 23:39:02 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88.
Jan 21 23:39:02 compute-0 podman[198788]: 2026-01-21 23:39:02.959183972 +0000 UTC m=+0.119821524 container init ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:39:02 compute-0 podman_exporter[198804]: ts=2026-01-21T23:39:02.977Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 21 23:39:02 compute-0 podman_exporter[198804]: ts=2026-01-21T23:39:02.977Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 21 23:39:02 compute-0 podman_exporter[198804]: ts=2026-01-21T23:39:02.977Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 21 23:39:02 compute-0 podman_exporter[198804]: ts=2026-01-21T23:39:02.977Z caller=handler.go:105 level=info collector=container
Jan 21 23:39:02 compute-0 podman[198788]: 2026-01-21 23:39:02.984363979 +0000 UTC m=+0.145001521 container start ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:39:02 compute-0 podman[198788]: podman_exporter
Jan 21 23:39:02 compute-0 systemd[1]: Starting Podman API Service...
Jan 21 23:39:02 compute-0 systemd[1]: Started Podman API Service.
Jan 21 23:39:02 compute-0 systemd[1]: Started podman_exporter container.
Jan 21 23:39:03 compute-0 podman[198815]: time="2026-01-21T23:39:03Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 21 23:39:03 compute-0 podman[198815]: time="2026-01-21T23:39:03Z" level=info msg="Setting parallel job count to 25"
Jan 21 23:39:03 compute-0 podman[198815]: time="2026-01-21T23:39:03Z" level=info msg="Using sqlite as database backend"
Jan 21 23:39:03 compute-0 podman[198815]: time="2026-01-21T23:39:03Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 21 23:39:03 compute-0 podman[198815]: time="2026-01-21T23:39:03Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 21 23:39:03 compute-0 podman[198815]: time="2026-01-21T23:39:03Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 21 23:39:03 compute-0 sudo[198732]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:03 compute-0 podman[198815]: @ - - [21/Jan/2026:23:39:03 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 21 23:39:03 compute-0 podman[198815]: time="2026-01-21T23:39:03Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 21 23:39:03 compute-0 podman[198813]: 2026-01-21 23:39:03.073068647 +0000 UTC m=+0.076729642 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:39:03 compute-0 podman[198815]: @ - - [21/Jan/2026:23:39:03 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18076 "" "Go-http-client/1.1"
Jan 21 23:39:03 compute-0 podman_exporter[198804]: ts=2026-01-21T23:39:03.078Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 21 23:39:03 compute-0 podman_exporter[198804]: ts=2026-01-21T23:39:03.078Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 21 23:39:03 compute-0 podman_exporter[198804]: ts=2026-01-21T23:39:03.079Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 21 23:39:03 compute-0 systemd[1]: ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88-426c3f2a07c6d0a6.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 23:39:03 compute-0 systemd[1]: ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88-426c3f2a07c6d0a6.service: Failed with result 'exit-code'.
Jan 21 23:39:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:39:03.171 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:39:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:39:03.171 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:39:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:39:03.172 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:39:04 compute-0 python3.9[198999]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:39:05 compute-0 sudo[199149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-howbmgyitsicnphqcrqwqujwvbenbvdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038745.1017852-2681-175571370542806/AnsiballZ_stat.py'
Jan 21 23:39:05 compute-0 sudo[199149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:05 compute-0 python3.9[199151]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:39:05 compute-0 sudo[199149]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:06 compute-0 sudo[199274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzkakrlouudyffckhfsrautkxywobaqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038745.1017852-2681-175571370542806/AnsiballZ_copy.py'
Jan 21 23:39:06 compute-0 sudo[199274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:06 compute-0 python3.9[199276]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038745.1017852-2681-175571370542806/.source.yaml _original_basename=.9fhj887t follow=False checksum=f3d2380b7b2b83f386aaaefa3dff58bab4ad332f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:06 compute-0 sudo[199274]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:06 compute-0 sudo[199426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcwhuttzxjdqothveioiaqxsophmnbho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038746.5679698-2726-224949613332122/AnsiballZ_stat.py'
Jan 21 23:39:06 compute-0 sudo[199426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:07 compute-0 python3.9[199428]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:39:07 compute-0 sudo[199426]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:07 compute-0 sudo[199549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdezqvcrhllqoliizxijryksivjzxcmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038746.5679698-2726-224949613332122/AnsiballZ_copy.py'
Jan 21 23:39:07 compute-0 sudo[199549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:07 compute-0 python3.9[199551]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769038746.5679698-2726-224949613332122/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:39:07 compute-0 sudo[199549]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:09 compute-0 sudo[199701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nikymugbhmylinmasxxcxzvnogmlmcmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038748.8419602-2789-115349703219648/AnsiballZ_file.py'
Jan 21 23:39:09 compute-0 sudo[199701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:09 compute-0 python3.9[199703]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:09 compute-0 sudo[199701]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:09 compute-0 sudo[199853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnodupdbvcjyfwfiobegasmtedpnblcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038749.6222103-2813-333602668061/AnsiballZ_file.py'
Jan 21 23:39:09 compute-0 sudo[199853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:10 compute-0 python3.9[199855]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 21 23:39:10 compute-0 sudo[199853]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:10 compute-0 sudo[200005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jspykescvnzqwvzgtfwbwdvawybcflud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038750.461468-2837-52227362361179/AnsiballZ_stat.py'
Jan 21 23:39:10 compute-0 sudo[200005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:10 compute-0 python3.9[200007]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:39:10 compute-0 sudo[200005]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:11 compute-0 sudo[200083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-actwzehltnnzbposflufwnddkxujqaez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038750.461468-2837-52227362361179/AnsiballZ_file.py'
Jan 21 23:39:11 compute-0 sudo[200083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:11 compute-0 python3.9[200085]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.931k_ucq recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:11 compute-0 sudo[200083]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:12 compute-0 python3.9[200235]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.216 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.231 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.231 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.231 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.244 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.244 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.277 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.277 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.278 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.278 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:39:12 compute-0 podman[200236]: 2026-01-21 23:39:12.286715353 +0000 UTC m=+0.052849996 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.429 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.431 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5962MB free_disk=73.55148315429688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.431 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.431 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.511 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.512 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.541 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.559 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.560 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:39:12 compute-0 nova_compute[182935]: 2026-01-21 23:39:12.560 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:39:13 compute-0 nova_compute[182935]: 2026-01-21 23:39:13.109 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:13 compute-0 nova_compute[182935]: 2026-01-21 23:39:13.109 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:13 compute-0 nova_compute[182935]: 2026-01-21 23:39:13.109 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:13 compute-0 nova_compute[182935]: 2026-01-21 23:39:13.109 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:39:13 compute-0 podman[200504]: 2026-01-21 23:39:13.45274654 +0000 UTC m=+0.091712663 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:39:13 compute-0 nova_compute[182935]: 2026-01-21 23:39:13.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:13 compute-0 nova_compute[182935]: 2026-01-21 23:39:13.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:13 compute-0 nova_compute[182935]: 2026-01-21 23:39:13.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:13 compute-0 nova_compute[182935]: 2026-01-21 23:39:13.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:39:14 compute-0 sudo[200699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-payniluhnokwccaicdemtxjrdxwtxwgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038754.209221-2948-254441750076020/AnsiballZ_container_config_data.py'
Jan 21 23:39:14 compute-0 sudo[200699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:14 compute-0 python3.9[200701]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 21 23:39:14 compute-0 sudo[200699]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:15 compute-0 sudo[200851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyevjuksaedforxobpltzietngsiavme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038755.289088-2981-83938764590943/AnsiballZ_container_config_hash.py'
Jan 21 23:39:15 compute-0 sudo[200851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:15 compute-0 python3.9[200853]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 21 23:39:15 compute-0 sudo[200851]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:16 compute-0 sudo[201003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxlfdapwtpmojikbzzhfahdfjqrqesxu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038756.3703778-3011-103839641081132/AnsiballZ_edpm_container_manage.py'
Jan 21 23:39:16 compute-0 sudo[201003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:16 compute-0 python3[201005]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 21 23:39:20 compute-0 podman[201019]: 2026-01-21 23:39:20.294675941 +0000 UTC m=+3.201981178 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 23:39:20 compute-0 podman[201112]: 2026-01-21 23:39:20.437662226 +0000 UTC m=+0.052318762 container create 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 21 23:39:20 compute-0 podman[201112]: 2026-01-21 23:39:20.410902053 +0000 UTC m=+0.025558619 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 23:39:20 compute-0 python3[201005]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 21 23:39:20 compute-0 sudo[201003]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:20 compute-0 sudo[201300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgvmgylllvhjksnoyvzmtzyauexnkfwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038760.7264798-3035-164614544628961/AnsiballZ_stat.py'
Jan 21 23:39:20 compute-0 sudo[201300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:21 compute-0 python3.9[201302]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:39:21 compute-0 sudo[201300]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:21 compute-0 sudo[201454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slhskystbiwcuwmmovrjvltklhdyzfcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038761.5971687-3062-275424644585541/AnsiballZ_file.py'
Jan 21 23:39:21 compute-0 sudo[201454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:22 compute-0 python3.9[201456]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:22 compute-0 sudo[201454]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:22 compute-0 sudo[201530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buopwwmqzfyxqphhjzyeftriawitcvdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038761.5971687-3062-275424644585541/AnsiballZ_stat.py'
Jan 21 23:39:22 compute-0 sudo[201530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:22 compute-0 python3.9[201532]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:39:22 compute-0 sudo[201530]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:22 compute-0 podman[201585]: 2026-01-21 23:39:22.674582437 +0000 UTC m=+0.048743896 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 23:39:22 compute-0 systemd[1]: 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d-155ce864f9758a7f.service: Main process exited, code=exited, status=1/FAILURE
Jan 21 23:39:22 compute-0 systemd[1]: 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d-155ce864f9758a7f.service: Failed with result 'exit-code'.
Jan 21 23:39:22 compute-0 sudo[201700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpcrlcuebvnzuaaepwmenvmgdvojlgrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038762.4810147-3062-158748587523126/AnsiballZ_copy.py'
Jan 21 23:39:22 compute-0 sudo[201700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:23 compute-0 python3.9[201702]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769038762.4810147-3062-158748587523126/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:23 compute-0 sudo[201700]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:23 compute-0 sudo[201776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngzcoullubiwtsvsozaenwbjghkzakuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038762.4810147-3062-158748587523126/AnsiballZ_systemd.py'
Jan 21 23:39:23 compute-0 sudo[201776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:23 compute-0 python3.9[201778]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 21 23:39:23 compute-0 systemd[1]: Reloading.
Jan 21 23:39:23 compute-0 systemd-rc-local-generator[201806]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:39:23 compute-0 systemd-sysv-generator[201810]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:39:24 compute-0 sudo[201776]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:24 compute-0 sudo[201887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vssvomnjivqjvbkuexdqpigblsrofmxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038762.4810147-3062-158748587523126/AnsiballZ_systemd.py'
Jan 21 23:39:24 compute-0 sudo[201887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:24 compute-0 python3.9[201889]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 21 23:39:24 compute-0 systemd[1]: Reloading.
Jan 21 23:39:24 compute-0 systemd-rc-local-generator[201917]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 21 23:39:24 compute-0 systemd-sysv-generator[201920]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 21 23:39:24 compute-0 systemd[1]: Starting openstack_network_exporter container...
Jan 21 23:39:24 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:39:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a576eff36058470934c1f5a74f905ef24e2074276a82dcce3b6040133dcf3650/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 21 23:39:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a576eff36058470934c1f5a74f905ef24e2074276a82dcce3b6040133dcf3650/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 21 23:39:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a576eff36058470934c1f5a74f905ef24e2074276a82dcce3b6040133dcf3650/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 21 23:39:25 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53.
Jan 21 23:39:25 compute-0 podman[201928]: 2026-01-21 23:39:25.037484766 +0000 UTC m=+0.125343537 container init 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, version=9.6, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc.)
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: INFO    23:39:25 main.go:48: registering *bridge.Collector
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: INFO    23:39:25 main.go:48: registering *coverage.Collector
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: INFO    23:39:25 main.go:48: registering *datapath.Collector
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: INFO    23:39:25 main.go:48: registering *iface.Collector
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: INFO    23:39:25 main.go:48: registering *memory.Collector
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: INFO    23:39:25 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: INFO    23:39:25 main.go:48: registering *ovn.Collector
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: INFO    23:39:25 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: INFO    23:39:25 main.go:48: registering *pmd_perf.Collector
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: INFO    23:39:25 main.go:48: registering *pmd_rxq.Collector
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: INFO    23:39:25 main.go:48: registering *vswitch.Collector
Jan 21 23:39:25 compute-0 openstack_network_exporter[201943]: NOTICE  23:39:25 main.go:76: listening on https://:9105/metrics
Jan 21 23:39:25 compute-0 podman[201928]: 2026-01-21 23:39:25.060235074 +0000 UTC m=+0.148093825 container start 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, config_id=openstack_network_exporter, architecture=x86_64, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=)
Jan 21 23:39:25 compute-0 podman[201928]: openstack_network_exporter
Jan 21 23:39:25 compute-0 systemd[1]: Started openstack_network_exporter container.
Jan 21 23:39:25 compute-0 sudo[201887]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:25 compute-0 podman[201948]: 2026-01-21 23:39:25.172682765 +0000 UTC m=+0.090909980 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, release=1755695350, container_name=openstack_network_exporter)
Jan 21 23:39:26 compute-0 python3.9[202126]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 21 23:39:27 compute-0 sudo[202278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxxjmemspiozlkyjcwjgfotwalehsbyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038767.5870616-3197-47848881230460/AnsiballZ_stat.py'
Jan 21 23:39:27 compute-0 sudo[202278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:28 compute-0 python3.9[202280]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:39:28 compute-0 sudo[202278]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:28 compute-0 sshd-session[202157]: Invalid user nagios from 188.166.69.60 port 45476
Jan 21 23:39:28 compute-0 sshd-session[202157]: Connection closed by invalid user nagios 188.166.69.60 port 45476 [preauth]
Jan 21 23:39:28 compute-0 sudo[202403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwweocxmbmmvkrtnkjdpqhbnupfjombx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038767.5870616-3197-47848881230460/AnsiballZ_copy.py'
Jan 21 23:39:28 compute-0 sudo[202403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:28 compute-0 python3.9[202405]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038767.5870616-3197-47848881230460/.source.yaml _original_basename=.l6wfjx1x follow=False checksum=0317da2c639ada97636b9543228efff3cda9d578 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:28 compute-0 sudo[202403]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:29 compute-0 sudo[202555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgenhpvhengzatstbnlntunwxbzmeuyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038769.168482-3242-157353040825693/AnsiballZ_find.py'
Jan 21 23:39:29 compute-0 sudo[202555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:29 compute-0 python3.9[202557]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 21 23:39:29 compute-0 sudo[202555]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:30 compute-0 sudo[202707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejapveedfojfjawaimlbbmnqmfekhgeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038770.3249588-3270-134568686657943/AnsiballZ_podman_container_info.py'
Jan 21 23:39:30 compute-0 sudo[202707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:30 compute-0 python3.9[202709]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 21 23:39:31 compute-0 sudo[202707]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:31 compute-0 sudo[202872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzvgbaqdsqaglqalkaulzdussgikaxdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038771.2432885-3278-209046173112722/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:31 compute-0 sudo[202872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:31 compute-0 python3.9[202874]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:32 compute-0 systemd[1]: Started libpod-conmon-5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61.scope.
Jan 21 23:39:32 compute-0 podman[202875]: 2026-01-21 23:39:32.047324267 +0000 UTC m=+0.091727587 container exec 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:39:32 compute-0 podman[202875]: 2026-01-21 23:39:32.084426412 +0000 UTC m=+0.128829722 container exec_died 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:39:32 compute-0 systemd[1]: libpod-conmon-5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61.scope: Deactivated successfully.
Jan 21 23:39:32 compute-0 sudo[202872]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:32 compute-0 sudo[203066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkgmvrjoclggycogpismudjkzlodbqhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038772.305319-3286-80443477780257/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:32 compute-0 sudo[203066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:32 compute-0 podman[203029]: 2026-01-21 23:39:32.64925096 +0000 UTC m=+0.096778125 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:39:32 compute-0 python3.9[203074]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:32 compute-0 systemd[1]: Started libpod-conmon-5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61.scope.
Jan 21 23:39:32 compute-0 podman[203082]: 2026-01-21 23:39:32.97489018 +0000 UTC m=+0.135906824 container exec 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 23:39:33 compute-0 podman[203082]: 2026-01-21 23:39:33.023128135 +0000 UTC m=+0.184144809 container exec_died 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 21 23:39:33 compute-0 systemd[1]: libpod-conmon-5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61.scope: Deactivated successfully.
Jan 21 23:39:33 compute-0 sudo[203066]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:33 compute-0 podman[203115]: 2026-01-21 23:39:33.215670261 +0000 UTC m=+0.061408857 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:39:33 compute-0 sudo[203289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-allelohfkxceaitjmxswvkyzvmsqkdfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038773.3332875-3294-238127150059649/AnsiballZ_file.py'
Jan 21 23:39:33 compute-0 sudo[203289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:33 compute-0 python3.9[203291]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:33 compute-0 sudo[203289]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:34 compute-0 sudo[203441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xffhfmhcxayjkrrsnyduuzqiktouhukl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038774.1382525-3303-74996706249826/AnsiballZ_podman_container_info.py'
Jan 21 23:39:34 compute-0 sudo[203441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:34 compute-0 python3.9[203443]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 21 23:39:34 compute-0 sudo[203441]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:35 compute-0 sudo[203607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdiyzwsvajufybupstauodddxbzqiobt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038774.8796332-3311-33443334762224/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:35 compute-0 sudo[203607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:35 compute-0 python3.9[203609]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:35 compute-0 systemd[1]: Started libpod-conmon-86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c.scope.
Jan 21 23:39:35 compute-0 podman[203610]: 2026-01-21 23:39:35.475133145 +0000 UTC m=+0.084406970 container exec 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:39:35 compute-0 podman[203610]: 2026-01-21 23:39:35.511260429 +0000 UTC m=+0.120534164 container exec_died 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:39:35 compute-0 systemd[1]: libpod-conmon-86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c.scope: Deactivated successfully.
Jan 21 23:39:35 compute-0 sudo[203607]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:35 compute-0 sudo[203792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqfpdnxlbpuaylgpmymxvcenjqtntbsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038775.72596-3319-227502353715784/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:35 compute-0 sudo[203792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:36 compute-0 python3.9[203794]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:36 compute-0 systemd[1]: Started libpod-conmon-86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c.scope.
Jan 21 23:39:36 compute-0 rsyslogd[1005]: imjournal from <np0005591283:python3.9>: begin to drop messages due to rate-limiting
Jan 21 23:39:36 compute-0 podman[203795]: 2026-01-21 23:39:36.322908377 +0000 UTC m=+0.078901182 container exec 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 21 23:39:36 compute-0 podman[203795]: 2026-01-21 23:39:36.353047883 +0000 UTC m=+0.109040608 container exec_died 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 23:39:36 compute-0 systemd[1]: libpod-conmon-86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c.scope: Deactivated successfully.
Jan 21 23:39:36 compute-0 sudo[203792]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:36 compute-0 sudo[203977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-groztnvcdfqehwqeychcybzqfbhvqfix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038776.5747228-3327-7155370496278/AnsiballZ_file.py'
Jan 21 23:39:36 compute-0 sudo[203977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:37 compute-0 python3.9[203979]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:37 compute-0 sudo[203977]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:37 compute-0 sudo[204129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfnfbmohvojnanvoeprjdnhiwohnrqnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038777.2684972-3336-191897899897616/AnsiballZ_podman_container_info.py'
Jan 21 23:39:37 compute-0 sudo[204129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:37 compute-0 python3.9[204131]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Jan 21 23:39:37 compute-0 sudo[204129]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:38 compute-0 sudo[204294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwdpxqstizupsomdajgarqesykwmbpbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038777.952684-3344-18379614445612/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:38 compute-0 sudo[204294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:38 compute-0 python3.9[204296]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:38 compute-0 systemd[1]: Started libpod-conmon-6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d.scope.
Jan 21 23:39:38 compute-0 podman[204297]: 2026-01-21 23:39:38.569523995 +0000 UTC m=+0.089162863 container exec 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 21 23:39:38 compute-0 podman[204297]: 2026-01-21 23:39:38.577128947 +0000 UTC m=+0.096767815 container exec_died 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:39:38 compute-0 sudo[204294]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:38 compute-0 systemd[1]: libpod-conmon-6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d.scope: Deactivated successfully.
Jan 21 23:39:39 compute-0 sudo[204478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohgiriwxcoiotibkdxezqeiwfyujzzod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038778.7775092-3352-162696438278117/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:39 compute-0 sudo[204478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:39 compute-0 python3.9[204480]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:39 compute-0 systemd[1]: Started libpod-conmon-6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d.scope.
Jan 21 23:39:39 compute-0 podman[204481]: 2026-01-21 23:39:39.353226694 +0000 UTC m=+0.068567441 container exec 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 21 23:39:39 compute-0 podman[204481]: 2026-01-21 23:39:39.388150893 +0000 UTC m=+0.103491640 container exec_died 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:39:39 compute-0 systemd[1]: libpod-conmon-6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d.scope: Deactivated successfully.
Jan 21 23:39:39 compute-0 sudo[204478]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:39 compute-0 sudo[204664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btgogswljdrspiziogylvuijzikxocam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038779.6514494-3360-112713093535577/AnsiballZ_file.py'
Jan 21 23:39:39 compute-0 sudo[204664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:40 compute-0 python3.9[204666]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:40 compute-0 sudo[204664]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:40 compute-0 sudo[204816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyvxnxjozfwcqyruvwheegeuaqmlgtch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038780.4001048-3369-73070810906750/AnsiballZ_podman_container_info.py'
Jan 21 23:39:40 compute-0 sudo[204816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:40 compute-0 python3.9[204818]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Jan 21 23:39:40 compute-0 sudo[204816]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:41 compute-0 sudo[204981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meafeidocyhmpttyhuymcbgseapmkmiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038781.0975134-3377-33878582554373/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:41 compute-0 sudo[204981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:41 compute-0 python3.9[204983]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:41 compute-0 systemd[1]: Started libpod-conmon-06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5.scope.
Jan 21 23:39:41 compute-0 podman[204984]: 2026-01-21 23:39:41.635091266 +0000 UTC m=+0.062216055 container exec 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:39:41 compute-0 podman[204984]: 2026-01-21 23:39:41.664579769 +0000 UTC m=+0.091704548 container exec_died 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 23:39:41 compute-0 systemd[1]: libpod-conmon-06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5.scope: Deactivated successfully.
Jan 21 23:39:41 compute-0 sudo[204981]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:42 compute-0 sudo[205166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkhdzmfncgifuuzgymuepmqyvrwzsmza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038781.8717513-3385-204978826294362/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:42 compute-0 sudo[205166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:42 compute-0 python3.9[205168]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:42 compute-0 systemd[1]: Started libpod-conmon-06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5.scope.
Jan 21 23:39:42 compute-0 podman[205169]: 2026-01-21 23:39:42.450389043 +0000 UTC m=+0.064333330 container exec 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:39:42 compute-0 podman[205169]: 2026-01-21 23:39:42.503047471 +0000 UTC m=+0.116991728 container exec_died 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:39:42 compute-0 podman[205186]: 2026-01-21 23:39:42.518587054 +0000 UTC m=+0.067236412 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:39:42 compute-0 systemd[1]: libpod-conmon-06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5.scope: Deactivated successfully.
Jan 21 23:39:42 compute-0 sudo[205166]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:43 compute-0 sudo[205372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjncvczsdzcitotgbjlnwueunggkfcix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038782.7383916-3393-171625216506837/AnsiballZ_file.py'
Jan 21 23:39:43 compute-0 sudo[205372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:43 compute-0 python3.9[205374]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:43 compute-0 sudo[205372]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:43 compute-0 sudo[205537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbthbpyinngrngwnegjhjdocmruwbcss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038783.4123259-3402-238306237899868/AnsiballZ_podman_container_info.py'
Jan 21 23:39:43 compute-0 sudo[205537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:43 compute-0 podman[205498]: 2026-01-21 23:39:43.68791345 +0000 UTC m=+0.054346147 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 23:39:43 compute-0 python3.9[205545]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 21 23:39:43 compute-0 sudo[205537]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:44 compute-0 sshd-session[205575]: Invalid user ubuntu from 38.67.240.124 port 1771
Jan 21 23:39:44 compute-0 sshd-session[205575]: Received disconnect from 38.67.240.124 port 1771:11:  [preauth]
Jan 21 23:39:44 compute-0 sshd-session[205575]: Disconnected from invalid user ubuntu 38.67.240.124 port 1771 [preauth]
Jan 21 23:39:44 compute-0 sudo[205709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkinfxlafzrsetxzhsphrquwuckybpbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038784.1301863-3410-163068142767060/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:44 compute-0 sudo[205709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:44 compute-0 python3.9[205711]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:44 compute-0 systemd[1]: Started libpod-conmon-ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88.scope.
Jan 21 23:39:44 compute-0 podman[205712]: 2026-01-21 23:39:44.711114202 +0000 UTC m=+0.085477243 container exec ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:39:44 compute-0 podman[205712]: 2026-01-21 23:39:44.741943613 +0000 UTC m=+0.116306634 container exec_died ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:39:44 compute-0 systemd[1]: libpod-conmon-ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88.scope: Deactivated successfully.
Jan 21 23:39:44 compute-0 sudo[205709]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:45 compute-0 sudo[205893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myqdrlxkwvggmyzzflkdullqlfxoiecs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038784.9380684-3418-41635105092842/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:45 compute-0 sudo[205893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:45 compute-0 python3.9[205895]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:45 compute-0 systemd[1]: Started libpod-conmon-ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88.scope.
Jan 21 23:39:45 compute-0 podman[205896]: 2026-01-21 23:39:45.555059533 +0000 UTC m=+0.063720557 container exec ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:39:45 compute-0 podman[205896]: 2026-01-21 23:39:45.590298978 +0000 UTC m=+0.098959982 container exec_died ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:39:45 compute-0 systemd[1]: libpod-conmon-ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88.scope: Deactivated successfully.
Jan 21 23:39:45 compute-0 sudo[205893]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:46 compute-0 sudo[206078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogfnkzwhmjaomnwlcsdmaofgedzxuhdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038785.8041308-3426-55396568165119/AnsiballZ_file.py'
Jan 21 23:39:46 compute-0 sudo[206078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:46 compute-0 python3.9[206080]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:46 compute-0 sudo[206078]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:47 compute-0 sudo[206230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezvorefzskhpiimfyiksrsrnmycyncng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038786.7122018-3435-237817619510140/AnsiballZ_podman_container_info.py'
Jan 21 23:39:47 compute-0 sudo[206230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:47 compute-0 python3.9[206232]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 21 23:39:47 compute-0 sudo[206230]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:47 compute-0 sudo[206396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odosmydcljddthydceexzunuhfoeqlfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038787.570085-3443-267051791744400/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:47 compute-0 sudo[206396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:48 compute-0 python3.9[206398]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:48 compute-0 systemd[1]: Started libpod-conmon-40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53.scope.
Jan 21 23:39:48 compute-0 podman[206399]: 2026-01-21 23:39:48.2383014 +0000 UTC m=+0.090854719 container exec 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 23:39:48 compute-0 podman[206399]: 2026-01-21 23:39:48.247228701 +0000 UTC m=+0.099782060 container exec_died 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Jan 21 23:39:48 compute-0 sudo[206396]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:48 compute-0 systemd[1]: libpod-conmon-40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53.scope: Deactivated successfully.
Jan 21 23:39:48 compute-0 sudo[206581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrxvzdopuaorhagncyjvbklyijpllvmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038788.5075068-3451-51664221782951/AnsiballZ_podman_container_exec.py'
Jan 21 23:39:48 compute-0 sudo[206581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:49 compute-0 python3.9[206583]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 21 23:39:49 compute-0 systemd[1]: Started libpod-conmon-40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53.scope.
Jan 21 23:39:49 compute-0 podman[206584]: 2026-01-21 23:39:49.169482121 +0000 UTC m=+0.149751532 container exec 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1755695350, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 23:39:49 compute-0 podman[206584]: 2026-01-21 23:39:49.199637977 +0000 UTC m=+0.179907388 container exec_died 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, config_id=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Jan 21 23:39:49 compute-0 systemd[1]: libpod-conmon-40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53.scope: Deactivated successfully.
Jan 21 23:39:49 compute-0 sudo[206581]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:49 compute-0 sudo[206769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utsrvhujsxvxazyodqcrfphazslbaago ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038789.4028902-3459-55991401867562/AnsiballZ_file.py'
Jan 21 23:39:49 compute-0 sudo[206769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:39:49 compute-0 python3.9[206771]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:39:49 compute-0 sudo[206769]: pam_unix(sudo:session): session closed for user root
Jan 21 23:39:53 compute-0 podman[206796]: 2026-01-21 23:39:53.753160704 +0000 UTC m=+0.107673269 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 21 23:39:55 compute-0 podman[206817]: 2026-01-21 23:39:55.737679244 +0000 UTC m=+0.095375145 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Jan 21 23:40:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:40:03.172 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:40:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:40:03.174 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:40:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:40:03.174 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:40:03 compute-0 podman[206840]: 2026-01-21 23:40:03.695820663 +0000 UTC m=+0.058333872 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:40:03 compute-0 podman[206839]: 2026-01-21 23:40:03.736643938 +0000 UTC m=+0.102691703 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller)
Jan 21 23:40:12 compute-0 sshd-session[206888]: Invalid user nagios from 188.166.69.60 port 41358
Jan 21 23:40:12 compute-0 sshd-session[206888]: Connection closed by invalid user nagios 188.166.69.60 port 41358 [preauth]
Jan 21 23:40:12 compute-0 podman[206890]: 2026-01-21 23:40:12.694726501 +0000 UTC m=+0.058042476 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:40:12 compute-0 nova_compute[182935]: 2026-01-21 23:40:12.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:12 compute-0 nova_compute[182935]: 2026-01-21 23:40:12.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:12 compute-0 nova_compute[182935]: 2026-01-21 23:40:12.823 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:40:12 compute-0 nova_compute[182935]: 2026-01-21 23:40:12.824 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:40:12 compute-0 nova_compute[182935]: 2026-01-21 23:40:12.824 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:40:12 compute-0 nova_compute[182935]: 2026-01-21 23:40:12.824 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:40:12 compute-0 nova_compute[182935]: 2026-01-21 23:40:12.971 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:40:12 compute-0 nova_compute[182935]: 2026-01-21 23:40:12.973 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5964MB free_disk=73.41576385498047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:40:12 compute-0 nova_compute[182935]: 2026-01-21 23:40:12.974 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:40:12 compute-0 nova_compute[182935]: 2026-01-21 23:40:12.974 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:40:13 compute-0 nova_compute[182935]: 2026-01-21 23:40:13.038 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:40:13 compute-0 nova_compute[182935]: 2026-01-21 23:40:13.039 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:40:13 compute-0 nova_compute[182935]: 2026-01-21 23:40:13.067 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:40:13 compute-0 nova_compute[182935]: 2026-01-21 23:40:13.082 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:40:13 compute-0 nova_compute[182935]: 2026-01-21 23:40:13.084 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:40:13 compute-0 nova_compute[182935]: 2026-01-21 23:40:13.084 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:40:14 compute-0 nova_compute[182935]: 2026-01-21 23:40:14.079 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:14 compute-0 nova_compute[182935]: 2026-01-21 23:40:14.080 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:14 compute-0 nova_compute[182935]: 2026-01-21 23:40:14.080 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:40:14 compute-0 nova_compute[182935]: 2026-01-21 23:40:14.080 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:40:14 compute-0 nova_compute[182935]: 2026-01-21 23:40:14.100 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:40:14 compute-0 nova_compute[182935]: 2026-01-21 23:40:14.101 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:14 compute-0 nova_compute[182935]: 2026-01-21 23:40:14.101 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:14 compute-0 nova_compute[182935]: 2026-01-21 23:40:14.102 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:14 compute-0 nova_compute[182935]: 2026-01-21 23:40:14.102 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:14 compute-0 nova_compute[182935]: 2026-01-21 23:40:14.102 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:40:14 compute-0 podman[206914]: 2026-01-21 23:40:14.699159877 +0000 UTC m=+0.069649324 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:40:14 compute-0 nova_compute[182935]: 2026-01-21 23:40:14.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:40:23.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:40:24 compute-0 podman[206933]: 2026-01-21 23:40:24.679583947 +0000 UTC m=+0.055515576 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 23:40:26 compute-0 podman[206953]: 2026-01-21 23:40:26.711924444 +0000 UTC m=+0.085383136 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=openstack_network_exporter, name=ubi9-minimal)
Jan 21 23:40:33 compute-0 sudo[207121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdxlgunpylynauzmlucmyvztxsjpzwwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038833.5635667-3866-61066160552126/AnsiballZ_file.py'
Jan 21 23:40:33 compute-0 sudo[207121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:33 compute-0 podman[207073]: 2026-01-21 23:40:33.939868069 +0000 UTC m=+0.141870433 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 23:40:33 compute-0 podman[207074]: 2026-01-21 23:40:33.951674255 +0000 UTC m=+0.065337537 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:40:34 compute-0 python3.9[207132]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:34 compute-0 sudo[207121]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:34 compute-0 sudo[207299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqlthviakesewbjmogeuzbaehecnkhqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038834.4376278-3890-189970899723432/AnsiballZ_stat.py'
Jan 21 23:40:34 compute-0 sudo[207299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:34 compute-0 python3.9[207301]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:34 compute-0 sudo[207299]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:35 compute-0 sudo[207422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzqdmsujrlnntrohsldaaoljjjgzbkve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038834.4376278-3890-189970899723432/AnsiballZ_copy.py'
Jan 21 23:40:35 compute-0 sudo[207422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:35 compute-0 python3.9[207424]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769038834.4376278-3890-189970899723432/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:35 compute-0 sudo[207422]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:36 compute-0 sudo[207574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkzvbfecjfdyjhfoytjsjxcrlgivuxzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038836.214178-3938-49633898969005/AnsiballZ_file.py'
Jan 21 23:40:36 compute-0 sudo[207574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:36 compute-0 python3.9[207576]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:36 compute-0 sudo[207574]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:37 compute-0 sudo[207726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhchkpoxzsxrygqjfeecfzkiiouvuocj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038837.5471072-3962-39577874667255/AnsiballZ_stat.py'
Jan 21 23:40:37 compute-0 sudo[207726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:38 compute-0 python3.9[207728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:38 compute-0 sudo[207726]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:38 compute-0 sudo[207804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbxajuyypcgkiwjgzhhnljtmungfusfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038837.5471072-3962-39577874667255/AnsiballZ_file.py'
Jan 21 23:40:38 compute-0 sudo[207804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:38 compute-0 python3.9[207806]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:38 compute-0 sudo[207804]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:39 compute-0 sudo[207956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoyjibssxtnpfkmimebxtwqlwmckikcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038839.0154188-3998-188255047206346/AnsiballZ_stat.py'
Jan 21 23:40:39 compute-0 sudo[207956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:39 compute-0 python3.9[207958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:39 compute-0 sudo[207956]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:39 compute-0 sudo[208034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pucgvgyafesttayqpgfcretiztkzzjwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038839.0154188-3998-188255047206346/AnsiballZ_file.py'
Jan 21 23:40:39 compute-0 sudo[208034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:39 compute-0 python3.9[208036]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.c8le1huh recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:40 compute-0 sudo[208034]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:40 compute-0 sudo[208186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmhqzmqvosjptutghxjcmwmfjrtemddf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038840.4743996-4034-25287259625940/AnsiballZ_stat.py'
Jan 21 23:40:40 compute-0 sudo[208186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:40 compute-0 python3.9[208188]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:41 compute-0 sudo[208186]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:41 compute-0 sudo[208264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oolvieteatlrlndoynljdcgmkcugnecp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038840.4743996-4034-25287259625940/AnsiballZ_file.py'
Jan 21 23:40:41 compute-0 sudo[208264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:41 compute-0 python3.9[208266]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:41 compute-0 sudo[208264]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:42 compute-0 sudo[208416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-panqclnzprpyzfxoamdgxboatqojgusp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038841.9521532-4073-57666560230274/AnsiballZ_command.py'
Jan 21 23:40:42 compute-0 sudo[208416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:42 compute-0 python3.9[208418]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:40:42 compute-0 sudo[208416]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:43 compute-0 podman[208543]: 2026-01-21 23:40:43.248219412 +0000 UTC m=+0.058292639 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 23:40:43 compute-0 sudo[208586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxpnrgwommtatkfujnfattrrpvlkeooh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769038842.742835-4097-111710019356211/AnsiballZ_edpm_nftables_from_files.py'
Jan 21 23:40:43 compute-0 sudo[208586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:43 compute-0 python3[208595]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 21 23:40:43 compute-0 sudo[208586]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:44 compute-0 sudo[208745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haxfwfqtksdsgcncoxiqjiwbfygbeyev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038843.7033827-4121-103666536760474/AnsiballZ_stat.py'
Jan 21 23:40:44 compute-0 sudo[208745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:44 compute-0 python3.9[208747]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:44 compute-0 sudo[208745]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:44 compute-0 sudo[208823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxhyvyywwhutunwcdoroznijmlqwiofq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038843.7033827-4121-103666536760474/AnsiballZ_file.py'
Jan 21 23:40:44 compute-0 sudo[208823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:44 compute-0 python3.9[208825]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:44 compute-0 sudo[208823]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:45 compute-0 podman[208925]: 2026-01-21 23:40:45.700755736 +0000 UTC m=+0.066221636 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 23:40:45 compute-0 sudo[208993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urjbotogwsuynzgqvxxxecsupcabwtcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038845.4404337-4157-41493371634354/AnsiballZ_stat.py'
Jan 21 23:40:45 compute-0 sudo[208993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:45 compute-0 python3.9[208995]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:46 compute-0 sudo[208993]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:46 compute-0 sudo[209071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weffapnhdbcgfshnnwbztxpcmsyawuax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038845.4404337-4157-41493371634354/AnsiballZ_file.py'
Jan 21 23:40:46 compute-0 sudo[209071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:46 compute-0 python3.9[209073]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:46 compute-0 sudo[209071]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:47 compute-0 sudo[209223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiphwcgufdlkqwxuayqomqwkhvixaeey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038846.8312368-4193-181221713788843/AnsiballZ_stat.py'
Jan 21 23:40:47 compute-0 sudo[209223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:47 compute-0 python3.9[209225]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:47 compute-0 sudo[209223]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:47 compute-0 sudo[209301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wptknuaevwkawuycelujrrcakrgfcsjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038846.8312368-4193-181221713788843/AnsiballZ_file.py'
Jan 21 23:40:47 compute-0 sudo[209301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:47 compute-0 python3.9[209303]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:47 compute-0 sudo[209301]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:48 compute-0 sudo[209453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nothdfymjojpiqjhrzmwdzbioudyzfri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038848.225912-4229-151045766253410/AnsiballZ_stat.py'
Jan 21 23:40:48 compute-0 sudo[209453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:48 compute-0 python3.9[209455]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:48 compute-0 sudo[209453]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:49 compute-0 sudo[209531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsfxeodbkknafpfvucefdqgbramyfrgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038848.225912-4229-151045766253410/AnsiballZ_file.py'
Jan 21 23:40:49 compute-0 sudo[209531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:49 compute-0 python3.9[209533]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:49 compute-0 sudo[209531]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:49 compute-0 sudo[209683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sulgvgpqefguvlbsxfdjikvnwabrqzfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038849.6253908-4265-180587889730195/AnsiballZ_stat.py'
Jan 21 23:40:49 compute-0 sudo[209683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:50 compute-0 python3.9[209685]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 21 23:40:50 compute-0 sudo[209683]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:50 compute-0 sudo[209808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvzagsdciuwlrqcshoztxkesldemntcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038849.6253908-4265-180587889730195/AnsiballZ_copy.py'
Jan 21 23:40:50 compute-0 sudo[209808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:50 compute-0 python3.9[209810]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769038849.6253908-4265-180587889730195/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:50 compute-0 sudo[209808]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:51 compute-0 sudo[209960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovqoovhgwjpgevkkbxojkguncmdyqlak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038851.243921-4310-203066066101004/AnsiballZ_file.py'
Jan 21 23:40:51 compute-0 sudo[209960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:51 compute-0 python3.9[209962]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:51 compute-0 sudo[209960]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:52 compute-0 sudo[210112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spxkrfwqbvvgrdepdspbrcraffylqlbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038852.1084135-4334-15841024261661/AnsiballZ_command.py'
Jan 21 23:40:52 compute-0 sudo[210112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:52 compute-0 python3.9[210114]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:40:52 compute-0 sudo[210112]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:53 compute-0 sudo[210267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmnbambuazcgjcjddfjikurlstwmopge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038853.0344114-4358-91460906551856/AnsiballZ_blockinfile.py'
Jan 21 23:40:53 compute-0 sudo[210267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:53 compute-0 python3.9[210269]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:53 compute-0 sudo[210267]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:54 compute-0 sudo[210419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xctneshstidfvfoeayiipyjpigbotoka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038854.1045656-4385-211906366640226/AnsiballZ_command.py'
Jan 21 23:40:54 compute-0 sudo[210419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:54 compute-0 python3.9[210421]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:40:54 compute-0 sudo[210419]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:55 compute-0 sudo[210577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okljrbhudfbgescegipkrlutzehpnazp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038854.9712107-4409-155247826027878/AnsiballZ_stat.py'
Jan 21 23:40:55 compute-0 sudo[210577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:55 compute-0 podman[210546]: 2026-01-21 23:40:55.535000746 +0000 UTC m=+0.215745991 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:40:55 compute-0 python3.9[210581]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 21 23:40:55 compute-0 sudo[210577]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:55 compute-0 sshd-session[210595]: Invalid user nagios from 188.166.69.60 port 37898
Jan 21 23:40:56 compute-0 sshd-session[210595]: Connection closed by invalid user nagios 188.166.69.60 port 37898 [preauth]
Jan 21 23:40:56 compute-0 sudo[210748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfbxbywxpupktfwsealjlsgmcubjhwve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038855.9042583-4433-197939528425337/AnsiballZ_command.py'
Jan 21 23:40:56 compute-0 sudo[210748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:56 compute-0 python3.9[210750]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 21 23:40:56 compute-0 sudo[210748]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:57 compute-0 sudo[210920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bymkdrqluemwzwbbggdwrczsigspgtoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769038856.6884289-4457-36907270620215/AnsiballZ_file.py'
Jan 21 23:40:57 compute-0 sudo[210920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 21 23:40:57 compute-0 podman[210877]: 2026-01-21 23:40:57.307783302 +0000 UTC m=+0.071795631 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-type=git)
Jan 21 23:40:57 compute-0 python3.9[210926]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 21 23:40:57 compute-0 sudo[210920]: pam_unix(sudo:session): session closed for user root
Jan 21 23:40:58 compute-0 sshd-session[183255]: Connection closed by 192.168.122.30 port 53020
Jan 21 23:40:58 compute-0 sshd-session[183252]: pam_unix(sshd:session): session closed for user zuul
Jan 21 23:40:58 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Jan 21 23:40:58 compute-0 systemd[1]: session-25.scope: Consumed 1min 54.016s CPU time.
Jan 21 23:40:58 compute-0 systemd-logind[784]: Session 25 logged out. Waiting for processes to exit.
Jan 21 23:40:58 compute-0 systemd-logind[784]: Removed session 25.
Jan 21 23:41:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:41:03.174 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:41:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:41:03.176 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:41:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:41:03.176 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:41:04 compute-0 podman[210952]: 2026-01-21 23:41:04.711352238 +0000 UTC m=+0.079383252 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:41:04 compute-0 podman[210951]: 2026-01-21 23:41:04.765680887 +0000 UTC m=+0.127616064 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:41:12 compute-0 nova_compute[182935]: 2026-01-21 23:41:12.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:12 compute-0 nova_compute[182935]: 2026-01-21 23:41:12.822 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:13 compute-0 nova_compute[182935]: 2026-01-21 23:41:13.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:13 compute-0 nova_compute[182935]: 2026-01-21 23:41:13.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:13 compute-0 nova_compute[182935]: 2026-01-21 23:41:13.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:13 compute-0 nova_compute[182935]: 2026-01-21 23:41:13.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:13 compute-0 podman[211001]: 2026-01-21 23:41:13.820576743 +0000 UTC m=+0.118864166 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 23:41:13 compute-0 nova_compute[182935]: 2026-01-21 23:41:13.825 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:41:13 compute-0 nova_compute[182935]: 2026-01-21 23:41:13.826 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:41:13 compute-0 nova_compute[182935]: 2026-01-21 23:41:13.826 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:41:13 compute-0 nova_compute[182935]: 2026-01-21 23:41:13.827 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:41:14 compute-0 nova_compute[182935]: 2026-01-21 23:41:14.001 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:41:14 compute-0 nova_compute[182935]: 2026-01-21 23:41:14.003 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6000MB free_disk=73.41574096679688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:41:14 compute-0 nova_compute[182935]: 2026-01-21 23:41:14.004 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:41:14 compute-0 nova_compute[182935]: 2026-01-21 23:41:14.004 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:41:14 compute-0 nova_compute[182935]: 2026-01-21 23:41:14.136 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:41:14 compute-0 nova_compute[182935]: 2026-01-21 23:41:14.136 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:41:14 compute-0 nova_compute[182935]: 2026-01-21 23:41:14.160 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:41:14 compute-0 nova_compute[182935]: 2026-01-21 23:41:14.175 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:41:14 compute-0 nova_compute[182935]: 2026-01-21 23:41:14.177 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:41:14 compute-0 nova_compute[182935]: 2026-01-21 23:41:14.177 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:41:15 compute-0 nova_compute[182935]: 2026-01-21 23:41:15.177 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:15 compute-0 nova_compute[182935]: 2026-01-21 23:41:15.178 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:41:15 compute-0 nova_compute[182935]: 2026-01-21 23:41:15.178 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:41:15 compute-0 nova_compute[182935]: 2026-01-21 23:41:15.194 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:41:15 compute-0 nova_compute[182935]: 2026-01-21 23:41:15.194 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:15 compute-0 nova_compute[182935]: 2026-01-21 23:41:15.194 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:41:15 compute-0 nova_compute[182935]: 2026-01-21 23:41:15.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:16 compute-0 podman[211025]: 2026-01-21 23:41:16.691825269 +0000 UTC m=+0.066942521 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:41:16 compute-0 nova_compute[182935]: 2026-01-21 23:41:16.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:41:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:41:25.567 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:41:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:41:25.568 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:41:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:41:25.569 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:41:25 compute-0 podman[211045]: 2026-01-21 23:41:25.720892893 +0000 UTC m=+0.087262567 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:41:27 compute-0 podman[211065]: 2026-01-21 23:41:27.688855877 +0000 UTC m=+0.059344974 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 23:41:35 compute-0 podman[211089]: 2026-01-21 23:41:35.785722385 +0000 UTC m=+0.063328749 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:41:35 compute-0 podman[211088]: 2026-01-21 23:41:35.793927439 +0000 UTC m=+0.161660290 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:41:38 compute-0 sshd-session[211137]: Invalid user nagios from 188.166.69.60 port 42044
Jan 21 23:41:39 compute-0 sshd-session[211137]: Connection closed by invalid user nagios 188.166.69.60 port 42044 [preauth]
Jan 21 23:41:44 compute-0 podman[211139]: 2026-01-21 23:41:44.704878997 +0000 UTC m=+0.063850010 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:41:47 compute-0 podman[211163]: 2026-01-21 23:41:47.692374388 +0000 UTC m=+0.066818003 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:41:56 compute-0 podman[211182]: 2026-01-21 23:41:56.721372226 +0000 UTC m=+0.077926268 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute)
Jan 21 23:41:58 compute-0 podman[211202]: 2026-01-21 23:41:58.69552928 +0000 UTC m=+0.068373816 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, release=1755695350, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter)
Jan 21 23:42:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:42:03.175 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:42:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:42:03.176 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:42:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:42:03.176 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:42:06 compute-0 podman[211225]: 2026-01-21 23:42:06.724886053 +0000 UTC m=+0.090326943 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:42:06 compute-0 podman[211224]: 2026-01-21 23:42:06.770055244 +0000 UTC m=+0.139661882 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 21 23:42:11 compute-0 nova_compute[182935]: 2026-01-21 23:42:11.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:11 compute-0 nova_compute[182935]: 2026-01-21 23:42:11.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 23:42:11 compute-0 nova_compute[182935]: 2026-01-21 23:42:11.927 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 23:42:11 compute-0 nova_compute[182935]: 2026-01-21 23:42:11.928 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:11 compute-0 nova_compute[182935]: 2026-01-21 23:42:11.928 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 23:42:11 compute-0 nova_compute[182935]: 2026-01-21 23:42:11.948 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:13 compute-0 nova_compute[182935]: 2026-01-21 23:42:13.958 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:13 compute-0 nova_compute[182935]: 2026-01-21 23:42:13.958 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.025 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.026 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.027 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.027 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.226 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.228 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6067MB free_disk=73.41742324829102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.228 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.228 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.562 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.562 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.668 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.772 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.773 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.793 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.821 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.845 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.870 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.873 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:42:14 compute-0 nova_compute[182935]: 2026-01-21 23:42:14.873 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:42:15 compute-0 podman[211271]: 2026-01-21 23:42:15.702846585 +0000 UTC m=+0.069867427 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:42:16 compute-0 nova_compute[182935]: 2026-01-21 23:42:16.704 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:16 compute-0 nova_compute[182935]: 2026-01-21 23:42:16.705 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:16 compute-0 nova_compute[182935]: 2026-01-21 23:42:16.705 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:42:16 compute-0 nova_compute[182935]: 2026-01-21 23:42:16.705 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:42:17 compute-0 nova_compute[182935]: 2026-01-21 23:42:17.007 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:42:17 compute-0 nova_compute[182935]: 2026-01-21 23:42:17.007 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:17 compute-0 nova_compute[182935]: 2026-01-21 23:42:17.007 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:17 compute-0 nova_compute[182935]: 2026-01-21 23:42:17.008 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:17 compute-0 nova_compute[182935]: 2026-01-21 23:42:17.008 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:17 compute-0 nova_compute[182935]: 2026-01-21 23:42:17.009 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:42:17 compute-0 nova_compute[182935]: 2026-01-21 23:42:17.009 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:42:18 compute-0 podman[211297]: 2026-01-21 23:42:18.680526117 +0000 UTC m=+0.054566682 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 21 23:42:21 compute-0 sshd-session[211316]: Invalid user nagios from 188.166.69.60 port 33884
Jan 21 23:42:21 compute-0 sshd-session[211316]: Connection closed by invalid user nagios 188.166.69.60 port 33884 [preauth]
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:42:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:42:27 compute-0 podman[211318]: 2026-01-21 23:42:27.717870352 +0000 UTC m=+0.081481944 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:42:29 compute-0 podman[211338]: 2026-01-21 23:42:29.736286057 +0000 UTC m=+0.103705507 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Jan 21 23:42:37 compute-0 podman[211360]: 2026-01-21 23:42:37.689131872 +0000 UTC m=+0.058701800 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:42:37 compute-0 podman[211359]: 2026-01-21 23:42:37.717180861 +0000 UTC m=+0.091286065 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251202)
Jan 21 23:42:46 compute-0 podman[211408]: 2026-01-21 23:42:46.704762583 +0000 UTC m=+0.076430456 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:42:49 compute-0 podman[211431]: 2026-01-21 23:42:49.775947507 +0000 UTC m=+0.151983661 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:42:58 compute-0 podman[211451]: 2026-01-21 23:42:58.695292437 +0000 UTC m=+0.058374532 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 21 23:43:00 compute-0 podman[211471]: 2026-01-21 23:43:00.692087582 +0000 UTC m=+0.066499222 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal)
Jan 21 23:43:02 compute-0 sshd-session[211493]: Invalid user nagios from 188.166.69.60 port 43222
Jan 21 23:43:02 compute-0 sshd-session[211493]: Connection closed by invalid user nagios 188.166.69.60 port 43222 [preauth]
Jan 21 23:43:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:43:03.176 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:43:03.176 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:43:03.177 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:09 compute-0 podman[211496]: 2026-01-21 23:43:09.597409813 +0000 UTC m=+0.056628951 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:43:09 compute-0 podman[211495]: 2026-01-21 23:43:09.650817898 +0000 UTC m=+0.119319684 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 23:43:14 compute-0 nova_compute[182935]: 2026-01-21 23:43:14.092 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:14 compute-0 nova_compute[182935]: 2026-01-21 23:43:14.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:14 compute-0 nova_compute[182935]: 2026-01-21 23:43:14.839 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:14 compute-0 nova_compute[182935]: 2026-01-21 23:43:14.840 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:14 compute-0 nova_compute[182935]: 2026-01-21 23:43:14.840 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:14 compute-0 nova_compute[182935]: 2026-01-21 23:43:14.841 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:43:15 compute-0 nova_compute[182935]: 2026-01-21 23:43:15.046 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:43:15 compute-0 nova_compute[182935]: 2026-01-21 23:43:15.048 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6083MB free_disk=73.41744995117188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:43:15 compute-0 nova_compute[182935]: 2026-01-21 23:43:15.048 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:15 compute-0 nova_compute[182935]: 2026-01-21 23:43:15.048 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:15 compute-0 nova_compute[182935]: 2026-01-21 23:43:15.160 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:43:15 compute-0 nova_compute[182935]: 2026-01-21 23:43:15.160 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:43:15 compute-0 nova_compute[182935]: 2026-01-21 23:43:15.190 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:43:15 compute-0 nova_compute[182935]: 2026-01-21 23:43:15.215 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:43:15 compute-0 nova_compute[182935]: 2026-01-21 23:43:15.218 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:43:15 compute-0 nova_compute[182935]: 2026-01-21 23:43:15.218 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:16 compute-0 nova_compute[182935]: 2026-01-21 23:43:16.219 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:16 compute-0 nova_compute[182935]: 2026-01-21 23:43:16.220 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:16 compute-0 nova_compute[182935]: 2026-01-21 23:43:16.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:16 compute-0 nova_compute[182935]: 2026-01-21 23:43:16.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:43:16 compute-0 nova_compute[182935]: 2026-01-21 23:43:16.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:43:17 compute-0 podman[211544]: 2026-01-21 23:43:17.705713411 +0000 UTC m=+0.073520697 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:43:19 compute-0 nova_compute[182935]: 2026-01-21 23:43:19.109 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:43:19 compute-0 nova_compute[182935]: 2026-01-21 23:43:19.110 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:19 compute-0 nova_compute[182935]: 2026-01-21 23:43:19.110 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:19 compute-0 nova_compute[182935]: 2026-01-21 23:43:19.111 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:19 compute-0 nova_compute[182935]: 2026-01-21 23:43:19.111 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:19 compute-0 nova_compute[182935]: 2026-01-21 23:43:19.111 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:43:20 compute-0 nova_compute[182935]: 2026-01-21 23:43:20.105 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:43:20 compute-0 podman[211570]: 2026-01-21 23:43:20.723962022 +0000 UTC m=+0.094409288 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:43:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:43:26.874 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:43:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:43:26.875 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:43:27 compute-0 nova_compute[182935]: 2026-01-21 23:43:27.669 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "079c4a41-1146-4c56-a278-70fbef0949eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:27 compute-0 nova_compute[182935]: 2026-01-21 23:43:27.670 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "079c4a41-1146-4c56-a278-70fbef0949eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:27 compute-0 nova_compute[182935]: 2026-01-21 23:43:27.697 182939 DEBUG nova.compute.manager [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:43:27 compute-0 nova_compute[182935]: 2026-01-21 23:43:27.918 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:27 compute-0 nova_compute[182935]: 2026-01-21 23:43:27.919 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:27 compute-0 nova_compute[182935]: 2026-01-21 23:43:27.925 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:43:27 compute-0 nova_compute[182935]: 2026-01-21 23:43:27.925 182939 INFO nova.compute.claims [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.060 182939 DEBUG nova.compute.provider_tree [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.094 182939 DEBUG nova.scheduler.client.report [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.131 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.132 182939 DEBUG nova.compute.manager [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.211 182939 DEBUG nova.compute.manager [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.239 182939 INFO nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.264 182939 DEBUG nova.compute.manager [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.388 182939 DEBUG nova.compute.manager [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.389 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.390 182939 INFO nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Creating image(s)
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.390 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "/var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.391 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "/var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.392 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "/var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.392 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:28 compute-0 nova_compute[182935]: 2026-01-21 23:43:28.392 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:29 compute-0 podman[211590]: 2026-01-21 23:43:29.730299415 +0000 UTC m=+0.091146022 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 23:43:30 compute-0 nova_compute[182935]: 2026-01-21 23:43:30.974 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:31 compute-0 nova_compute[182935]: 2026-01-21 23:43:31.062 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.part --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:31 compute-0 nova_compute[182935]: 2026-01-21 23:43:31.065 182939 DEBUG nova.virt.images [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] 9cd98f02-a505-4543-a7ad-04e9a377b456 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 21 23:43:31 compute-0 nova_compute[182935]: 2026-01-21 23:43:31.066 182939 DEBUG nova.privsep.utils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:43:31 compute-0 nova_compute[182935]: 2026-01-21 23:43:31.067 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.part /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:31 compute-0 nova_compute[182935]: 2026-01-21 23:43:31.241 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.part /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.converted" returned: 0 in 0.174s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:31 compute-0 nova_compute[182935]: 2026-01-21 23:43:31.246 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:31 compute-0 nova_compute[182935]: 2026-01-21 23:43:31.329 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474.converted --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:31 compute-0 nova_compute[182935]: 2026-01-21 23:43:31.332 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:31 compute-0 nova_compute[182935]: 2026-01-21 23:43:31.361 182939 INFO oslo.privsep.daemon [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpy_s80mye/privsep.sock']
Jan 21 23:43:31 compute-0 podman[211626]: 2026-01-21 23:43:31.68490564 +0000 UTC m=+0.058395803 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.197 182939 INFO oslo.privsep.daemon [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Spawned new privsep daemon via rootwrap
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.014 211648 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.019 211648 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.021 211648 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.021 211648 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211648
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.281 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.362 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.364 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.365 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.389 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.455 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.457 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.514 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.516 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.517 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.581 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.583 182939 DEBUG nova.virt.disk.api [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Checking if we can resize image /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.584 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.643 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.645 182939 DEBUG nova.virt.disk.api [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Cannot resize image /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.646 182939 DEBUG nova.objects.instance [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lazy-loading 'migration_context' on Instance uuid 079c4a41-1146-4c56-a278-70fbef0949eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.719 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.720 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Ensure instance console log exists: /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.720 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.721 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.721 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.723 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.729 182939 WARNING nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.733 182939 DEBUG nova.virt.libvirt.host [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.733 182939 DEBUG nova.virt.libvirt.host [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.737 182939 DEBUG nova.virt.libvirt.host [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.737 182939 DEBUG nova.virt.libvirt.host [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.739 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.739 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.740 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.740 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.740 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.740 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.740 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.741 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.741 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.741 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.741 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.741 182939 DEBUG nova.virt.hardware [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.745 182939 DEBUG nova.privsep.utils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.746 182939 DEBUG nova.objects.instance [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lazy-loading 'pci_devices' on Instance uuid 079c4a41-1146-4c56-a278-70fbef0949eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.802 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:43:32 compute-0 nova_compute[182935]:   <uuid>079c4a41-1146-4c56-a278-70fbef0949eb</uuid>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   <name>instance-00000001</name>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <nova:name>tempest-AutoAllocateNetworkTest-server-178392874</nova:name>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:43:32</nova:creationTime>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:43:32 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:43:32 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:43:32 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:43:32 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:43:32 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:43:32 compute-0 nova_compute[182935]:         <nova:user uuid="f92dd0c2072346c6b7e7588673443ff2">tempest-AutoAllocateNetworkTest-1853609216-project-member</nova:user>
Jan 21 23:43:32 compute-0 nova_compute[182935]:         <nova:project uuid="8981554bfb65485a9218dab7f347822d">tempest-AutoAllocateNetworkTest-1853609216</nova:project>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <system>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <entry name="serial">079c4a41-1146-4c56-a278-70fbef0949eb</entry>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <entry name="uuid">079c4a41-1146-4c56-a278-70fbef0949eb</entry>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     </system>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   <os>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   </os>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   <features>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   </features>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk.config"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/console.log" append="off"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <video>
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     </video>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:43:32 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:43:32 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:43:32 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:43:32 compute-0 nova_compute[182935]: </domain>
Jan 21 23:43:32 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.861 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.862 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:43:32 compute-0 nova_compute[182935]: 2026-01-21 23:43:32.862 182939 INFO nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Using config drive
Jan 21 23:43:33 compute-0 nova_compute[182935]: 2026-01-21 23:43:33.510 182939 INFO nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Creating config drive at /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk.config
Jan 21 23:43:33 compute-0 nova_compute[182935]: 2026-01-21 23:43:33.515 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfvg7n574 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:33 compute-0 nova_compute[182935]: 2026-01-21 23:43:33.640 182939 DEBUG oslo_concurrency.processutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfvg7n574" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:33 compute-0 systemd-machined[154182]: New machine qemu-1-instance-00000001.
Jan 21 23:43:33 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.380 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039014.3787172, 079c4a41-1146-4c56-a278-70fbef0949eb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.382 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] VM Resumed (Lifecycle Event)
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.385 182939 DEBUG nova.compute.manager [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.386 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.390 182939 INFO nova.virt.libvirt.driver [-] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Instance spawned successfully.
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.391 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.433 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.439 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.442 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.442 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.442 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.443 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.443 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.444 182939 DEBUG nova.virt.libvirt.driver [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.466 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.467 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039014.3791943, 079c4a41-1146-4c56-a278-70fbef0949eb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.467 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] VM Started (Lifecycle Event)
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.505 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.509 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.547 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.617 182939 INFO nova.compute.manager [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Took 6.23 seconds to spawn the instance on the hypervisor.
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.619 182939 DEBUG nova.compute.manager [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.728 182939 INFO nova.compute.manager [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Took 6.91 seconds to build instance.
Jan 21 23:43:34 compute-0 nova_compute[182935]: 2026-01-21 23:43:34.758 182939 DEBUG oslo_concurrency.lockutils [None req-9669d22c-cc1f-4276-9cd4-b0bc11dd29d2 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "079c4a41-1146-4c56-a278-70fbef0949eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:43:36.876 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:43:39 compute-0 nova_compute[182935]: 2026-01-21 23:43:39.969 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "a56178c2-b7df-492f-816a-580ea1a80c21" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:39 compute-0 nova_compute[182935]: 2026-01-21 23:43:39.969 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:39 compute-0 nova_compute[182935]: 2026-01-21 23:43:39.997 182939 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.124 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.125 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.133 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.134 182939 INFO nova.compute.claims [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.376 182939 DEBUG nova.compute.provider_tree [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.438 182939 ERROR nova.scheduler.client.report [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [req-f8717ae3-17fe-424c-a479-96eb206ef7ad] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 5f09a77c-505f-4bd3-ac26-41f43ebdf535.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-f8717ae3-17fe-424c-a479-96eb206ef7ad"}]}
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.484 182939 DEBUG nova.scheduler.client.report [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.512 182939 DEBUG nova.scheduler.client.report [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.513 182939 DEBUG nova.compute.provider_tree [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.532 182939 DEBUG nova.scheduler.client.report [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.564 182939 DEBUG nova.scheduler.client.report [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.662 182939 DEBUG nova.compute.provider_tree [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:43:40 compute-0 podman[211693]: 2026-01-21 23:43:40.762218221 +0000 UTC m=+0.120328122 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.779 182939 DEBUG nova.scheduler.client.report [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Updated inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with generation 8 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.780 182939 DEBUG nova.compute.provider_tree [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Updating resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 generation from 8 to 9 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.780 182939 DEBUG nova.compute.provider_tree [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:43:40 compute-0 podman[211692]: 2026-01-21 23:43:40.792040262 +0000 UTC m=+0.161567355 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.870 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:40 compute-0 nova_compute[182935]: 2026-01-21 23:43:40.871 182939 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.010 182939 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.012 182939 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.076 182939 INFO nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.132 182939 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.285 182939 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.288 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.289 182939 INFO nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Creating image(s)
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.290 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "/var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.291 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "/var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.292 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "/var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.327 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.409 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.410 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.411 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.422 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.477 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.478 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.512 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.513 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.514 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.567 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.569 182939 DEBUG nova.virt.disk.api [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Checking if we can resize image /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.569 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.625 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.626 182939 DEBUG nova.virt.disk.api [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Cannot resize image /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.626 182939 DEBUG nova.objects.instance [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lazy-loading 'migration_context' on Instance uuid a56178c2-b7df-492f-816a-580ea1a80c21 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.641 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.641 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Ensure instance console log exists: /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.642 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.642 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.642 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:43:41 compute-0 nova_compute[182935]: 2026-01-21 23:43:41.829 182939 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Automatically allocating a network for project 8981554bfb65485a9218dab7f347822d. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Jan 21 23:43:45 compute-0 sshd-session[211757]: Invalid user nagios from 188.166.69.60 port 35618
Jan 21 23:43:45 compute-0 sshd-session[211757]: Connection closed by invalid user nagios 188.166.69.60 port 35618 [preauth]
Jan 21 23:43:48 compute-0 podman[211781]: 2026-01-21 23:43:48.768648564 +0000 UTC m=+0.081869834 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:43:51 compute-0 podman[211805]: 2026-01-21 23:43:51.687401109 +0000 UTC m=+0.059328537 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 23:43:55 compute-0 nova_compute[182935]: 2026-01-21 23:43:55.244 182939 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Automatically allocated network: {'id': '48de92c9-2a56-4dfe-a16e-fe0d52617564', 'name': 'auto_allocated_network', 'tenant_id': '8981554bfb65485a9218dab7f347822d', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['91f54b37-b5bc-463a-931b-a34707078f9d', 'd15c7507-6da8-4e7a-bb36-7b411b2b575a'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2026-01-21T23:43:42Z', 'updated_at': '2026-01-21T23:43:54Z', 'revision_number': 4, 'project_id': '8981554bfb65485a9218dab7f347822d'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Jan 21 23:43:55 compute-0 nova_compute[182935]: 2026-01-21 23:43:55.261 182939 WARNING oslo_policy.policy [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 21 23:43:55 compute-0 nova_compute[182935]: 2026-01-21 23:43:55.261 182939 WARNING oslo_policy.policy [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 21 23:43:55 compute-0 nova_compute[182935]: 2026-01-21 23:43:55.265 182939 DEBUG nova.policy [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:43:57 compute-0 nova_compute[182935]: 2026-01-21 23:43:57.891 182939 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Successfully created port: 4bf69269-42ff-414d-a6b7-9b7b63abe9ad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:44:00 compute-0 podman[211824]: 2026-01-21 23:44:00.688716579 +0000 UTC m=+0.065237678 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:44:01 compute-0 nova_compute[182935]: 2026-01-21 23:44:01.046 182939 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Successfully updated port: 4bf69269-42ff-414d-a6b7-9b7b63abe9ad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:44:01 compute-0 nova_compute[182935]: 2026-01-21 23:44:01.071 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "refresh_cache-a56178c2-b7df-492f-816a-580ea1a80c21" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:44:01 compute-0 nova_compute[182935]: 2026-01-21 23:44:01.071 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquired lock "refresh_cache-a56178c2-b7df-492f-816a-580ea1a80c21" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:44:01 compute-0 nova_compute[182935]: 2026-01-21 23:44:01.071 182939 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:44:01 compute-0 nova_compute[182935]: 2026-01-21 23:44:01.470 182939 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:44:01 compute-0 nova_compute[182935]: 2026-01-21 23:44:01.576 182939 DEBUG nova.compute.manager [req-d15dae7e-457c-4069-b5ee-bfa6b8cf8816 req-a886bc22-4d3c-4996-9249-be801b493a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Received event network-changed-4bf69269-42ff-414d-a6b7-9b7b63abe9ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:01 compute-0 nova_compute[182935]: 2026-01-21 23:44:01.576 182939 DEBUG nova.compute.manager [req-d15dae7e-457c-4069-b5ee-bfa6b8cf8816 req-a886bc22-4d3c-4996-9249-be801b493a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Refreshing instance network info cache due to event network-changed-4bf69269-42ff-414d-a6b7-9b7b63abe9ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:44:01 compute-0 nova_compute[182935]: 2026-01-21 23:44:01.577 182939 DEBUG oslo_concurrency.lockutils [req-d15dae7e-457c-4069-b5ee-bfa6b8cf8816 req-a886bc22-4d3c-4996-9249-be801b493a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-a56178c2-b7df-492f-816a-580ea1a80c21" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:44:02 compute-0 podman[211845]: 2026-01-21 23:44:02.732867807 +0000 UTC m=+0.065501393 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.expose-services=, release=1755695350, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Jan 21 23:44:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:03.177 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:03.177 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:03.178 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.465 182939 DEBUG nova.network.neutron [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Updating instance_info_cache with network_info: [{"id": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "address": "fa:16:3e:c9:b1:78", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::18e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bf69269-42", "ovs_interfaceid": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.529 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Releasing lock "refresh_cache-a56178c2-b7df-492f-816a-580ea1a80c21" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.529 182939 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Instance network_info: |[{"id": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "address": "fa:16:3e:c9:b1:78", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::18e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bf69269-42", "ovs_interfaceid": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.530 182939 DEBUG oslo_concurrency.lockutils [req-d15dae7e-457c-4069-b5ee-bfa6b8cf8816 req-a886bc22-4d3c-4996-9249-be801b493a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-a56178c2-b7df-492f-816a-580ea1a80c21" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.530 182939 DEBUG nova.network.neutron [req-d15dae7e-457c-4069-b5ee-bfa6b8cf8816 req-a886bc22-4d3c-4996-9249-be801b493a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Refreshing network info cache for port 4bf69269-42ff-414d-a6b7-9b7b63abe9ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.532 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Start _get_guest_xml network_info=[{"id": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "address": "fa:16:3e:c9:b1:78", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::18e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bf69269-42", "ovs_interfaceid": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.539 182939 WARNING nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.547 182939 DEBUG nova.virt.libvirt.host [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.548 182939 DEBUG nova.virt.libvirt.host [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.557 182939 DEBUG nova.virt.libvirt.host [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.558 182939 DEBUG nova.virt.libvirt.host [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.560 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.561 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.561 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.562 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.562 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.562 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.563 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.563 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.563 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.563 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.564 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.564 182939 DEBUG nova.virt.hardware [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.570 182939 DEBUG nova.virt.libvirt.vif [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:43:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1377062952-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1377062952-3',id=4,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8981554bfb65485a9218dab7f347822d',ramdisk_id='',reservation_id='r-ourrdd3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1853609216',owner_user_name='tempest-AutoAllocateNetworkTest-1853609216-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:43:41Z,user_data=None,user_id='f92dd0c2072346c6b7e7588673443ff2',uuid=a56178c2-b7df-492f-816a-580ea1a80c21,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "address": "fa:16:3e:c9:b1:78", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::18e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bf69269-42", "ovs_interfaceid": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.570 182939 DEBUG nova.network.os_vif_util [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converting VIF {"id": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "address": "fa:16:3e:c9:b1:78", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::18e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bf69269-42", "ovs_interfaceid": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.571 182939 DEBUG nova.network.os_vif_util [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:b1:78,bridge_name='br-int',has_traffic_filtering=True,id=4bf69269-42ff-414d-a6b7-9b7b63abe9ad,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bf69269-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.573 182939 DEBUG nova.objects.instance [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lazy-loading 'pci_devices' on Instance uuid a56178c2-b7df-492f-816a-580ea1a80c21 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.590 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:44:03 compute-0 nova_compute[182935]:   <uuid>a56178c2-b7df-492f-816a-580ea1a80c21</uuid>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   <name>instance-00000004</name>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <nova:name>tempest-tempest.common.compute-instance-1377062952-3</nova:name>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:44:03</nova:creationTime>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:44:03 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:44:03 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:44:03 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:44:03 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:44:03 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:44:03 compute-0 nova_compute[182935]:         <nova:user uuid="f92dd0c2072346c6b7e7588673443ff2">tempest-AutoAllocateNetworkTest-1853609216-project-member</nova:user>
Jan 21 23:44:03 compute-0 nova_compute[182935]:         <nova:project uuid="8981554bfb65485a9218dab7f347822d">tempest-AutoAllocateNetworkTest-1853609216</nova:project>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:44:03 compute-0 nova_compute[182935]:         <nova:port uuid="4bf69269-42ff-414d-a6b7-9b7b63abe9ad">
Jan 21 23:44:03 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.1.0.43" ipVersion="4"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="fdfe:381f:8400::18e" ipVersion="6"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <system>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <entry name="serial">a56178c2-b7df-492f-816a-580ea1a80c21</entry>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <entry name="uuid">a56178c2-b7df-492f-816a-580ea1a80c21</entry>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     </system>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   <os>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   </os>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   <features>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   </features>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk.config"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:c9:b1:78"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <target dev="tap4bf69269-42"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/console.log" append="off"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <video>
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     </video>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:44:03 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:44:03 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:44:03 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:44:03 compute-0 nova_compute[182935]: </domain>
Jan 21 23:44:03 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.591 182939 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Preparing to wait for external event network-vif-plugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.591 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.592 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.592 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.593 182939 DEBUG nova.virt.libvirt.vif [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:43:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1377062952-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1377062952-3',id=4,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8981554bfb65485a9218dab7f347822d',ramdisk_id='',reservation_id='r-ourrdd3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-1853609216',owner_user_name='tempest-AutoAllocateNetworkTest-1853609216-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:43:41Z,user_data=None,user_id='f92dd0c2072346c6b7e7588673443ff2',uuid=a56178c2-b7df-492f-816a-580ea1a80c21,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "address": "fa:16:3e:c9:b1:78", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::18e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bf69269-42", "ovs_interfaceid": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.594 182939 DEBUG nova.network.os_vif_util [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converting VIF {"id": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "address": "fa:16:3e:c9:b1:78", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::18e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bf69269-42", "ovs_interfaceid": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.595 182939 DEBUG nova.network.os_vif_util [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:b1:78,bridge_name='br-int',has_traffic_filtering=True,id=4bf69269-42ff-414d-a6b7-9b7b63abe9ad,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bf69269-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.595 182939 DEBUG os_vif [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:b1:78,bridge_name='br-int',has_traffic_filtering=True,id=4bf69269-42ff-414d-a6b7-9b7b63abe9ad,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bf69269-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.661 182939 DEBUG ovsdbapp.backend.ovs_idl [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.662 182939 DEBUG ovsdbapp.backend.ovs_idl [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.662 182939 DEBUG ovsdbapp.backend.ovs_idl [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.662 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.663 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [POLLOUT] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.663 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.664 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.665 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.667 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.677 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.677 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.677 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:44:03 compute-0 nova_compute[182935]: 2026-01-21 23:44:03.679 182939 INFO oslo.privsep.daemon [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpmj48x3k_/privsep.sock']
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.371 182939 INFO oslo.privsep.daemon [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Spawned new privsep daemon via rootwrap
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.234 211869 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.239 211869 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.242 211869 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.242 211869 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211869
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.530 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.677 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.678 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bf69269-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.679 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4bf69269-42, col_values=(('external_ids', {'iface-id': '4bf69269-42ff-414d-a6b7-9b7b63abe9ad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:b1:78', 'vm-uuid': 'a56178c2-b7df-492f-816a-580ea1a80c21'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.681 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:04 compute-0 NetworkManager[55139]: <info>  [1769039044.6826] manager: (tap4bf69269-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.685 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.733 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:04 compute-0 nova_compute[182935]: 2026-01-21 23:44:04.735 182939 INFO os_vif [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:b1:78,bridge_name='br-int',has_traffic_filtering=True,id=4bf69269-42ff-414d-a6b7-9b7b63abe9ad,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bf69269-42')
Jan 21 23:44:05 compute-0 nova_compute[182935]: 2026-01-21 23:44:05.566 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:44:05 compute-0 nova_compute[182935]: 2026-01-21 23:44:05.567 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:44:05 compute-0 nova_compute[182935]: 2026-01-21 23:44:05.567 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] No VIF found with MAC fa:16:3e:c9:b1:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:44:05 compute-0 nova_compute[182935]: 2026-01-21 23:44:05.568 182939 INFO nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Using config drive
Jan 21 23:44:07 compute-0 nova_compute[182935]: 2026-01-21 23:44:07.063 182939 INFO nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Creating config drive at /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk.config
Jan 21 23:44:07 compute-0 nova_compute[182935]: 2026-01-21 23:44:07.068 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpflp39qnv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:07 compute-0 nova_compute[182935]: 2026-01-21 23:44:07.097 182939 DEBUG nova.network.neutron [req-d15dae7e-457c-4069-b5ee-bfa6b8cf8816 req-a886bc22-4d3c-4996-9249-be801b493a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Updated VIF entry in instance network info cache for port 4bf69269-42ff-414d-a6b7-9b7b63abe9ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:44:07 compute-0 nova_compute[182935]: 2026-01-21 23:44:07.098 182939 DEBUG nova.network.neutron [req-d15dae7e-457c-4069-b5ee-bfa6b8cf8816 req-a886bc22-4d3c-4996-9249-be801b493a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Updating instance_info_cache with network_info: [{"id": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "address": "fa:16:3e:c9:b1:78", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::18e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bf69269-42", "ovs_interfaceid": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:07 compute-0 nova_compute[182935]: 2026-01-21 23:44:07.116 182939 DEBUG oslo_concurrency.lockutils [req-d15dae7e-457c-4069-b5ee-bfa6b8cf8816 req-a886bc22-4d3c-4996-9249-be801b493a7c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-a56178c2-b7df-492f-816a-580ea1a80c21" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:44:07 compute-0 nova_compute[182935]: 2026-01-21 23:44:07.209 182939 DEBUG oslo_concurrency.processutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpflp39qnv" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:07 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 21 23:44:07 compute-0 kernel: tap4bf69269-42: entered promiscuous mode
Jan 21 23:44:07 compute-0 NetworkManager[55139]: <info>  [1769039047.3079] manager: (tap4bf69269-42): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Jan 21 23:44:07 compute-0 nova_compute[182935]: 2026-01-21 23:44:07.309 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:07 compute-0 ovn_controller[95047]: 2026-01-21T23:44:07Z|00027|binding|INFO|Claiming lport 4bf69269-42ff-414d-a6b7-9b7b63abe9ad for this chassis.
Jan 21 23:44:07 compute-0 ovn_controller[95047]: 2026-01-21T23:44:07Z|00028|binding|INFO|4bf69269-42ff-414d-a6b7-9b7b63abe9ad: Claiming fa:16:3e:c9:b1:78 10.1.0.43 fdfe:381f:8400::18e
Jan 21 23:44:07 compute-0 nova_compute[182935]: 2026-01-21 23:44:07.316 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:07.339 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:b1:78 10.1.0.43 fdfe:381f:8400::18e'], port_security=['fa:16:3e:c9:b1:78 10.1.0.43 fdfe:381f:8400::18e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.43/26 fdfe:381f:8400::18e/64', 'neutron:device_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8981554bfb65485a9218dab7f347822d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e4bb4842-7cc7-47df-ad92-e426d20758f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=343b670f-2d8d-4f56-9cb9-7d9682347428, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=4bf69269-42ff-414d-a6b7-9b7b63abe9ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:44:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:07.344 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 4bf69269-42ff-414d-a6b7-9b7b63abe9ad in datapath 48de92c9-2a56-4dfe-a16e-fe0d52617564 bound to our chassis
Jan 21 23:44:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:07.349 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48de92c9-2a56-4dfe-a16e-fe0d52617564
Jan 21 23:44:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:07.352 104408 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpldmbzkd1/privsep.sock']
Jan 21 23:44:07 compute-0 systemd-udevd[211897]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:44:07 compute-0 NetworkManager[55139]: <info>  [1769039047.3916] device (tap4bf69269-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:44:07 compute-0 NetworkManager[55139]: <info>  [1769039047.3928] device (tap4bf69269-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:44:07 compute-0 systemd-machined[154182]: New machine qemu-2-instance-00000004.
Jan 21 23:44:07 compute-0 nova_compute[182935]: 2026-01-21 23:44:07.430 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:07 compute-0 ovn_controller[95047]: 2026-01-21T23:44:07Z|00029|binding|INFO|Setting lport 4bf69269-42ff-414d-a6b7-9b7b63abe9ad ovn-installed in OVS
Jan 21 23:44:07 compute-0 ovn_controller[95047]: 2026-01-21T23:44:07Z|00030|binding|INFO|Setting lport 4bf69269-42ff-414d-a6b7-9b7b63abe9ad up in Southbound
Jan 21 23:44:07 compute-0 nova_compute[182935]: 2026-01-21 23:44:07.437 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:07 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Jan 21 23:44:08 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:08.819 104408 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 21 23:44:08 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:08.819 104408 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpldmbzkd1/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 21 23:44:08 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:07.957 211917 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 23:44:08 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:08.021 211917 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 23:44:08 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:08.024 211917 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 21 23:44:08 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:08.025 211917 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211917
Jan 21 23:44:08 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:08.822 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae64651-276d-40c6-89e0-45e10278d60b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:09 compute-0 nova_compute[182935]: 2026-01-21 23:44:09.465 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039049.4646893, a56178c2-b7df-492f-816a-580ea1a80c21 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:09 compute-0 nova_compute[182935]: 2026-01-21 23:44:09.466 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] VM Started (Lifecycle Event)
Jan 21 23:44:09 compute-0 nova_compute[182935]: 2026-01-21 23:44:09.499 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:09 compute-0 nova_compute[182935]: 2026-01-21 23:44:09.505 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039049.4650924, a56178c2-b7df-492f-816a-580ea1a80c21 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:09 compute-0 nova_compute[182935]: 2026-01-21 23:44:09.506 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] VM Paused (Lifecycle Event)
Jan 21 23:44:09 compute-0 nova_compute[182935]: 2026-01-21 23:44:09.529 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:09 compute-0 nova_compute[182935]: 2026-01-21 23:44:09.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:09 compute-0 nova_compute[182935]: 2026-01-21 23:44:09.541 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:44:09 compute-0 nova_compute[182935]: 2026-01-21 23:44:09.578 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:44:09 compute-0 nova_compute[182935]: 2026-01-21 23:44:09.681 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:09 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:09.783 211917 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:09 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:09.783 211917 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:09 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:09.783 211917 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:10.423 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e78ad942-7963-418b-b01c-234b6ce77517]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:10.426 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48de92c9-21 in ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:44:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:10.428 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48de92c9-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:44:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:10.428 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[835e618d-b2aa-4f72-b457-cdb21066d23f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:10.432 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[97f6ac8e-9091-4c67-9079-8fb495a847d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:10.464 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[3231c7d5-76a4-42ee-a054-577ffd730c05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:10.497 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[17231479-5720-41db-84c4-dc010f690195]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:10.500 104408 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpionngf6a/privsep.sock']
Jan 21 23:44:11 compute-0 nova_compute[182935]: 2026-01-21 23:44:11.882 182939 DEBUG nova.compute.manager [req-242b2a6e-bdae-490d-98a4-c262af65b4f8 req-f1a30f6b-314f-4c07-ae0d-ca20a5ddbd75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Received event network-vif-plugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:11 compute-0 nova_compute[182935]: 2026-01-21 23:44:11.882 182939 DEBUG oslo_concurrency.lockutils [req-242b2a6e-bdae-490d-98a4-c262af65b4f8 req-f1a30f6b-314f-4c07-ae0d-ca20a5ddbd75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:11 compute-0 nova_compute[182935]: 2026-01-21 23:44:11.883 182939 DEBUG oslo_concurrency.lockutils [req-242b2a6e-bdae-490d-98a4-c262af65b4f8 req-f1a30f6b-314f-4c07-ae0d-ca20a5ddbd75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:11 compute-0 nova_compute[182935]: 2026-01-21 23:44:11.883 182939 DEBUG oslo_concurrency.lockutils [req-242b2a6e-bdae-490d-98a4-c262af65b4f8 req-f1a30f6b-314f-4c07-ae0d-ca20a5ddbd75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:11 compute-0 nova_compute[182935]: 2026-01-21 23:44:11.883 182939 DEBUG nova.compute.manager [req-242b2a6e-bdae-490d-98a4-c262af65b4f8 req-f1a30f6b-314f-4c07-ae0d-ca20a5ddbd75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Processing event network-vif-plugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:44:11 compute-0 nova_compute[182935]: 2026-01-21 23:44:11.884 182939 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:44:11 compute-0 nova_compute[182935]: 2026-01-21 23:44:11.927 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039051.9272234, a56178c2-b7df-492f-816a-580ea1a80c21 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:11 compute-0 nova_compute[182935]: 2026-01-21 23:44:11.928 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] VM Resumed (Lifecycle Event)
Jan 21 23:44:11 compute-0 nova_compute[182935]: 2026-01-21 23:44:11.930 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.033 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.086 182939 INFO nova.virt.libvirt.driver [-] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Instance spawned successfully.
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.087 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.096 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.120 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.123 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.123 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.124 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.124 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.124 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.125 182939 DEBUG nova.virt.libvirt.driver [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:44:12 compute-0 podman[211940]: 2026-01-21 23:44:12.134090748 +0000 UTC m=+0.209213172 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:44:12 compute-0 podman[211939]: 2026-01-21 23:44:12.175469246 +0000 UTC m=+0.089170969 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.207 182939 INFO nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Took 30.92 seconds to spawn the instance on the hypervisor.
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.207 182939 DEBUG nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:12 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:12.222 104408 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 21 23:44:12 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:12.223 104408 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpionngf6a/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 21 23:44:12 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:11.562 211938 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 21 23:44:12 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:11.799 211938 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 21 23:44:12 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:11.801 211938 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 21 23:44:12 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:11.801 211938 INFO oslo.privsep.daemon [-] privsep daemon running as pid 211938
Jan 21 23:44:12 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:12.225 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d5538636-4e7d-42f5-ac80-cf6e82bd4bca]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.298 182939 INFO nova.compute.manager [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Took 32.21 seconds to build instance.
Jan 21 23:44:12 compute-0 nova_compute[182935]: 2026-01-21 23:44:12.329 182939 DEBUG oslo_concurrency.lockutils [None req-6a50c92f-4b45-45d8-a1f7-9bf4d114bfc5 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 32.360s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:12 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:12.783 211938 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:12 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:12.784 211938 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:12 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:12.784 211938 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:14 compute-0 nova_compute[182935]: 2026-01-21 23:44:14.383 182939 DEBUG nova.compute.manager [req-9ae3b464-9d0e-46cc-a942-79dc585b237c req-4f890875-23ce-4c50-8ac4-23b9830804ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Received event network-vif-plugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:14 compute-0 nova_compute[182935]: 2026-01-21 23:44:14.384 182939 DEBUG oslo_concurrency.lockutils [req-9ae3b464-9d0e-46cc-a942-79dc585b237c req-4f890875-23ce-4c50-8ac4-23b9830804ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:14 compute-0 nova_compute[182935]: 2026-01-21 23:44:14.384 182939 DEBUG oslo_concurrency.lockutils [req-9ae3b464-9d0e-46cc-a942-79dc585b237c req-4f890875-23ce-4c50-8ac4-23b9830804ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:14 compute-0 nova_compute[182935]: 2026-01-21 23:44:14.384 182939 DEBUG oslo_concurrency.lockutils [req-9ae3b464-9d0e-46cc-a942-79dc585b237c req-4f890875-23ce-4c50-8ac4-23b9830804ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:14 compute-0 nova_compute[182935]: 2026-01-21 23:44:14.384 182939 DEBUG nova.compute.manager [req-9ae3b464-9d0e-46cc-a942-79dc585b237c req-4f890875-23ce-4c50-8ac4-23b9830804ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] No waiting events found dispatching network-vif-plugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:44:14 compute-0 nova_compute[182935]: 2026-01-21 23:44:14.385 182939 WARNING nova.compute.manager [req-9ae3b464-9d0e-46cc-a942-79dc585b237c req-4f890875-23ce-4c50-8ac4-23b9830804ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Received unexpected event network-vif-plugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad for instance with vm_state active and task_state None.
Jan 21 23:44:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:14.467 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[911b32b9-3b3d-440f-9818-5f34d970c082]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-0 NetworkManager[55139]: <info>  [1769039054.5166] manager: (tap48de92c9-20): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Jan 21 23:44:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:14.519 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c8dbf2f3-ccd1-4e3f-a6bd-2b647360b090]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-0 nova_compute[182935]: 2026-01-21 23:44:14.532 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:14 compute-0 systemd-udevd[211995]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:44:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:14.544 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9ecc5f17-668c-4750-bbbe-de67c1a74abc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:14.547 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[4a7c7dbe-1fbf-4873-bf0f-704ef51dc596]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-0 NetworkManager[55139]: <info>  [1769039054.5685] device (tap48de92c9-20): carrier: link connected
Jan 21 23:44:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:14.574 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[2c92757d-921a-4bba-87dd-4c87a1fa0cce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-0 nova_compute[182935]: 2026-01-21 23:44:14.682 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:14.695 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2851648f-b4a3-4313-afc3-4c2fc7c4624b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48de92c9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:15:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355730, 'reachable_time': 25033, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211999, 'error': None, 'target': 'ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:14.720 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1f2e9b99-7ced-4abc-97c1-f36544c5c0e5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:155d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 355730, 'tstamp': 355730}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212006, 'error': None, 'target': 'ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:14.738 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[df0bc9d2-daa1-4664-b82c-e480bf305bd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48de92c9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:15:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355730, 'reachable_time': 25033, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212007, 'error': None, 'target': 'ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:14.934 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc83ffb-93bd-4702-a525-aee2b338869e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:15.091 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[732d7669-2303-4b81-bb3c-8430430d8107]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:15.095 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48de92c9-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:15.096 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:15.097 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48de92c9-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:15 compute-0 nova_compute[182935]: 2026-01-21 23:44:15.099 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:15 compute-0 kernel: tap48de92c9-20: entered promiscuous mode
Jan 21 23:44:15 compute-0 NetworkManager[55139]: <info>  [1769039055.1007] manager: (tap48de92c9-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:15.110 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48de92c9-20, col_values=(('external_ids', {'iface-id': '07b3db2a-1439-4d24-a2d8-d2c29586e870'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:15 compute-0 nova_compute[182935]: 2026-01-21 23:44:15.112 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:15 compute-0 ovn_controller[95047]: 2026-01-21T23:44:15Z|00031|binding|INFO|Releasing lport 07b3db2a-1439-4d24-a2d8-d2c29586e870 from this chassis (sb_readonly=0)
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:15.118 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48de92c9-2a56-4dfe-a16e-fe0d52617564.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48de92c9-2a56-4dfe-a16e-fe0d52617564.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:15.129 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[57b842a6-9c3f-4a4a-8424-3a082b4a51c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:15 compute-0 nova_compute[182935]: 2026-01-21 23:44:15.132 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:15.136 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-48de92c9-2a56-4dfe-a16e-fe0d52617564
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/48de92c9-2a56-4dfe-a16e-fe0d52617564.pid.haproxy
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 48de92c9-2a56-4dfe-a16e-fe0d52617564
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:44:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:15.138 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'env', 'PROCESS_TAG=haproxy-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48de92c9-2a56-4dfe-a16e-fe0d52617564.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:44:15 compute-0 nova_compute[182935]: 2026-01-21 23:44:15.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:16 compute-0 podman[212040]: 2026-01-21 23:44:16.100735074 +0000 UTC m=+0.095690494 container create 5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 23:44:16 compute-0 systemd[1]: Started libpod-conmon-5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1.scope.
Jan 21 23:44:16 compute-0 podman[212040]: 2026-01-21 23:44:16.07249516 +0000 UTC m=+0.067450580 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:44:16 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:44:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb6fa5c69833ac3ae43a8a7a44287f25b9d1d53d6367b7d93ed48f464fc6f3e0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:44:16 compute-0 podman[212040]: 2026-01-21 23:44:16.197788849 +0000 UTC m=+0.192744249 container init 5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:44:16 compute-0 podman[212040]: 2026-01-21 23:44:16.203074965 +0000 UTC m=+0.198030365 container start 5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:44:16 compute-0 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[212055]: [NOTICE]   (212059) : New worker (212061) forked
Jan 21 23:44:16 compute-0 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[212055]: [NOTICE]   (212059) : Loading success.
Jan 21 23:44:16 compute-0 nova_compute[182935]: 2026-01-21 23:44:16.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:16 compute-0 nova_compute[182935]: 2026-01-21 23:44:16.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:16 compute-0 nova_compute[182935]: 2026-01-21 23:44:16.838 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:16 compute-0 nova_compute[182935]: 2026-01-21 23:44:16.839 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:16 compute-0 nova_compute[182935]: 2026-01-21 23:44:16.839 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:16 compute-0 nova_compute[182935]: 2026-01-21 23:44:16.839 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:44:16 compute-0 nova_compute[182935]: 2026-01-21 23:44:16.942 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.038 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.040 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.098 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.105 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.166 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.167 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.257 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.413 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.415 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5451MB free_disk=73.35369873046875GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.415 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.416 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.525 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 079c4a41-1146-4c56-a278-70fbef0949eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.525 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance a56178c2-b7df-492f-816a-580ea1a80c21 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.525 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.525 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.602 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.635 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.669 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:44:17 compute-0 nova_compute[182935]: 2026-01-21 23:44:17.669 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:18 compute-0 nova_compute[182935]: 2026-01-21 23:44:18.664 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:18 compute-0 nova_compute[182935]: 2026-01-21 23:44:18.665 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:18 compute-0 nova_compute[182935]: 2026-01-21 23:44:18.665 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:18 compute-0 nova_compute[182935]: 2026-01-21 23:44:18.665 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:44:18 compute-0 nova_compute[182935]: 2026-01-21 23:44:18.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:18 compute-0 nova_compute[182935]: 2026-01-21 23:44:18.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:44:18 compute-0 nova_compute[182935]: 2026-01-21 23:44:18.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:44:19 compute-0 nova_compute[182935]: 2026-01-21 23:44:19.026 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-079c4a41-1146-4c56-a278-70fbef0949eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:44:19 compute-0 nova_compute[182935]: 2026-01-21 23:44:19.026 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-079c4a41-1146-4c56-a278-70fbef0949eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:44:19 compute-0 nova_compute[182935]: 2026-01-21 23:44:19.026 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:44:19 compute-0 nova_compute[182935]: 2026-01-21 23:44:19.027 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 079c4a41-1146-4c56-a278-70fbef0949eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:19 compute-0 nova_compute[182935]: 2026-01-21 23:44:19.430 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:44:19 compute-0 nova_compute[182935]: 2026-01-21 23:44:19.534 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:19 compute-0 nova_compute[182935]: 2026-01-21 23:44:19.685 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:19 compute-0 podman[212090]: 2026-01-21 23:44:19.698597581 +0000 UTC m=+0.049414280 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:44:20 compute-0 nova_compute[182935]: 2026-01-21 23:44:20.153 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:20 compute-0 nova_compute[182935]: 2026-01-21 23:44:20.196 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-079c4a41-1146-4c56-a278-70fbef0949eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:44:20 compute-0 nova_compute[182935]: 2026-01-21 23:44:20.197 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:44:20 compute-0 nova_compute[182935]: 2026-01-21 23:44:20.197 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:20 compute-0 nova_compute[182935]: 2026-01-21 23:44:20.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:44:22 compute-0 podman[212115]: 2026-01-21 23:44:22.733425224 +0000 UTC m=+0.104255798 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 23:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:23.737 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}56c7ea009b801d9698f0c834e3db9692471ec425d445fb9ddff9e80c5e7b5f2e" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 21 23:44:24 compute-0 nova_compute[182935]: 2026-01-21 23:44:24.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.680 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1181 Content-Type: application/json Date: Wed, 21 Jan 2026 23:44:23 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-9b6b1b9a-5cab-4df3-8ce8-a49e26e2f30d x-openstack-request-id: req-9b6b1b9a-5cab-4df3-8ce8-a49e26e2f30d _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.680 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1628087693", "name": "tempest-flavor_with_ephemeral_1-1039478167", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1628087693"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1628087693"}]}, {"id": "17063997", "name": "tempest-flavor_with_ephemeral_0-176066592", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/17063997"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/17063997"}]}, {"id": "c3389c03-89c4-4ff5-9e03-1a99d41713d4", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}]}, {"id": "ff01ccba-ad51-439f-9037-926190d6dc0f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.681 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-9b6b1b9a-5cab-4df3-8ce8-a49e26e2f30d request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.683 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}56c7ea009b801d9698f0c834e3db9692471ec425d445fb9ddff9e80c5e7b5f2e" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 21 23:44:24 compute-0 nova_compute[182935]: 2026-01-21 23:44:24.687 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.846 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Wed, 21 Jan 2026 23:44:24 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a3487b68-3dbf-40fb-abd9-793752073f83 x-openstack-request-id: req-a3487b68-3dbf-40fb-abd9-793752073f83 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.846 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "c3389c03-89c4-4ff5-9e03-1a99d41713d4", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.847 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4 used request id req-a3487b68-3dbf-40fb-abd9-793752073f83 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.849 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'name': 'tempest-tempest.common.compute-instance-1377062952-3', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8981554bfb65485a9218dab7f347822d', 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'hostId': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.852 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8981554bfb65485a9218dab7f347822d', 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'hostId': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.852 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.866 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.868 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.881 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.882 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d228190-3bba-4d55-9fa1-6668a71bc311', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-vda', 'timestamp': '2026-01-21T23:44:24.852922', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e0e9fea-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.647423338, 'message_signature': '3f0b86fbe94706cf5ec235a3eac949c88e8139ca0b2a79fe7427f470870bf1f1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-sda', 'timestamp': '2026-01-21T23:44:24.852922', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e0eb93a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.647423338, 'message_signature': '7df08378d3183e0269190108a1f9c8189ea56ad1ea8b7bce997afc2c88eb809f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-vda', 'timestamp': '2026-01-21T23:44:24.852922', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e10db70-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.662691793, 'message_signature': 'f816bbffb71c85d386635156f3b62ca280491e4fab9dcd141607941f0d6ac312'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-sda', 'timestamp': '2026-01-21T23:44:24.852922', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e10e8d6-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.662691793, 'message_signature': '241a0ccd223e86bb30743a3572b1503f36e62a2e00f7eeecf6264b1f08bbb6ad'}]}, 'timestamp': '2026-01-21 23:44:24.882915', '_unique_id': '5ad7ccdfd6734e5691fb7e4c53461725'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.890 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.893 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.893 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.894 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-1377062952-3>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-178392874>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-1377062952-3>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-178392874>]
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.894 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.898 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a56178c2-b7df-492f-816a-580ea1a80c21 / tap4bf69269-42 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.898 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a040ddf-fdbb-4376-a827-a0f5f5318703', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'instance-00000004-a56178c2-b7df-492f-816a-580ea1a80c21-tap4bf69269-42', 'timestamp': '2026-01-21T23:44:24.894653', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'tap4bf69269-42', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:b1:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bf69269-42'}, 'message_id': '1e13576a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.689176275, 'message_signature': '4cf7e8f186ef5c5c67d003e64adf171d3364b9908476e0c78970c79caf356404'}]}, 'timestamp': '2026-01-21 23:44:24.900535', '_unique_id': '07273186de3d473896cef353c4903b3d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.901 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.902 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.902 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e3cfc57-bd4e-4e75-830b-c0ab789cd7cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'instance-00000004-a56178c2-b7df-492f-816a-580ea1a80c21-tap4bf69269-42', 'timestamp': '2026-01-21T23:44:24.902737', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'tap4bf69269-42', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:b1:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bf69269-42'}, 'message_id': '1e14034a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.689176275, 'message_signature': 'e069c5e2091021de6fc0867a42a7eb7fba61ed529a515bba332ed0fe7ad0a916'}]}, 'timestamp': '2026-01-21 23:44:24.903335', '_unique_id': 'b48ad763f71240938a2520a9344e52d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.904 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.905 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd620846-84e5-49ce-87bd-87a4c0ebcb25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'instance-00000004-a56178c2-b7df-492f-816a-580ea1a80c21-tap4bf69269-42', 'timestamp': '2026-01-21T23:44:24.905060', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'tap4bf69269-42', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:b1:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bf69269-42'}, 'message_id': '1e145d40-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.689176275, 'message_signature': 'bab7ba1a641c4ccb415bbc6169e13d0e40988b5a7c69612c186ac6666325ff5a'}]}, 'timestamp': '2026-01-21 23:44:24.905422', '_unique_id': '03ac2b12130543f1a11108fc19ca8adb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.906 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.907 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f40aea70-2353-4bd1-bedb-568a0ee03c1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'instance-00000004-a56178c2-b7df-492f-816a-580ea1a80c21-tap4bf69269-42', 'timestamp': '2026-01-21T23:44:24.907044', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'tap4bf69269-42', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:b1:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bf69269-42'}, 'message_id': '1e14a9e4-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.689176275, 'message_signature': 'b37202102a5af865ea9b9bea3a13bfbc5e0a59a5b4e79f096bac0037fffead62'}]}, 'timestamp': '2026-01-21 23:44:24.907377', '_unique_id': '66afdd11196c4912ba1b76baeaff5919'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.908 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.925 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/memory.usage volume: 40.4765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.938 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/memory.usage volume: 41.14453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'daca943f-6fed-4252-abac-9f4af958e732', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4765625, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'timestamp': '2026-01-21T23:44:24.908883', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '1e17834e-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.719417417, 'message_signature': '18f37444fe93558e1301a209ffacad19ed999d0f03fb8b2ba05492c89613c3b9'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 41.14453125, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'timestamp': '2026-01-21T23:44:24.908883', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '1e19854a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.73298552, 'message_signature': 'cbe0b42ffc99851501b99bbc929814f3d5f844d20d705f30a0b27d6f7b38ec1d'}]}, 'timestamp': '2026-01-21 23:44:24.939210', '_unique_id': '864b2eac53a64678bf773bf1f17c6a80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.940 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.941 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f53cbc43-5c7b-4d50-887e-5871ef716a29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'instance-00000004-a56178c2-b7df-492f-816a-580ea1a80c21-tap4bf69269-42', 'timestamp': '2026-01-21T23:44:24.941080', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'tap4bf69269-42', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:b1:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bf69269-42'}, 'message_id': '1e19dbd0-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.689176275, 'message_signature': '61a725568623a6c99334955725ab3855d03f0a87370964db6415b412fdec36f7'}]}, 'timestamp': '2026-01-21 23:44:24.941426', '_unique_id': 'f294165c637c44d3adf8a06415b6d086'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.942 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.943 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.943 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.943 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-1377062952-3>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-178392874>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-1377062952-3>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-178392874>]
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.943 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.967 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.967 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.987 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.read.bytes volume: 29870592 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.987 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc894f47-a737-462d-ae53-e459f904cee4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-vda', 'timestamp': '2026-01-21T23:44:24.943752', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e1ddc3a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': '08ad11cfefb0a17e1bc9e59bd9442223a45d3d290598b8538892863797b80253'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 
'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-sda', 'timestamp': '2026-01-21T23:44:24.943752', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e1de766-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': '711480f6896abed9eedf80e3603b277e623adf0dcd5576281928383bcbdb3dc5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29870592, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-vda', 'timestamp': '2026-01-21T23:44:24.943752', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e20e402-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': 'c07d4d5dc76d074a93d472bb03f09e09821113b5c4c15db76e5579124b61d013'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-sda', 'timestamp': '2026-01-21T23:44:24.943752', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e20ee8e-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': '27eb5516630f93b7592e345db47777ac253ce4a6c7c395fc43dd677d763aeb9f'}]}, 'timestamp': '2026-01-21 23:44:24.987757', '_unique_id': '413821e6c1864dccbd0468f329115553'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.988 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.989 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.989 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.read.latency volume: 118467059 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.989 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.read.latency volume: 355429 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.990 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.read.latency volume: 163754388 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.990 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.read.latency volume: 30331299 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3fe92fa-1c45-4088-9b35-af3d5326eec3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 118467059, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-vda', 'timestamp': '2026-01-21T23:44:24.989519', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e214190-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': 'f75628933e123e5e856d73c8b5a9d63ed5bf94417cc356ad05c4bc4428e07154'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 355429, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': 
None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-sda', 'timestamp': '2026-01-21T23:44:24.989519', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e214e38-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': '7cbd21fd99f3efbffa7b4b2890f30775521f78d47cba44ff50a62b9c7aa94b4f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 163754388, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-vda', 'timestamp': '2026-01-21T23:44:24.989519', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e21595a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': 'c7039c15c719bbb93a5006b50d510284e367c06bb06316e1dd278cb6aa986a6f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 30331299, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-sda', 'timestamp': '2026-01-21T23:44:24.989519', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e216422-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': '2ee83b71affda556fb2f50307b824a2077a881f1d004374c06239999295b2d0e'}]}, 'timestamp': '2026-01-21 23:44:24.990760', '_unique_id': '927dbcff8cba4ddda0a9af4420495258'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.991 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.992 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.992 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.993 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.write.requests volume: 321 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.993 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c64c6161-4f02-44f2-afbb-70df74aae123', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-vda', 'timestamp': '2026-01-21T23:44:24.992486', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e21b3d2-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': '4771cdaa00364c1c2c609e4b4ec396aadc0fe060c3d6940332b5d11dcb9eff07'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-sda', 'timestamp': '2026-01-21T23:44:24.992486', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e21c03e-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': '5760f3ae0038518adafb62cd809bf11e23afd72e54617c9b5d664b5281e7646e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 321, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-vda', 'timestamp': '2026-01-21T23:44:24.992486', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e21cb4c-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': '6f8b327f9dfa6e1fe3cc5f47aefa278581a5f42355f1e26122805489b20ff8de'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-sda', 'timestamp': '2026-01-21T23:44:24.992486', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e21d60a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': 'f50cbeb2600b1ec2c3fd97da08798a264c48bed3ded51a05f69dedad085dfb43'}]}, 'timestamp': '2026-01-21 23:44:24.993711', '_unique_id': '6d273aa9d40a4795984e3ceee0489a37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.994 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.995 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52e1f61a-f8a7-4d5a-ad8a-403fd127ef88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'instance-00000004-a56178c2-b7df-492f-816a-580ea1a80c21-tap4bf69269-42', 'timestamp': '2026-01-21T23:44:24.995360', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'tap4bf69269-42', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:b1:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bf69269-42'}, 'message_id': '1e22247a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.689176275, 'message_signature': 'e0523182d87ecb381ef443a06be488d86b9fd7d331eb3d9bf537dceeced830d5'}]}, 'timestamp': '2026-01-21 23:44:24.995704', '_unique_id': '3172671c23ba4205987fae42704bc1e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.996 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.997 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.997 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/cpu volume: 12550000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.997 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/cpu volume: 13660000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c468e657-48fc-4aed-976e-01d1b0ba91ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12550000000, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'timestamp': '2026-01-21T23:44:24.997365', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '1e227272-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.719417417, 'message_signature': '0a4ef83fec7884d44c3d103e626ac162f6fc25d95d48b29316e212c44c831436'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13660000000, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'timestamp': '2026-01-21T23:44:24.997365', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '1e228276-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.73298552, 'message_signature': 'd932db75e0c4fe993f8801236de4683e387c5fd1a80b7bcfaafe7c3ede7386bb'}]}, 'timestamp': '2026-01-21 23:44:24.998095', '_unique_id': '116c5d5d3f60438296287f7cc5d62008'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.998 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 23:44:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.999 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:24.999 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.000 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.write.latency volume: 1519719382 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.000 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9e959ef-2e9b-452d-ac36-5b7d1e4df4e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-vda', 'timestamp': '2026-01-21T23:44:24.999594', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e22c902-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': 'd541ba30530d488a194b881eabaf6f4e2de7bb807de44aa7ee2f42371ce4f1c0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-sda', 'timestamp': '2026-01-21T23:44:24.999594', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e22d53c-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': '1c61b39c11a59668ff5259a286a05478cec0a72be15e10fad321e26457cdca65'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1519719382, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-vda', 'timestamp': '2026-01-21T23:44:24.999594', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e22e054-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': '47e3044e7121712d531224bcda5d328bb23c6c84df500986a7f2f5d3e5423b46'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-sda', 'timestamp': '2026-01-21T23:44:24.999594', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e22eb1c-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': '4c8c30450863697c3b9c76f60b9e065d00c02743531cd8a30732329697e251c0'}]}, 'timestamp': '2026-01-21 23:44:25.000768', '_unique_id': '6040fd6981b84285a23378bf44e4220d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.001 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.002 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fd6d7f7-8cb4-4419-ba03-497973a42c2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'instance-00000004-a56178c2-b7df-492f-816a-580ea1a80c21-tap4bf69269-42', 'timestamp': '2026-01-21T23:44:25.002431', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'tap4bf69269-42', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:b1:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bf69269-42'}, 'message_id': '1e2337f2-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.689176275, 'message_signature': '7d12dc6dea5bd5b96a0eeb1389e496844d1f770fb183627a0d4beb1d51a4fb28'}]}, 'timestamp': '2026-01-21 23:44:25.002755', '_unique_id': '7cd9f0d2efe5405aabbc3b5c7c035dc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.003 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.004 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.004 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.004 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.004 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.005 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ed231ba-d217-4277-b703-9ba6253409e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-vda', 'timestamp': '2026-01-21T23:44:25.004237', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e237e38-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.647423338, 'message_signature': 'c89034a67e3868cb7aa27ebe747a360b0192e30499bb9af82a294316d27ec9c7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-sda', 'timestamp': '2026-01-21T23:44:25.004237', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e238a54-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.647423338, 'message_signature': 'a5ba0445eb9ee8f690bf98ed7c1c3319b0598f9ab2921b36edaaa1fb5d4d4fd6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-vda', 'timestamp': '2026-01-21T23:44:25.004237', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e239684-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.662691793, 'message_signature': 'e71619ae7e96215da3450df1f866f5cf3b55618d895ead81764a5b68c5336352'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-sda', 'timestamp': '2026-01-21T23:44:25.004237', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e23a142-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.662691793, 'message_signature': '0a3307a973b8481e0637076ead60d4d65a184bb4185f14dbefb5fb057a2d7951'}]}, 'timestamp': '2026-01-21 23:44:25.005431', '_unique_id': '5c6c61ae076a45dc93df4397f56a3d5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.006 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.007 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.007 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.007 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.007 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.read.requests volume: 1069 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.008 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08d63ba9-af4c-4dcd-b7bc-e0f844dd2a6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-vda', 'timestamp': '2026-01-21T23:44:25.007119', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e23ef6c-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': '7576046d7b1831b6ed66668d538c87af0588dd3e6e466053afea2a1662c798bb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-sda', 'timestamp': '2026-01-21T23:44:25.007119', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e23fb1a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': '2f8c0cea4a643cab8aee9d58ad8837010ad2c8f0b48e3d7e3961f72a55fdd0bf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1069, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-vda', 'timestamp': '2026-01-21T23:44:25.007119', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e2406b4-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': '5f7745635d7b97ae3e26cfcc4aecf53c04f609c72150e428153dcdeb73e2bd85'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-sda', 'timestamp': '2026-01-21T23:44:25.007119', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e2415c8-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': 'e2a4027e63113a91dd933adc9a85468fa54cb6ce8ec3d11e2a39a6e91b602ea8'}]}, 'timestamp': '2026-01-21 23:44:25.008454', '_unique_id': 'ec1e35d51ed949679e8461f3eaa2efa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.009 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.010 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.010 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e93cbde-ae38-48c5-a084-04cd63e7a49e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'instance-00000004-a56178c2-b7df-492f-816a-580ea1a80c21-tap4bf69269-42', 'timestamp': '2026-01-21T23:44:25.010298', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'tap4bf69269-42', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:b1:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bf69269-42'}, 'message_id': '1e246bc2-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.689176275, 'message_signature': 'f1bb50d15b808a9d921da709482eb77cfddad03634baf27c9869558bbf2d2053'}]}, 'timestamp': '2026-01-21 23:44:25.010643', '_unique_id': '0db9f24002d84ec19f6830d7a09488dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.011 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.012 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.012 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e01d9402-c603-41e9-a4f0-4d313a3d709b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'instance-00000004-a56178c2-b7df-492f-816a-580ea1a80c21-tap4bf69269-42', 'timestamp': '2026-01-21T23:44:25.012128', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'tap4bf69269-42', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:b1:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bf69269-42'}, 'message_id': '1e24b29e-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.689176275, 'message_signature': 'ae099d4514ddec857d80fcc0837c0f5d94ade288dffbe47f28d33a8c7d30694e'}]}, 'timestamp': '2026-01-21 23:44:25.012465', '_unique_id': '98cc1fd1fbce4bd095dc78b6ec9692f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.014 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.014 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.014 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.write.bytes volume: 72953856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.014 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88dd7a9d-3e57-46dc-a461-c04f41ff7230', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-vda', 'timestamp': '2026-01-21T23:44:25.014076', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e24fe8e-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': '1d850d0dee85b0b1d4a36680aad96e023b2feed04fdd2f7c1ca2493f25889fa4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-sda', 'timestamp': '2026-01-21T23:44:25.014076', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e2509a6-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.738158093, 'message_signature': 'eb85d27bc88c7ddb140c5c4697862b19e248ec3658cdeead7d3e9262f7560503'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72953856, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-vda', 'timestamp': '2026-01-21T23:44:25.014076', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e25166c-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': '8a5e9427aba735c4cb1b2204fea805b9cd70094e7bdc63a9084afbfb3040aa7d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-sda', 'timestamp': '2026-01-21T23:44:25.014076', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e25215c-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.762164096, 'message_signature': '83e3fa6d9447d80b43fd7be85bff3dc484124e4c435399ea4ecc63c4a6e1fe2c'}]}, 'timestamp': '2026-01-21 23:44:25.015265', '_unique_id': '44ea5ae26d914d408b197f91bb557109'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.015 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.016 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.016 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.017 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-1377062952-3>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-178392874>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-1377062952-3>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-178392874>]
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.017 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.017 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06f3dc37-4b98-487e-970e-59b1cbae16a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'instance-00000004-a56178c2-b7df-492f-816a-580ea1a80c21-tap4bf69269-42', 'timestamp': '2026-01-21T23:44:25.017341', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'tap4bf69269-42', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c9:b1:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4bf69269-42'}, 'message_id': '1e257e68-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.689176275, 'message_signature': '110e33b70417b60960121ca6a960473cee0aa0419ef58b4eb7f0ca66811fbe0d'}]}, 'timestamp': '2026-01-21 23:44:25.017723', '_unique_id': '6cb5617519ae4d019ec70b8cadd45be7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.018 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.019 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.019 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.019 12 DEBUG ceilometer.compute.pollsters [-] a56178c2-b7df-492f-816a-580ea1a80c21/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.019 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.019 12 DEBUG ceilometer.compute.pollsters [-] 079c4a41-1146-4c56-a278-70fbef0949eb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36bb6fbb-9f79-430a-8c9b-0d4d7c1079bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-vda', 'timestamp': '2026-01-21T23:44:25.019125', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e25c3c8-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.647423338, 'message_signature': '0697c0dd49bcb3c9e1350bdeaaaade03af2fcd9ffb8c251a8393ab84dcf3ff71'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': 'a56178c2-b7df-492f-816a-580ea1a80c21-sda', 'timestamp': '2026-01-21T23:44:25.019125', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1377062952-3', 'name': 'instance-00000004', 'instance_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e25cc06-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.647423338, 'message_signature': '9dcb185616f23c98d49b3cc310e0e161a865bab8db18e174c82afeeb68591729'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-vda', 'timestamp': '2026-01-21T23:44:25.019125', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e25d386-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.662691793, 'message_signature': '42cfd470128cf02e99877f9b55664e53645ed27d64bdf1242ae2263d9d0ba34e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f92dd0c2072346c6b7e7588673443ff2', 'user_name': None, 'project_id': '8981554bfb65485a9218dab7f347822d', 'project_name': None, 'resource_id': '079c4a41-1146-4c56-a278-70fbef0949eb-sda', 'timestamp': '2026-01-21T23:44:25.019125', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-178392874', 'name': 'instance-00000001', 'instance_id': '079c4a41-1146-4c56-a278-70fbef0949eb', 'instance_type': 'm1.nano', 'host': '7a280b6f24d4c5d4f5d8eeb675dd08f3fc42047bc1eb35976af0c406', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e25dbe2-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3567.662691793, 'message_signature': 'd97c9477bdd52e1ea16744e9c3c5ca252586efa15bd6b16633bae43ddd645087'}]}, 'timestamp': '2026-01-21 23:44:25.019995', '_unique_id': 'b85df586d13749a9b2a942d8c24f3bfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.020 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.021 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.021 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:44:25 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:44:25.021 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-1377062952-3>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-178392874>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-1377062952-3>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-178392874>]
Jan 21 23:44:26 compute-0 ovn_controller[95047]: 2026-01-21T23:44:26Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c9:b1:78 10.1.0.43
Jan 21 23:44:26 compute-0 ovn_controller[95047]: 2026-01-21T23:44:26Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:b1:78 10.1.0.43
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.282 182939 DEBUG oslo_concurrency.lockutils [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "a56178c2-b7df-492f-816a-580ea1a80c21" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.283 182939 DEBUG oslo_concurrency.lockutils [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.283 182939 DEBUG oslo_concurrency.lockutils [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.284 182939 DEBUG oslo_concurrency.lockutils [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.284 182939 DEBUG oslo_concurrency.lockutils [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.296 182939 INFO nova.compute.manager [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Terminating instance
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.304 182939 DEBUG nova.compute.manager [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:44:26 compute-0 kernel: tap4bf69269-42 (unregistering): left promiscuous mode
Jan 21 23:44:26 compute-0 NetworkManager[55139]: <info>  [1769039066.3247] device (tap4bf69269-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:44:26 compute-0 ovn_controller[95047]: 2026-01-21T23:44:26Z|00032|binding|INFO|Releasing lport 4bf69269-42ff-414d-a6b7-9b7b63abe9ad from this chassis (sb_readonly=0)
Jan 21 23:44:26 compute-0 ovn_controller[95047]: 2026-01-21T23:44:26Z|00033|binding|INFO|Setting lport 4bf69269-42ff-414d-a6b7-9b7b63abe9ad down in Southbound
Jan 21 23:44:26 compute-0 ovn_controller[95047]: 2026-01-21T23:44:26Z|00034|binding|INFO|Removing iface tap4bf69269-42 ovn-installed in OVS
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.368 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.370 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.376 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:b1:78 10.1.0.43 fdfe:381f:8400::18e'], port_security=['fa:16:3e:c9:b1:78 10.1.0.43 fdfe:381f:8400::18e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.43/26 fdfe:381f:8400::18e/64', 'neutron:device_id': 'a56178c2-b7df-492f-816a-580ea1a80c21', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8981554bfb65485a9218dab7f347822d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e4bb4842-7cc7-47df-ad92-e426d20758f4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=343b670f-2d8d-4f56-9cb9-7d9682347428, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=4bf69269-42ff-414d-a6b7-9b7b63abe9ad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.377 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 4bf69269-42ff-414d-a6b7-9b7b63abe9ad in datapath 48de92c9-2a56-4dfe-a16e-fe0d52617564 unbound from our chassis
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.378 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48de92c9-2a56-4dfe-a16e-fe0d52617564, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.381 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce95d00-5880-4be9-989b-c3bce85bde9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.382 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564 namespace which is not needed anymore
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.387 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:26 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 21 23:44:26 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 16.107s CPU time.
Jan 21 23:44:26 compute-0 systemd-machined[154182]: Machine qemu-2-instance-00000004 terminated.
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.525 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.529 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:26 compute-0 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[212055]: [NOTICE]   (212059) : haproxy version is 2.8.14-c23fe91
Jan 21 23:44:26 compute-0 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[212055]: [NOTICE]   (212059) : path to executable is /usr/sbin/haproxy
Jan 21 23:44:26 compute-0 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[212055]: [WARNING]  (212059) : Exiting Master process...
Jan 21 23:44:26 compute-0 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[212055]: [WARNING]  (212059) : Exiting Master process...
Jan 21 23:44:26 compute-0 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[212055]: [ALERT]    (212059) : Current worker (212061) exited with code 143 (Terminated)
Jan 21 23:44:26 compute-0 neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564[212055]: [WARNING]  (212059) : All workers exited. Exiting... (0)
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.563 182939 INFO nova.virt.libvirt.driver [-] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Instance destroyed successfully.
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.563 182939 DEBUG nova.objects.instance [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lazy-loading 'resources' on Instance uuid a56178c2-b7df-492f-816a-580ea1a80c21 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:26 compute-0 systemd[1]: libpod-5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1.scope: Deactivated successfully.
Jan 21 23:44:26 compute-0 podman[212171]: 2026-01-21 23:44:26.572048524 +0000 UTC m=+0.058975838 container died 5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.585 182939 DEBUG nova.virt.libvirt.vif [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:43:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-1377062952-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1377062952-3',id=4,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2026-01-21T23:44:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8981554bfb65485a9218dab7f347822d',ramdisk_id='',reservation_id='r-ourrdd3l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-1853609216',owner_user_name='tempest-AutoAllocateNetworkTest-1853609216-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:44:12Z,user_data=None,user_id='f92dd0c2072346c6b7e7588673443ff2',uuid=a56178c2-b7df-492f-816a-580ea1a80c21,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "address": "fa:16:3e:c9:b1:78", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::18e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bf69269-42", "ovs_interfaceid": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.585 182939 DEBUG nova.network.os_vif_util [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converting VIF {"id": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "address": "fa:16:3e:c9:b1:78", "network": {"id": "48de92c9-2a56-4dfe-a16e-fe0d52617564", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.43", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::18e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8981554bfb65485a9218dab7f347822d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bf69269-42", "ovs_interfaceid": "4bf69269-42ff-414d-a6b7-9b7b63abe9ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.586 182939 DEBUG nova.network.os_vif_util [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:b1:78,bridge_name='br-int',has_traffic_filtering=True,id=4bf69269-42ff-414d-a6b7-9b7b63abe9ad,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bf69269-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.587 182939 DEBUG os_vif [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:b1:78,bridge_name='br-int',has_traffic_filtering=True,id=4bf69269-42ff-414d-a6b7-9b7b63abe9ad,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bf69269-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.589 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.590 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bf69269-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.592 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.593 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.594 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.599 182939 INFO os_vif [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:b1:78,bridge_name='br-int',has_traffic_filtering=True,id=4bf69269-42ff-414d-a6b7-9b7b63abe9ad,network=Network(48de92c9-2a56-4dfe-a16e-fe0d52617564),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bf69269-42')
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.600 182939 INFO nova.virt.libvirt.driver [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Deleting instance files /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21_del
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.601 182939 INFO nova.virt.libvirt.driver [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Deletion of /var/lib/nova/instances/a56178c2-b7df-492f-816a-580ea1a80c21_del complete
Jan 21 23:44:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1-userdata-shm.mount: Deactivated successfully.
Jan 21 23:44:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-fb6fa5c69833ac3ae43a8a7a44287f25b9d1d53d6367b7d93ed48f464fc6f3e0-merged.mount: Deactivated successfully.
Jan 21 23:44:26 compute-0 podman[212171]: 2026-01-21 23:44:26.61886873 +0000 UTC m=+0.105796024 container cleanup 5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:44:26 compute-0 systemd[1]: libpod-conmon-5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1.scope: Deactivated successfully.
Jan 21 23:44:26 compute-0 podman[212215]: 2026-01-21 23:44:26.686527505 +0000 UTC m=+0.042065785 container remove 5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.691 182939 DEBUG nova.virt.libvirt.host [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.692 182939 INFO nova.virt.libvirt.host [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] UEFI support detected
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.694 182939 INFO nova.compute.manager [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.693 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ed8448a3-fde2-4397-ba00-9ff61203c09c]: (4, ('Wed Jan 21 11:44:26 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564 (5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1)\n5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1\nWed Jan 21 11:44:26 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564 (5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1)\n5cf89255f15783eebd6fe2c2c323001839e466f8fd15ad995bde7e1196530ac1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.694 182939 DEBUG oslo.service.loopingcall [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.695 182939 DEBUG nova.compute.manager [-] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.695 182939 DEBUG nova.network.neutron [-] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.695 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f35c80a7-0ce6-434e-acf1-2dc99e1f4474]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.696 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48de92c9-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:26 compute-0 kernel: tap48de92c9-20: left promiscuous mode
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.702 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.702 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f045a7a7-c0ac-4ce0-94b5-771943b42222]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:26 compute-0 nova_compute[182935]: 2026-01-21 23:44:26.712 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.727 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[45f1c34d-4a6f-447e-aca1-d3600d9e97eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.729 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[847f7351-6754-4d35-a9f6-7cb544828be2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.745 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[48d7d60d-e266-467a-9a46-d54645f22062]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355720, 'reachable_time': 42026, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212230, 'error': None, 'target': 'ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d48de92c9\x2d2a56\x2d4dfe\x2da16e\x2dfe0d52617564.mount: Deactivated successfully.
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.756 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48de92c9-2a56-4dfe-a16e-fe0d52617564 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:44:26 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:26.757 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[e27d6e69-6bd3-4fb6-a1cd-d652c54d8472]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:44:27 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:27.671 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:44:27 compute-0 nova_compute[182935]: 2026-01-21 23:44:27.671 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:27 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:27.672 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.347 182939 DEBUG nova.network.neutron [-] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.368 182939 INFO nova.compute.manager [-] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Took 1.67 seconds to deallocate network for instance.
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.520 182939 DEBUG oslo_concurrency.lockutils [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.521 182939 DEBUG oslo_concurrency.lockutils [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.545 182939 DEBUG nova.compute.manager [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Received event network-vif-unplugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.546 182939 DEBUG oslo_concurrency.lockutils [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.547 182939 DEBUG oslo_concurrency.lockutils [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.547 182939 DEBUG oslo_concurrency.lockutils [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.547 182939 DEBUG nova.compute.manager [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] No waiting events found dispatching network-vif-unplugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.548 182939 WARNING nova.compute.manager [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Received unexpected event network-vif-unplugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad for instance with vm_state deleted and task_state None.
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.548 182939 DEBUG nova.compute.manager [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Received event network-vif-plugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.548 182939 DEBUG oslo_concurrency.lockutils [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.549 182939 DEBUG oslo_concurrency.lockutils [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.549 182939 DEBUG oslo_concurrency.lockutils [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.549 182939 DEBUG nova.compute.manager [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] No waiting events found dispatching network-vif-plugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.550 182939 WARNING nova.compute.manager [req-7c9adf07-cc75-4558-97f6-29781a6ba74e req-ba2d3f6e-ed96-42ef-ba87-af65996ae081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Received unexpected event network-vif-plugged-4bf69269-42ff-414d-a6b7-9b7b63abe9ad for instance with vm_state deleted and task_state None.
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.556 182939 DEBUG nova.compute.manager [req-4670bf8a-3749-4357-99a3-9beeba3754a6 req-74a22195-74de-462e-9a54-f4206fd3cd61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Received event network-vif-deleted-4bf69269-42ff-414d-a6b7-9b7b63abe9ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.610 182939 DEBUG nova.compute.provider_tree [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.633 182939 DEBUG nova.scheduler.client.report [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.663 182939 DEBUG oslo_concurrency.lockutils [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.695 182939 INFO nova.scheduler.client.report [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Deleted allocations for instance a56178c2-b7df-492f-816a-580ea1a80c21
Jan 21 23:44:28 compute-0 nova_compute[182935]: 2026-01-21 23:44:28.776 182939 DEBUG oslo_concurrency.lockutils [None req-0970716d-551a-43d1-85db-334244d6e384 f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "a56178c2-b7df-492f-816a-580ea1a80c21" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:29 compute-0 sshd-session[212232]: Invalid user nagios from 188.166.69.60 port 53098
Jan 21 23:44:29 compute-0 sshd-session[212232]: Connection closed by invalid user nagios 188.166.69.60 port 53098 [preauth]
Jan 21 23:44:29 compute-0 nova_compute[182935]: 2026-01-21 23:44:29.537 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:31 compute-0 nova_compute[182935]: 2026-01-21 23:44:31.592 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:31 compute-0 podman[212234]: 2026-01-21 23:44:31.696286356 +0000 UTC m=+0.064547322 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 21 23:44:33 compute-0 nova_compute[182935]: 2026-01-21 23:44:33.192 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:33 compute-0 podman[212256]: 2026-01-21 23:44:33.696215049 +0000 UTC m=+0.068859483 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_id=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 23:44:34 compute-0 nova_compute[182935]: 2026-01-21 23:44:34.539 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:35 compute-0 nova_compute[182935]: 2026-01-21 23:44:35.577 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "079c4a41-1146-4c56-a278-70fbef0949eb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:35 compute-0 nova_compute[182935]: 2026-01-21 23:44:35.578 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "079c4a41-1146-4c56-a278-70fbef0949eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:35 compute-0 nova_compute[182935]: 2026-01-21 23:44:35.579 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "079c4a41-1146-4c56-a278-70fbef0949eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:35 compute-0 nova_compute[182935]: 2026-01-21 23:44:35.579 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "079c4a41-1146-4c56-a278-70fbef0949eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:35 compute-0 nova_compute[182935]: 2026-01-21 23:44:35.579 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "079c4a41-1146-4c56-a278-70fbef0949eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:35 compute-0 nova_compute[182935]: 2026-01-21 23:44:35.589 182939 INFO nova.compute.manager [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Terminating instance
Jan 21 23:44:35 compute-0 nova_compute[182935]: 2026-01-21 23:44:35.601 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "refresh_cache-079c4a41-1146-4c56-a278-70fbef0949eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:44:35 compute-0 nova_compute[182935]: 2026-01-21 23:44:35.601 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquired lock "refresh_cache-079c4a41-1146-4c56-a278-70fbef0949eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:44:35 compute-0 nova_compute[182935]: 2026-01-21 23:44:35.601 182939 DEBUG nova.network.neutron [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.226 182939 DEBUG nova.network.neutron [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.571 182939 DEBUG nova.network.neutron [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.587 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Releasing lock "refresh_cache-079c4a41-1146-4c56-a278-70fbef0949eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.588 182939 DEBUG nova.compute.manager [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.639 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:36 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Jan 21 23:44:36 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 16.940s CPU time.
Jan 21 23:44:36 compute-0 systemd-machined[154182]: Machine qemu-1-instance-00000001 terminated.
Jan 21 23:44:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:44:36.674 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.836 182939 INFO nova.virt.libvirt.driver [-] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Instance destroyed successfully.
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.837 182939 DEBUG nova.objects.instance [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lazy-loading 'resources' on Instance uuid 079c4a41-1146-4c56-a278-70fbef0949eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.854 182939 INFO nova.virt.libvirt.driver [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Deleting instance files /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb_del
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.855 182939 INFO nova.virt.libvirt.driver [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Deletion of /var/lib/nova/instances/079c4a41-1146-4c56-a278-70fbef0949eb_del complete
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.924 182939 INFO nova.compute.manager [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.925 182939 DEBUG oslo.service.loopingcall [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.925 182939 DEBUG nova.compute.manager [-] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:44:36 compute-0 nova_compute[182935]: 2026-01-21 23:44:36.925 182939 DEBUG nova.network.neutron [-] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:44:37 compute-0 nova_compute[182935]: 2026-01-21 23:44:37.365 182939 DEBUG nova.network.neutron [-] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:44:37 compute-0 nova_compute[182935]: 2026-01-21 23:44:37.406 182939 DEBUG nova.network.neutron [-] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:44:37 compute-0 nova_compute[182935]: 2026-01-21 23:44:37.423 182939 INFO nova.compute.manager [-] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Took 0.50 seconds to deallocate network for instance.
Jan 21 23:44:37 compute-0 nova_compute[182935]: 2026-01-21 23:44:37.534 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:44:37 compute-0 nova_compute[182935]: 2026-01-21 23:44:37.535 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:44:37 compute-0 nova_compute[182935]: 2026-01-21 23:44:37.606 182939 DEBUG nova.compute.provider_tree [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:44:37 compute-0 nova_compute[182935]: 2026-01-21 23:44:37.624 182939 DEBUG nova.scheduler.client.report [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:44:37 compute-0 nova_compute[182935]: 2026-01-21 23:44:37.650 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:37 compute-0 nova_compute[182935]: 2026-01-21 23:44:37.679 182939 INFO nova.scheduler.client.report [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Deleted allocations for instance 079c4a41-1146-4c56-a278-70fbef0949eb
Jan 21 23:44:37 compute-0 nova_compute[182935]: 2026-01-21 23:44:37.827 182939 DEBUG oslo_concurrency.lockutils [None req-ab3a75a0-3609-46f0-98a4-50f379bf019d f92dd0c2072346c6b7e7588673443ff2 8981554bfb65485a9218dab7f347822d - - default default] Lock "079c4a41-1146-4c56-a278-70fbef0949eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:44:39 compute-0 nova_compute[182935]: 2026-01-21 23:44:39.541 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:41 compute-0 nova_compute[182935]: 2026-01-21 23:44:41.561 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039066.5597656, a56178c2-b7df-492f-816a-580ea1a80c21 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:41 compute-0 nova_compute[182935]: 2026-01-21 23:44:41.562 182939 INFO nova.compute.manager [-] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] VM Stopped (Lifecycle Event)
Jan 21 23:44:41 compute-0 nova_compute[182935]: 2026-01-21 23:44:41.583 182939 DEBUG nova.compute.manager [None req-c63531c7-e326-4af8-a2fe-2f5647922dd0 - - - - - -] [instance: a56178c2-b7df-492f-816a-580ea1a80c21] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:41 compute-0 nova_compute[182935]: 2026-01-21 23:44:41.641 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:42 compute-0 podman[212286]: 2026-01-21 23:44:42.722983324 +0000 UTC m=+0.091749700 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:44:42 compute-0 podman[212287]: 2026-01-21 23:44:42.723345863 +0000 UTC m=+0.085074541 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:44:44 compute-0 nova_compute[182935]: 2026-01-21 23:44:44.543 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:46 compute-0 nova_compute[182935]: 2026-01-21 23:44:46.644 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:49 compute-0 nova_compute[182935]: 2026-01-21 23:44:49.545 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:50 compute-0 podman[212335]: 2026-01-21 23:44:50.678141885 +0000 UTC m=+0.051989131 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:44:51 compute-0 nova_compute[182935]: 2026-01-21 23:44:51.647 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:51 compute-0 nova_compute[182935]: 2026-01-21 23:44:51.835 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039076.834544, 079c4a41-1146-4c56-a278-70fbef0949eb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:44:51 compute-0 nova_compute[182935]: 2026-01-21 23:44:51.836 182939 INFO nova.compute.manager [-] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] VM Stopped (Lifecycle Event)
Jan 21 23:44:51 compute-0 nova_compute[182935]: 2026-01-21 23:44:51.866 182939 DEBUG nova.compute.manager [None req-34e85768-82f0-42ec-b0d7-deda42311e44 - - - - - -] [instance: 079c4a41-1146-4c56-a278-70fbef0949eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:44:53 compute-0 podman[212359]: 2026-01-21 23:44:53.681736063 +0000 UTC m=+0.049659726 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 21 23:44:54 compute-0 nova_compute[182935]: 2026-01-21 23:44:54.549 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:56 compute-0 nova_compute[182935]: 2026-01-21 23:44:56.651 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:44:59 compute-0 nova_compute[182935]: 2026-01-21 23:44:59.552 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:01 compute-0 anacron[30915]: Job `cron.weekly' started
Jan 21 23:45:01 compute-0 anacron[30915]: Job `cron.weekly' terminated
Jan 21 23:45:01 compute-0 nova_compute[182935]: 2026-01-21 23:45:01.654 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:02 compute-0 podman[212380]: 2026-01-21 23:45:02.739031367 +0000 UTC m=+0.097421605 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 21 23:45:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:03.178 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:03.179 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:03.179 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:04 compute-0 nova_compute[182935]: 2026-01-21 23:45:04.587 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:04 compute-0 podman[212401]: 2026-01-21 23:45:04.730412267 +0000 UTC m=+0.095325585 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 23:45:06 compute-0 nova_compute[182935]: 2026-01-21 23:45:06.657 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:09 compute-0 nova_compute[182935]: 2026-01-21 23:45:09.591 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:11 compute-0 nova_compute[182935]: 2026-01-21 23:45:11.665 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:13 compute-0 sshd-session[212423]: Invalid user tomcat from 188.166.69.60 port 43404
Jan 21 23:45:13 compute-0 podman[212426]: 2026-01-21 23:45:13.643994245 +0000 UTC m=+0.062687087 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:45:13 compute-0 sshd-session[212423]: Connection closed by invalid user tomcat 188.166.69.60 port 43404 [preauth]
Jan 21 23:45:13 compute-0 podman[212425]: 2026-01-21 23:45:13.685454044 +0000 UTC m=+0.102906816 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 21 23:45:13 compute-0 nova_compute[182935]: 2026-01-21 23:45:13.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:14 compute-0 nova_compute[182935]: 2026-01-21 23:45:14.594 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:15 compute-0 nova_compute[182935]: 2026-01-21 23:45:15.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:16 compute-0 nova_compute[182935]: 2026-01-21 23:45:16.669 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:17 compute-0 nova_compute[182935]: 2026-01-21 23:45:17.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.855 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.856 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.857 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.857 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.857 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.858 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.885 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.886 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.886 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:18 compute-0 nova_compute[182935]: 2026-01-21 23:45:18.886 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:45:19 compute-0 nova_compute[182935]: 2026-01-21 23:45:19.039 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:45:19 compute-0 nova_compute[182935]: 2026-01-21 23:45:19.041 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5795MB free_disk=73.38315200805664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:45:19 compute-0 nova_compute[182935]: 2026-01-21 23:45:19.041 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:19 compute-0 nova_compute[182935]: 2026-01-21 23:45:19.041 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:19 compute-0 nova_compute[182935]: 2026-01-21 23:45:19.123 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:45:19 compute-0 nova_compute[182935]: 2026-01-21 23:45:19.123 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:45:19 compute-0 nova_compute[182935]: 2026-01-21 23:45:19.146 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:45:19 compute-0 nova_compute[182935]: 2026-01-21 23:45:19.173 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:45:19 compute-0 nova_compute[182935]: 2026-01-21 23:45:19.203 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:45:19 compute-0 nova_compute[182935]: 2026-01-21 23:45:19.204 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:19 compute-0 nova_compute[182935]: 2026-01-21 23:45:19.596 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:20 compute-0 nova_compute[182935]: 2026-01-21 23:45:20.645 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "eae14f31-b7b2-4d6d-8b75-16323b2c7a01" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:20 compute-0 nova_compute[182935]: 2026-01-21 23:45:20.646 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "eae14f31-b7b2-4d6d-8b75-16323b2c7a01" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:20 compute-0 nova_compute[182935]: 2026-01-21 23:45:20.815 182939 DEBUG nova.compute.manager [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:45:20 compute-0 nova_compute[182935]: 2026-01-21 23:45:20.919 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:20 compute-0 nova_compute[182935]: 2026-01-21 23:45:20.919 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:20 compute-0 nova_compute[182935]: 2026-01-21 23:45:20.924 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:45:20 compute-0 nova_compute[182935]: 2026-01-21 23:45:20.925 182939 INFO nova.compute.claims [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.056 182939 DEBUG nova.compute.provider_tree [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.072 182939 DEBUG nova.scheduler.client.report [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.099 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.100 182939 DEBUG nova.compute.manager [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.140 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.187 182939 DEBUG nova.compute.manager [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.205 182939 INFO nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.229 182939 DEBUG nova.compute.manager [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.401 182939 DEBUG nova.compute.manager [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.402 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.402 182939 INFO nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Creating image(s)
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.403 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.403 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.404 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.416 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.502 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.506 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.507 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.533 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.589 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.590 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.626 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.628 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.628 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.671 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:21 compute-0 podman[212483]: 2026-01-21 23:45:21.68565419 +0000 UTC m=+0.050440064 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.705 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.706 182939 DEBUG nova.virt.disk.api [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Checking if we can resize image /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.706 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.762 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.764 182939 DEBUG nova.virt.disk.api [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Cannot resize image /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.764 182939 DEBUG nova.objects.instance [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lazy-loading 'migration_context' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.780 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.780 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Ensure instance console log exists: /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.781 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.781 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.781 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.783 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.787 182939 WARNING nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.791 182939 DEBUG nova.virt.libvirt.host [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.791 182939 DEBUG nova.virt.libvirt.host [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.794 182939 DEBUG nova.virt.libvirt.host [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.795 182939 DEBUG nova.virt.libvirt.host [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.797 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.797 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.797 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.797 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.798 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.798 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.798 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.798 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.799 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.799 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.799 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.799 182939 DEBUG nova.virt.hardware [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.803 182939 DEBUG nova.objects.instance [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lazy-loading 'pci_devices' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.815 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:45:21 compute-0 nova_compute[182935]:   <uuid>eae14f31-b7b2-4d6d-8b75-16323b2c7a01</uuid>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   <name>instance-0000000d</name>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersAdmin275Test-server-530043739</nova:name>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:45:21</nova:creationTime>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:45:21 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:45:21 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:45:21 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:45:21 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:45:21 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:45:21 compute-0 nova_compute[182935]:         <nova:user uuid="2804acabdff34faf94b1505dceaa0282">tempest-ServersAdmin275Test-820539767-project-member</nova:user>
Jan 21 23:45:21 compute-0 nova_compute[182935]:         <nova:project uuid="be00ef0f55574d99a46f805f6d04ce41">tempest-ServersAdmin275Test-820539767</nova:project>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <system>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <entry name="serial">eae14f31-b7b2-4d6d-8b75-16323b2c7a01</entry>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <entry name="uuid">eae14f31-b7b2-4d6d-8b75-16323b2c7a01</entry>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     </system>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   <os>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   </os>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   <features>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   </features>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/console.log" append="off"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <video>
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     </video>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:45:21 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:45:21 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:45:21 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:45:21 compute-0 nova_compute[182935]: </domain>
Jan 21 23:45:21 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.869 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.870 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:45:21 compute-0 nova_compute[182935]: 2026-01-21 23:45:21.870 182939 INFO nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Using config drive
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.174 182939 INFO nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Creating config drive at /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.178 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwq0xd5em execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.304 182939 DEBUG oslo_concurrency.processutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwq0xd5em" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:22 compute-0 systemd-machined[154182]: New machine qemu-3-instance-0000000d.
Jan 21 23:45:22 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-0000000d.
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.743 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039122.7430131, eae14f31-b7b2-4d6d-8b75-16323b2c7a01 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.744 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] VM Resumed (Lifecycle Event)
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.747 182939 DEBUG nova.compute.manager [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.747 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.750 182939 INFO nova.virt.libvirt.driver [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance spawned successfully.
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.750 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.779 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.779 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.780 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.780 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.780 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.781 182939 DEBUG nova.virt.libvirt.driver [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.785 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.788 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.829 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.830 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039122.7442293, eae14f31-b7b2-4d6d-8b75-16323b2c7a01 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.830 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] VM Started (Lifecycle Event)
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.865 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.868 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.894 182939 INFO nova.compute.manager [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Took 1.49 seconds to spawn the instance on the hypervisor.
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.895 182939 DEBUG nova.compute.manager [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.898 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:45:22 compute-0 nova_compute[182935]: 2026-01-21 23:45:22.982 182939 INFO nova.compute.manager [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Took 2.10 seconds to build instance.
Jan 21 23:45:23 compute-0 nova_compute[182935]: 2026-01-21 23:45:23.002 182939 DEBUG oslo_concurrency.lockutils [None req-02ba8088-740d-4277-ad24-b4bfef5c43a5 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "eae14f31-b7b2-4d6d-8b75-16323b2c7a01" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:24 compute-0 nova_compute[182935]: 2026-01-21 23:45:24.599 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:24 compute-0 podman[212543]: 2026-01-21 23:45:24.694020532 +0000 UTC m=+0.062749628 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 23:45:25 compute-0 ovn_controller[95047]: 2026-01-21T23:45:25Z|00035|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 23:45:26 compute-0 nova_compute[182935]: 2026-01-21 23:45:26.676 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:26 compute-0 nova_compute[182935]: 2026-01-21 23:45:26.796 182939 INFO nova.compute.manager [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Rebuilding instance
Jan 21 23:45:27 compute-0 nova_compute[182935]: 2026-01-21 23:45:27.081 182939 DEBUG nova.compute.manager [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:27 compute-0 nova_compute[182935]: 2026-01-21 23:45:27.154 182939 DEBUG nova.objects.instance [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lazy-loading 'pci_requests' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:27 compute-0 nova_compute[182935]: 2026-01-21 23:45:27.166 182939 DEBUG nova.objects.instance [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lazy-loading 'pci_devices' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:27 compute-0 nova_compute[182935]: 2026-01-21 23:45:27.177 182939 DEBUG nova.objects.instance [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lazy-loading 'resources' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:27 compute-0 nova_compute[182935]: 2026-01-21 23:45:27.189 182939 DEBUG nova.objects.instance [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lazy-loading 'migration_context' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:27 compute-0 nova_compute[182935]: 2026-01-21 23:45:27.202 182939 DEBUG nova.objects.instance [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:45:27 compute-0 nova_compute[182935]: 2026-01-21 23:45:27.206 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:45:29 compute-0 nova_compute[182935]: 2026-01-21 23:45:29.530 182939 DEBUG nova.virt.libvirt.driver [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Creating tmpfile /var/lib/nova/instances/tmpm24qgw5h to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 21 23:45:29 compute-0 nova_compute[182935]: 2026-01-21 23:45:29.636 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:29 compute-0 nova_compute[182935]: 2026-01-21 23:45:29.825 182939 DEBUG nova.compute.manager [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm24qgw5h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 21 23:45:29 compute-0 nova_compute[182935]: 2026-01-21 23:45:29.854 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:45:29 compute-0 nova_compute[182935]: 2026-01-21 23:45:29.855 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:45:29 compute-0 nova_compute[182935]: 2026-01-21 23:45:29.866 182939 INFO nova.compute.rpcapi [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Jan 21 23:45:29 compute-0 nova_compute[182935]: 2026-01-21 23:45:29.867 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:45:31 compute-0 nova_compute[182935]: 2026-01-21 23:45:31.465 182939 DEBUG nova.compute.manager [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm24qgw5h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='69dceb72-db44-4bfc-9b98-cc8b39885ae7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 21 23:45:31 compute-0 nova_compute[182935]: 2026-01-21 23:45:31.512 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:45:31 compute-0 nova_compute[182935]: 2026-01-21 23:45:31.513 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquired lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:45:31 compute-0 nova_compute[182935]: 2026-01-21 23:45:31.513 182939 DEBUG nova.network.neutron [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:45:31 compute-0 nova_compute[182935]: 2026-01-21 23:45:31.678 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:32 compute-0 nova_compute[182935]: 2026-01-21 23:45:32.943 182939 DEBUG nova.network.neutron [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Updating instance_info_cache with network_info: [{"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:32 compute-0 nova_compute[182935]: 2026-01-21 23:45:32.967 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Releasing lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:45:32 compute-0 nova_compute[182935]: 2026-01-21 23:45:32.990 182939 DEBUG nova.virt.libvirt.driver [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm24qgw5h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='69dceb72-db44-4bfc-9b98-cc8b39885ae7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 21 23:45:32 compute-0 nova_compute[182935]: 2026-01-21 23:45:32.992 182939 DEBUG nova.virt.libvirt.driver [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Creating instance directory: /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 21 23:45:32 compute-0 nova_compute[182935]: 2026-01-21 23:45:32.992 182939 DEBUG nova.virt.libvirt.driver [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Creating disk.info with the contents: {'/var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk': 'qcow2', '/var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 21 23:45:32 compute-0 nova_compute[182935]: 2026-01-21 23:45:32.993 182939 DEBUG nova.virt.libvirt.driver [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 21 23:45:32 compute-0 nova_compute[182935]: 2026-01-21 23:45:32.995 182939 DEBUG nova.objects.instance [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 69dceb72-db44-4bfc-9b98-cc8b39885ae7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.036 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.118 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.120 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.122 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.145 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.227 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.229 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.273 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.275 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.276 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.353 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.354 182939 DEBUG nova.virt.disk.api [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Checking if we can resize image /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.355 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.447 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.450 182939 DEBUG nova.virt.disk.api [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Cannot resize image /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.451 182939 DEBUG nova.objects.instance [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lazy-loading 'migration_context' on Instance uuid 69dceb72-db44-4bfc-9b98-cc8b39885ae7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.481 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.524 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config 485376" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.526 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Copying file compute-2.ctlplane.example.com:/var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config to /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:45:33 compute-0 nova_compute[182935]: 2026-01-21 23:45:33.526 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:33 compute-0 podman[212584]: 2026-01-21 23:45:33.714772685 +0000 UTC m=+0.075564024 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.040 182939 DEBUG oslo_concurrency.processutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk.config /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.042 182939 DEBUG nova.virt.libvirt.driver [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.045 182939 DEBUG nova.virt.libvirt.vif [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:45:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1333035319',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1333035319',id=12,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:45:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1298204af0f241dc8b63851b2046cf5c',ramdisk_id='',reservation_id='r-45mgwptl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1063342224',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1063342224-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:45:22Z,user_data=None,user_id='553fdc065acf4000a185abac43878ab4',uuid=69dceb72-db44-4bfc-9b98-cc8b39885ae7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.046 182939 DEBUG nova.network.os_vif_util [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converting VIF {"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.048 182939 DEBUG nova.network.os_vif_util [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.049 182939 DEBUG os_vif [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.051 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.052 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.053 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.060 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.060 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbae5fde2-5e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.061 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbae5fde2-5e, col_values=(('external_ids', {'iface-id': 'bae5fde2-5ead-4ae5-90dd-1d6d468541ea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6f:ac:86', 'vm-uuid': '69dceb72-db44-4bfc-9b98-cc8b39885ae7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.064 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:34 compute-0 NetworkManager[55139]: <info>  [1769039134.0654] manager: (tapbae5fde2-5e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.070 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.073 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.075 182939 INFO os_vif [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e')
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.076 182939 DEBUG nova.virt.libvirt.driver [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.077 182939 DEBUG nova.compute.manager [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm24qgw5h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='69dceb72-db44-4bfc-9b98-cc8b39885ae7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 21 23:45:34 compute-0 nova_compute[182935]: 2026-01-21 23:45:34.636 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:35.009 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:45:35 compute-0 nova_compute[182935]: 2026-01-21 23:45:35.009 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:35.011 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:45:35 compute-0 nova_compute[182935]: 2026-01-21 23:45:35.537 182939 DEBUG nova.network.neutron [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Port bae5fde2-5ead-4ae5-90dd-1d6d468541ea updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 21 23:45:35 compute-0 nova_compute[182935]: 2026-01-21 23:45:35.552 182939 DEBUG nova.compute.manager [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm24qgw5h',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='69dceb72-db44-4bfc-9b98-cc8b39885ae7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 21 23:45:35 compute-0 podman[212617]: 2026-01-21 23:45:35.71517262 +0000 UTC m=+0.078774930 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, vendor=Red Hat, Inc., managed_by=edpm_ansible)
Jan 21 23:45:35 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 21 23:45:35 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 21 23:45:35 compute-0 kernel: tapbae5fde2-5e: entered promiscuous mode
Jan 21 23:45:35 compute-0 NetworkManager[55139]: <info>  [1769039135.9433] manager: (tapbae5fde2-5e): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Jan 21 23:45:35 compute-0 nova_compute[182935]: 2026-01-21 23:45:35.943 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:35 compute-0 ovn_controller[95047]: 2026-01-21T23:45:35Z|00036|binding|INFO|Claiming lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea for this additional chassis.
Jan 21 23:45:35 compute-0 ovn_controller[95047]: 2026-01-21T23:45:35Z|00037|binding|INFO|bae5fde2-5ead-4ae5-90dd-1d6d468541ea: Claiming fa:16:3e:6f:ac:86 10.100.0.6
Jan 21 23:45:35 compute-0 nova_compute[182935]: 2026-01-21 23:45:35.948 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:35 compute-0 systemd-udevd[212673]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:45:35 compute-0 systemd-machined[154182]: New machine qemu-4-instance-0000000c.
Jan 21 23:45:35 compute-0 NetworkManager[55139]: <info>  [1769039135.9950] device (tapbae5fde2-5e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:45:35 compute-0 NetworkManager[55139]: <info>  [1769039135.9962] device (tapbae5fde2-5e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:45:36 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-0000000c.
Jan 21 23:45:36 compute-0 nova_compute[182935]: 2026-01-21 23:45:36.036 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:36 compute-0 ovn_controller[95047]: 2026-01-21T23:45:36Z|00038|binding|INFO|Setting lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea ovn-installed in OVS
Jan 21 23:45:36 compute-0 nova_compute[182935]: 2026-01-21 23:45:36.042 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:36 compute-0 nova_compute[182935]: 2026-01-21 23:45:36.044 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:36 compute-0 nova_compute[182935]: 2026-01-21 23:45:36.518 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039136.518282, 69dceb72-db44-4bfc-9b98-cc8b39885ae7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:36 compute-0 nova_compute[182935]: 2026-01-21 23:45:36.519 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] VM Started (Lifecycle Event)
Jan 21 23:45:36 compute-0 nova_compute[182935]: 2026-01-21 23:45:36.547 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:37 compute-0 nova_compute[182935]: 2026-01-21 23:45:37.275 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 23:45:37 compute-0 nova_compute[182935]: 2026-01-21 23:45:37.318 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039137.3180654, 69dceb72-db44-4bfc-9b98-cc8b39885ae7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:37 compute-0 nova_compute[182935]: 2026-01-21 23:45:37.319 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] VM Resumed (Lifecycle Event)
Jan 21 23:45:37 compute-0 nova_compute[182935]: 2026-01-21 23:45:37.346 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:37 compute-0 nova_compute[182935]: 2026-01-21 23:45:37.351 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:45:37 compute-0 nova_compute[182935]: 2026-01-21 23:45:37.378 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 21 23:45:38 compute-0 ovn_controller[95047]: 2026-01-21T23:45:38Z|00039|binding|INFO|Claiming lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea for this chassis.
Jan 21 23:45:38 compute-0 ovn_controller[95047]: 2026-01-21T23:45:38Z|00040|binding|INFO|bae5fde2-5ead-4ae5-90dd-1d6d468541ea: Claiming fa:16:3e:6f:ac:86 10.100.0.6
Jan 21 23:45:38 compute-0 ovn_controller[95047]: 2026-01-21T23:45:38Z|00041|binding|INFO|Setting lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea up in Southbound
Jan 21 23:45:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:38.915 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:ac:86 10.100.0.6'], port_security=['fa:16:3e:6f:ac:86 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '11', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c50c611d-d348-436f-bd12-bc6add278699, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bae5fde2-5ead-4ae5-90dd-1d6d468541ea) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:45:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:38.917 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bae5fde2-5ead-4ae5-90dd-1d6d468541ea in datapath b7816b8e-52c1-4d60-84f7-524ebe7dfa5c bound to our chassis
Jan 21 23:45:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:38.920 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b7816b8e-52c1-4d60-84f7-524ebe7dfa5c
Jan 21 23:45:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:38.938 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d57de134-7a7b-4ca0-a422-cd5419841952]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:38.939 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb7816b8e-51 in ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:45:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:38.941 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb7816b8e-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:45:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:38.942 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4ee698bc-8a77-4acc-9a66-67844c0535d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:38.943 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[627976c5-2ad1-4842-86e6-3a089ca45de3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:38.956 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[f5fd13a1-c935-45f4-825e-a4ad18df6eca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:38.984 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[21a155a5-fa7c-43a7-a0cf-5c9b91750512]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.020 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9dc0a837-ec38-45b4-bea2-0948aa466cc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 NetworkManager[55139]: <info>  [1769039139.0350] manager: (tapb7816b8e-50): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.034 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a13edd-b14b-4dc9-a5e2-bc2d4ca20b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 systemd-udevd[212708]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:45:39 compute-0 nova_compute[182935]: 2026-01-21 23:45:39.065 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.076 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa8025c-8320-4c39-b247-f1fe07dac603]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.082 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[990d6356-e0df-42aa-99af-5b0ac4bd7acb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 nova_compute[182935]: 2026-01-21 23:45:39.086 182939 INFO nova.compute.manager [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Post operation of migration started
Jan 21 23:45:39 compute-0 NetworkManager[55139]: <info>  [1769039139.1209] device (tapb7816b8e-50): carrier: link connected
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.127 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f16c99ef-6b73-4cd8-95bd-c768cfd97e45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.147 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2f92cae8-3ebd-49dc-9a7a-506aa1fa3b35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb7816b8e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:20:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364185, 'reachable_time': 19432, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212727, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.165 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d9237b94-bbe5-47b1-b812-2b7871add406]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb1:20b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364185, 'tstamp': 364185}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212728, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.188 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[442d0639-ef83-45e4-ae84-5be2386eb045]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb7816b8e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:20:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364185, 'reachable_time': 19432, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212729, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.233 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a87d9d7a-51da-469c-a1b7-184629065559]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.312 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a9008996-cb40-40bf-b7ab-d26a2bf190bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.314 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7816b8e-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.315 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.316 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7816b8e-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:45:39 compute-0 kernel: tapb7816b8e-50: entered promiscuous mode
Jan 21 23:45:39 compute-0 nova_compute[182935]: 2026-01-21 23:45:39.318 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:39 compute-0 NetworkManager[55139]: <info>  [1769039139.3201] manager: (tapb7816b8e-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.324 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb7816b8e-50, col_values=(('external_ids', {'iface-id': 'ecebff42-11cb-48b4-9c3d-966172998a49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:45:39 compute-0 ovn_controller[95047]: 2026-01-21T23:45:39Z|00042|binding|INFO|Releasing lport ecebff42-11cb-48b4-9c3d-966172998a49 from this chassis (sb_readonly=0)
Jan 21 23:45:39 compute-0 nova_compute[182935]: 2026-01-21 23:45:39.325 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.326 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.327 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0c77360b-f690-4921-a4cd-6a125b316545]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.328 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID b7816b8e-52c1-4d60-84f7-524ebe7dfa5c
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:45:39 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:39.329 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'env', 'PROCESS_TAG=haproxy-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:45:39 compute-0 nova_compute[182935]: 2026-01-21 23:45:39.337 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:39 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 21 23:45:39 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000d.scope: Consumed 12.764s CPU time.
Jan 21 23:45:39 compute-0 systemd-machined[154182]: Machine qemu-3-instance-0000000d terminated.
Jan 21 23:45:39 compute-0 nova_compute[182935]: 2026-01-21 23:45:39.639 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:39 compute-0 nova_compute[182935]: 2026-01-21 23:45:39.711 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:45:39 compute-0 nova_compute[182935]: 2026-01-21 23:45:39.712 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquired lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:45:39 compute-0 nova_compute[182935]: 2026-01-21 23:45:39.713 182939 DEBUG nova.network.neutron [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:45:39 compute-0 podman[212770]: 2026-01-21 23:45:39.749111619 +0000 UTC m=+0.054007800 container create c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:45:39 compute-0 systemd[1]: Started libpod-conmon-c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088.scope.
Jan 21 23:45:39 compute-0 podman[212770]: 2026-01-21 23:45:39.722994536 +0000 UTC m=+0.027890737 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:45:39 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:45:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0771fa1ecb758cd107c2ae3a6e37629369844487d10fce55aa4fb6256741aa8e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:45:39 compute-0 podman[212770]: 2026-01-21 23:45:39.842453456 +0000 UTC m=+0.147349667 container init c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:45:39 compute-0 podman[212770]: 2026-01-21 23:45:39.847691531 +0000 UTC m=+0.152587712 container start c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:45:39 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[212786]: [NOTICE]   (212790) : New worker (212792) forked
Jan 21 23:45:39 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[212786]: [NOTICE]   (212790) : Loading success.
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.292 182939 INFO nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance shutdown successfully after 13 seconds.
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.305 182939 INFO nova.virt.libvirt.driver [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance destroyed successfully.
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.311 182939 INFO nova.virt.libvirt.driver [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance destroyed successfully.
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.311 182939 INFO nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Deleting instance files /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01_del
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.312 182939 INFO nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Deletion of /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01_del complete
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.757 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.758 182939 INFO nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Creating image(s)
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.758 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.759 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.759 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.760 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:40 compute-0 nova_compute[182935]: 2026-01-21 23:45:40.760 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:45:43.013 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:45:44 compute-0 nova_compute[182935]: 2026-01-21 23:45:44.067 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:44 compute-0 nova_compute[182935]: 2026-01-21 23:45:44.641 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:44 compute-0 podman[212802]: 2026-01-21 23:45:44.721560003 +0000 UTC m=+0.089696785 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:45:44 compute-0 podman[212801]: 2026-01-21 23:45:44.736789518 +0000 UTC m=+0.100776691 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.350 182939 DEBUG nova.network.neutron [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Updating instance_info_cache with network_info: [{"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.373 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Releasing lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.448 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.449 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.449 182939 DEBUG oslo_concurrency.lockutils [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.455 182939 INFO nova.virt.libvirt.driver [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 21 23:45:45 compute-0 virtqemud[182477]: Domain id=4 name='instance-0000000c' uuid=69dceb72-db44-4bfc-9b98-cc8b39885ae7 is tainted: custom-monitor
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.483 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.542 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.544 182939 DEBUG nova.virt.images [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] 3e1dda74-3c6a-4d29-8792-32134d1c36c5 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.545 182939 DEBUG nova.privsep.utils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.545 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.694 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.part /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.699 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.756 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74.converted --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.758 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 4.998s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.770 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.829 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.831 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.833 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.858 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.916 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.917 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.957 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.958 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:45 compute-0 nova_compute[182935]: 2026-01-21 23:45:45.959 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.020 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.021 182939 DEBUG nova.virt.disk.api [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Checking if we can resize image /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.022 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.079 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.080 182939 DEBUG nova.virt.disk.api [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Cannot resize image /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.081 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.082 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Ensure instance console log exists: /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.082 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.083 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.083 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.085 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.091 182939 WARNING nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.097 182939 DEBUG nova.virt.libvirt.host [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.098 182939 DEBUG nova.virt.libvirt.host [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.102 182939 DEBUG nova.virt.libvirt.host [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.103 182939 DEBUG nova.virt.libvirt.host [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.104 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.105 182939 DEBUG nova.virt.hardware [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.105 182939 DEBUG nova.virt.hardware [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.105 182939 DEBUG nova.virt.hardware [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.106 182939 DEBUG nova.virt.hardware [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.106 182939 DEBUG nova.virt.hardware [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.106 182939 DEBUG nova.virt.hardware [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.106 182939 DEBUG nova.virt.hardware [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.107 182939 DEBUG nova.virt.hardware [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.107 182939 DEBUG nova.virt.hardware [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.107 182939 DEBUG nova.virt.hardware [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.107 182939 DEBUG nova.virt.hardware [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.108 182939 DEBUG nova.objects.instance [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lazy-loading 'vcpu_model' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.138 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:45:46 compute-0 nova_compute[182935]:   <uuid>eae14f31-b7b2-4d6d-8b75-16323b2c7a01</uuid>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   <name>instance-0000000d</name>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersAdmin275Test-server-530043739</nova:name>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:45:46</nova:creationTime>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:45:46 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:45:46 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:45:46 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:45:46 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:45:46 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:45:46 compute-0 nova_compute[182935]:         <nova:user uuid="2804acabdff34faf94b1505dceaa0282">tempest-ServersAdmin275Test-820539767-project-member</nova:user>
Jan 21 23:45:46 compute-0 nova_compute[182935]:         <nova:project uuid="be00ef0f55574d99a46f805f6d04ce41">tempest-ServersAdmin275Test-820539767</nova:project>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="3e1dda74-3c6a-4d29-8792-32134d1c36c5"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <system>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <entry name="serial">eae14f31-b7b2-4d6d-8b75-16323b2c7a01</entry>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <entry name="uuid">eae14f31-b7b2-4d6d-8b75-16323b2c7a01</entry>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     </system>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   <os>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   </os>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   <features>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   </features>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/console.log" append="off"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <video>
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     </video>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:45:46 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:45:46 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:45:46 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:45:46 compute-0 nova_compute[182935]: </domain>
Jan 21 23:45:46 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.202 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.202 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.203 182939 INFO nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Using config drive
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.230 182939 DEBUG nova.objects.instance [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lazy-loading 'ec2_ids' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.392 182939 DEBUG nova.objects.instance [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lazy-loading 'keypairs' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:46 compute-0 nova_compute[182935]: 2026-01-21 23:45:46.464 182939 INFO nova.virt.libvirt.driver [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 21 23:45:47 compute-0 nova_compute[182935]: 2026-01-21 23:45:47.402 182939 INFO nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Creating config drive at /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config
Jan 21 23:45:47 compute-0 nova_compute[182935]: 2026-01-21 23:45:47.407 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4ndu8xci execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:47 compute-0 nova_compute[182935]: 2026-01-21 23:45:47.470 182939 INFO nova.virt.libvirt.driver [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 21 23:45:47 compute-0 nova_compute[182935]: 2026-01-21 23:45:47.476 182939 DEBUG nova.compute.manager [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:47 compute-0 nova_compute[182935]: 2026-01-21 23:45:47.528 182939 DEBUG nova.objects.instance [None req-38b988f7-84b5-4621-a7d5-edc5582b4b80 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:45:47 compute-0 nova_compute[182935]: 2026-01-21 23:45:47.539 182939 DEBUG oslo_concurrency.processutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4ndu8xci" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:47 compute-0 systemd-machined[154182]: New machine qemu-5-instance-0000000d.
Jan 21 23:45:47 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-0000000d.
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.443 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for eae14f31-b7b2-4d6d-8b75-16323b2c7a01 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.445 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039148.4421768, eae14f31-b7b2-4d6d-8b75-16323b2c7a01 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.446 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] VM Resumed (Lifecycle Event)
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.450 182939 DEBUG nova.compute.manager [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.451 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.457 182939 INFO nova.virt.libvirt.driver [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance spawned successfully.
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.458 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.504 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.510 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.523 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.524 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.525 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.526 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.527 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.528 182939 DEBUG nova.virt.libvirt.driver [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.588 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.589 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039148.4442651, eae14f31-b7b2-4d6d-8b75-16323b2c7a01 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.589 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] VM Started (Lifecycle Event)
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.670 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.676 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.707 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.748 182939 DEBUG nova.compute.manager [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.916 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.917 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:45:48 compute-0 nova_compute[182935]: 2026-01-21 23:45:48.918 182939 DEBUG nova.objects.instance [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:45:49 compute-0 nova_compute[182935]: 2026-01-21 23:45:49.069 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:49 compute-0 nova_compute[182935]: 2026-01-21 23:45:49.096 182939 DEBUG oslo_concurrency.lockutils [None req-8239d837-19aa-40a5-a8f8-636e590e5421 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:45:49 compute-0 nova_compute[182935]: 2026-01-21 23:45:49.644 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:52 compute-0 podman[212908]: 2026-01-21 23:45:52.760065435 +0000 UTC m=+0.109939029 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:45:53 compute-0 nova_compute[182935]: 2026-01-21 23:45:53.238 182939 INFO nova.compute.manager [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Rebuilding instance
Jan 21 23:45:53 compute-0 nova_compute[182935]: 2026-01-21 23:45:53.768 182939 DEBUG nova.compute.manager [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:45:53 compute-0 nova_compute[182935]: 2026-01-21 23:45:53.889 182939 DEBUG nova.objects.instance [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lazy-loading 'pci_requests' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:53 compute-0 nova_compute[182935]: 2026-01-21 23:45:53.917 182939 DEBUG nova.objects.instance [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lazy-loading 'pci_devices' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:53 compute-0 nova_compute[182935]: 2026-01-21 23:45:53.943 182939 DEBUG nova.objects.instance [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lazy-loading 'resources' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:53 compute-0 nova_compute[182935]: 2026-01-21 23:45:53.965 182939 DEBUG nova.objects.instance [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lazy-loading 'migration_context' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:45:53 compute-0 nova_compute[182935]: 2026-01-21 23:45:53.982 182939 DEBUG nova.objects.instance [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:45:53 compute-0 nova_compute[182935]: 2026-01-21 23:45:53.987 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:45:54 compute-0 nova_compute[182935]: 2026-01-21 23:45:54.077 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:54 compute-0 nova_compute[182935]: 2026-01-21 23:45:54.458 182939 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Check if temp file /var/lib/nova/instances/tmpf6wm0dwd exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 23:45:54 compute-0 nova_compute[182935]: 2026-01-21 23:45:54.459 182939 DEBUG nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpf6wm0dwd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='69dceb72-db44-4bfc-9b98-cc8b39885ae7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 23:45:54 compute-0 nova_compute[182935]: 2026-01-21 23:45:54.647 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:55 compute-0 podman[212932]: 2026-01-21 23:45:55.692051819 +0000 UTC m=+0.063882049 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:45:55 compute-0 rsyslogd[1005]: imjournal: 3254 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 21 23:45:56 compute-0 nova_compute[182935]: 2026-01-21 23:45:56.151 182939 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:56 compute-0 nova_compute[182935]: 2026-01-21 23:45:56.238 182939 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:56 compute-0 nova_compute[182935]: 2026-01-21 23:45:56.241 182939 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:45:56 compute-0 nova_compute[182935]: 2026-01-21 23:45:56.303 182939 DEBUG oslo_concurrency.processutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:45:57 compute-0 sshd-session[212957]: Invalid user tomcat from 188.166.69.60 port 51466
Jan 21 23:45:57 compute-0 sshd-session[212957]: Connection closed by invalid user tomcat 188.166.69.60 port 51466 [preauth]
Jan 21 23:45:59 compute-0 nova_compute[182935]: 2026-01-21 23:45:59.081 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:45:59 compute-0 nova_compute[182935]: 2026-01-21 23:45:59.650 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:01 compute-0 sshd-session[212971]: Accepted publickey for nova from 192.168.122.102 port 45334 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:46:01 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 23:46:01 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 23:46:01 compute-0 systemd-logind[784]: New session 26 of user nova.
Jan 21 23:46:01 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 23:46:01 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 23:46:01 compute-0 systemd[212975]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:46:01 compute-0 systemd[212975]: Queued start job for default target Main User Target.
Jan 21 23:46:01 compute-0 systemd[212975]: Created slice User Application Slice.
Jan 21 23:46:01 compute-0 systemd[212975]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:46:01 compute-0 systemd[212975]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:46:01 compute-0 systemd[212975]: Reached target Paths.
Jan 21 23:46:01 compute-0 systemd[212975]: Reached target Timers.
Jan 21 23:46:01 compute-0 systemd[212975]: Starting D-Bus User Message Bus Socket...
Jan 21 23:46:01 compute-0 systemd[212975]: Starting Create User's Volatile Files and Directories...
Jan 21 23:46:01 compute-0 systemd[212975]: Finished Create User's Volatile Files and Directories.
Jan 21 23:46:01 compute-0 systemd[212975]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:46:01 compute-0 systemd[212975]: Reached target Sockets.
Jan 21 23:46:01 compute-0 systemd[212975]: Reached target Basic System.
Jan 21 23:46:01 compute-0 systemd[212975]: Reached target Main User Target.
Jan 21 23:46:01 compute-0 systemd[212975]: Startup finished in 134ms.
Jan 21 23:46:01 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 23:46:01 compute-0 systemd[1]: Started Session 26 of User nova.
Jan 21 23:46:01 compute-0 sshd-session[212971]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:46:01 compute-0 sshd-session[212989]: Received disconnect from 192.168.122.102 port 45334:11: disconnected by user
Jan 21 23:46:01 compute-0 sshd-session[212989]: Disconnected from user nova 192.168.122.102 port 45334
Jan 21 23:46:01 compute-0 sshd-session[212971]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:46:01 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Jan 21 23:46:01 compute-0 systemd-logind[784]: Session 26 logged out. Waiting for processes to exit.
Jan 21 23:46:01 compute-0 systemd-logind[784]: Removed session 26.
Jan 21 23:46:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:03.179 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:03.181 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:03.182 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:03 compute-0 nova_compute[182935]: 2026-01-21 23:46:03.457 182939 DEBUG nova.compute.manager [req-b963aa59-f3f3-41e2-95c7-a2d2bf7011df req-4f526522-f7bc-4804-86a9-fa2bae5a7367 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:03 compute-0 nova_compute[182935]: 2026-01-21 23:46:03.457 182939 DEBUG oslo_concurrency.lockutils [req-b963aa59-f3f3-41e2-95c7-a2d2bf7011df req-4f526522-f7bc-4804-86a9-fa2bae5a7367 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:03 compute-0 nova_compute[182935]: 2026-01-21 23:46:03.458 182939 DEBUG oslo_concurrency.lockutils [req-b963aa59-f3f3-41e2-95c7-a2d2bf7011df req-4f526522-f7bc-4804-86a9-fa2bae5a7367 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:03 compute-0 nova_compute[182935]: 2026-01-21 23:46:03.458 182939 DEBUG oslo_concurrency.lockutils [req-b963aa59-f3f3-41e2-95c7-a2d2bf7011df req-4f526522-f7bc-4804-86a9-fa2bae5a7367 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:03 compute-0 nova_compute[182935]: 2026-01-21 23:46:03.458 182939 DEBUG nova.compute.manager [req-b963aa59-f3f3-41e2-95c7-a2d2bf7011df req-4f526522-f7bc-4804-86a9-fa2bae5a7367 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] No waiting events found dispatching network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:03 compute-0 nova_compute[182935]: 2026-01-21 23:46:03.458 182939 DEBUG nova.compute.manager [req-b963aa59-f3f3-41e2-95c7-a2d2bf7011df req-4f526522-f7bc-4804-86a9-fa2bae5a7367 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.042 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.083 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.119 182939 INFO nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Took 7.81 seconds for pre_live_migration on destination host compute-2.ctlplane.example.com.
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.119 182939 DEBUG nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.136 182939 DEBUG nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpf6wm0dwd',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='69dceb72-db44-4bfc-9b98-cc8b39885ae7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d33d7270-640c-4eb6-9e9e-624ebad15196),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.157 182939 DEBUG nova.objects.instance [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lazy-loading 'migration_context' on Instance uuid 69dceb72-db44-4bfc-9b98-cc8b39885ae7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.158 182939 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.159 182939 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.159 182939 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.177 182939 DEBUG nova.virt.libvirt.vif [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:45:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1333035319',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1333035319',id=12,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:45:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1298204af0f241dc8b63851b2046cf5c',ramdisk_id='',reservation_id='r-45mgwptl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1063342224',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1063342224-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:45:47Z,user_data=None,user_id='553fdc065acf4000a185abac43878ab4',uuid=69dceb72-db44-4bfc-9b98-cc8b39885ae7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.177 182939 DEBUG nova.network.os_vif_util [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converting VIF {"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.178 182939 DEBUG nova.network.os_vif_util [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.179 182939 DEBUG nova.virt.libvirt.migration [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 23:46:04 compute-0 nova_compute[182935]:   <mac address="fa:16:3e:6f:ac:86"/>
Jan 21 23:46:04 compute-0 nova_compute[182935]:   <model type="virtio"/>
Jan 21 23:46:04 compute-0 nova_compute[182935]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:46:04 compute-0 nova_compute[182935]:   <mtu size="1442"/>
Jan 21 23:46:04 compute-0 nova_compute[182935]:   <target dev="tapbae5fde2-5e"/>
Jan 21 23:46:04 compute-0 nova_compute[182935]: </interface>
Jan 21 23:46:04 compute-0 nova_compute[182935]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.179 182939 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.652 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.661 182939 DEBUG nova.virt.libvirt.migration [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.662 182939 INFO nova.virt.libvirt.migration [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 23:46:04 compute-0 podman[212991]: 2026-01-21 23:46:04.701873759 +0000 UTC m=+0.063589500 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 21 23:46:04 compute-0 nova_compute[182935]: 2026-01-21 23:46:04.820 182939 INFO nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 23:46:05 compute-0 nova_compute[182935]: 2026-01-21 23:46:05.323 182939 DEBUG nova.virt.libvirt.migration [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:46:05 compute-0 nova_compute[182935]: 2026-01-21 23:46:05.324 182939 DEBUG nova.virt.libvirt.migration [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:46:05 compute-0 nova_compute[182935]: 2026-01-21 23:46:05.829 182939 DEBUG nova.virt.libvirt.migration [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:46:05 compute-0 nova_compute[182935]: 2026-01-21 23:46:05.830 182939 DEBUG nova.virt.libvirt.migration [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.049 182939 DEBUG nova.compute.manager [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.050 182939 DEBUG oslo_concurrency.lockutils [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.050 182939 DEBUG oslo_concurrency.lockutils [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.050 182939 DEBUG oslo_concurrency.lockutils [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.050 182939 DEBUG nova.compute.manager [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] No waiting events found dispatching network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.051 182939 WARNING nova.compute.manager [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received unexpected event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea for instance with vm_state active and task_state migrating.
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.051 182939 DEBUG nova.compute.manager [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-changed-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.051 182939 DEBUG nova.compute.manager [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Refreshing instance network info cache due to event network-changed-bae5fde2-5ead-4ae5-90dd-1d6d468541ea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.051 182939 DEBUG oslo_concurrency.lockutils [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.052 182939 DEBUG oslo_concurrency.lockutils [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.052 182939 DEBUG nova.network.neutron [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Refreshing network info cache for port bae5fde2-5ead-4ae5-90dd-1d6d468541ea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:46:06 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 21 23:46:06 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000d.scope: Consumed 13.231s CPU time.
Jan 21 23:46:06 compute-0 systemd-machined[154182]: Machine qemu-5-instance-0000000d terminated.
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.257 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039166.256307, 69dceb72-db44-4bfc-9b98-cc8b39885ae7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.258 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] VM Paused (Lifecycle Event)
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.299 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.303 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:06 compute-0 podman[213018]: 2026-01-21 23:46:06.307247459 +0000 UTC m=+0.067407303 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.expose-services=)
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.334 182939 DEBUG nova.virt.libvirt.migration [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.334 182939 DEBUG nova.virt.libvirt.migration [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.339 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 23:46:06 compute-0 kernel: tapbae5fde2-5e (unregistering): left promiscuous mode
Jan 21 23:46:06 compute-0 NetworkManager[55139]: <info>  [1769039166.4256] device (tapbae5fde2-5e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.476 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:06 compute-0 ovn_controller[95047]: 2026-01-21T23:46:06Z|00043|binding|INFO|Releasing lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea from this chassis (sb_readonly=0)
Jan 21 23:46:06 compute-0 ovn_controller[95047]: 2026-01-21T23:46:06Z|00044|binding|INFO|Setting lport bae5fde2-5ead-4ae5-90dd-1d6d468541ea down in Southbound
Jan 21 23:46:06 compute-0 ovn_controller[95047]: 2026-01-21T23:46:06Z|00045|binding|INFO|Removing iface tapbae5fde2-5e ovn-installed in OVS
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.484 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.492 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:ac:86 10.100.0.6'], port_security=['fa:16:3e:6f:ac:86 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-2.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'ce4b296c-26ac-415a-aa87-9634754eb3d3'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '69dceb72-db44-4bfc-9b98-cc8b39885ae7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '18', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c50c611d-d348-436f-bd12-bc6add278699, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bae5fde2-5ead-4ae5-90dd-1d6d468541ea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.493 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bae5fde2-5ead-4ae5-90dd-1d6d468541ea in datapath b7816b8e-52c1-4d60-84f7-524ebe7dfa5c unbound from our chassis
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.494 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.495 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.496 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[22e268cd-f209-4e48-8f07-d390110693ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.497 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c namespace which is not needed anymore
Jan 21 23:46:06 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 21 23:46:06 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Consumed 2.682s CPU time.
Jan 21 23:46:06 compute-0 systemd-machined[154182]: Machine qemu-4-instance-0000000c terminated.
Jan 21 23:46:06 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[212786]: [NOTICE]   (212790) : haproxy version is 2.8.14-c23fe91
Jan 21 23:46:06 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[212786]: [NOTICE]   (212790) : path to executable is /usr/sbin/haproxy
Jan 21 23:46:06 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[212786]: [WARNING]  (212790) : Exiting Master process...
Jan 21 23:46:06 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[212786]: [ALERT]    (212790) : Current worker (212792) exited with code 143 (Terminated)
Jan 21 23:46:06 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[212786]: [WARNING]  (212790) : All workers exited. Exiting... (0)
Jan 21 23:46:06 compute-0 systemd[1]: libpod-c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088.scope: Deactivated successfully.
Jan 21 23:46:06 compute-0 NetworkManager[55139]: <info>  [1769039166.6242] manager: (tapbae5fde2-5e): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Jan 21 23:46:06 compute-0 podman[213070]: 2026-01-21 23:46:06.624054812 +0000 UTC m=+0.041715027 container died c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.625 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.629 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088-userdata-shm.mount: Deactivated successfully.
Jan 21 23:46:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-0771fa1ecb758cd107c2ae3a6e37629369844487d10fce55aa4fb6256741aa8e-merged.mount: Deactivated successfully.
Jan 21 23:46:06 compute-0 podman[213070]: 2026-01-21 23:46:06.662179304 +0000 UTC m=+0.079839509 container cleanup c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.664 182939 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.665 182939 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.665 182939 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 23:46:06 compute-0 systemd[1]: libpod-conmon-c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088.scope: Deactivated successfully.
Jan 21 23:46:06 compute-0 podman[213114]: 2026-01-21 23:46:06.72518083 +0000 UTC m=+0.038911951 container remove c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.730 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4f65d2b2-fa3a-4c84-9218-95f44454ffc4]: (4, ('Wed Jan 21 11:46:06 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c (c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088)\nc0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088\nWed Jan 21 11:46:06 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c (c0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088)\nc0ff3301dd962002ce9592f468f0fb4783ff2b9c07679cc2b16edfa2ef305088\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.732 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fb00564a-db21-41ec-8c17-a30f97ae6bd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.733 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7816b8e-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.735 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:06 compute-0 kernel: tapb7816b8e-50: left promiscuous mode
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.764 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.773 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b219c218-a5d9-4446-80f1-1346289534d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.788 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5cdd4068-ebed-4827-9b74-2793f62acfda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.789 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[81583509-e4a4-4e1a-a51e-0fa62fe2080f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.812 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[62486f0b-7341-4f71-babe-6e5cfafd3dfc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364175, 'reachable_time': 42491, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213133, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:06 compute-0 systemd[1]: run-netns-ovnmeta\x2db7816b8e\x2d52c1\x2d4d60\x2d84f7\x2d524ebe7dfa5c.mount: Deactivated successfully.
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.816 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:46:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:06.816 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[1c1ba30f-285b-423d-bc1e-da5ca42b3939]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.836 182939 DEBUG nova.virt.libvirt.guest [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '69dceb72-db44-4bfc-9b98-cc8b39885ae7' (instance-0000000c) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.837 182939 INFO nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Migration operation has completed
Jan 21 23:46:06 compute-0 nova_compute[182935]: 2026-01-21 23:46:06.837 182939 INFO nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] _post_live_migration() is started..
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.061 182939 INFO nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance shutdown successfully after 13 seconds.
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.066 182939 INFO nova.virt.libvirt.driver [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance destroyed successfully.
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.071 182939 INFO nova.virt.libvirt.driver [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance destroyed successfully.
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.071 182939 INFO nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Deleting instance files /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01_del
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.072 182939 INFO nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Deletion of /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01_del complete
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.616 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.617 182939 INFO nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Creating image(s)
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.617 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Acquiring lock "/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.617 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lock "/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.618 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lock "/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.633 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.656 182939 DEBUG nova.compute.manager [req-8bf39d49-f354-4663-bafb-855603414588 req-efcca456-6471-452a-8df5-5530649b1278 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.657 182939 DEBUG oslo_concurrency.lockutils [req-8bf39d49-f354-4663-bafb-855603414588 req-efcca456-6471-452a-8df5-5530649b1278 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.657 182939 DEBUG oslo_concurrency.lockutils [req-8bf39d49-f354-4663-bafb-855603414588 req-efcca456-6471-452a-8df5-5530649b1278 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.657 182939 DEBUG oslo_concurrency.lockutils [req-8bf39d49-f354-4663-bafb-855603414588 req-efcca456-6471-452a-8df5-5530649b1278 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.657 182939 DEBUG nova.compute.manager [req-8bf39d49-f354-4663-bafb-855603414588 req-efcca456-6471-452a-8df5-5530649b1278 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] No waiting events found dispatching network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.658 182939 DEBUG nova.compute.manager [req-8bf39d49-f354-4663-bafb-855603414588 req-efcca456-6471-452a-8df5-5530649b1278 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.709 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.710 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.710 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.721 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.787 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.788 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.823 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.824 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.824 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.880 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.881 182939 DEBUG nova.virt.disk.api [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Checking if we can resize image /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.881 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.977 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.978 182939 DEBUG nova.virt.disk.api [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Cannot resize image /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.979 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.979 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Ensure instance console log exists: /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.980 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.980 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.981 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.982 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.988 182939 WARNING nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.997 182939 DEBUG nova.virt.libvirt.host [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:46:07 compute-0 nova_compute[182935]: 2026-01-21 23:46:07.998 182939 DEBUG nova.virt.libvirt.host [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.003 182939 DEBUG nova.virt.libvirt.host [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.003 182939 DEBUG nova.virt.libvirt.host [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.005 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.005 182939 DEBUG nova.virt.hardware [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.005 182939 DEBUG nova.virt.hardware [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.006 182939 DEBUG nova.virt.hardware [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.006 182939 DEBUG nova.virt.hardware [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.006 182939 DEBUG nova.virt.hardware [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.007 182939 DEBUG nova.virt.hardware [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.007 182939 DEBUG nova.virt.hardware [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.007 182939 DEBUG nova.virt.hardware [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.007 182939 DEBUG nova.virt.hardware [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.008 182939 DEBUG nova.virt.hardware [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.008 182939 DEBUG nova.virt.hardware [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.008 182939 DEBUG nova.objects.instance [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lazy-loading 'vcpu_model' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.048 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:46:08 compute-0 nova_compute[182935]:   <uuid>eae14f31-b7b2-4d6d-8b75-16323b2c7a01</uuid>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   <name>instance-0000000d</name>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersAdmin275Test-server-530043739</nova:name>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:46:07</nova:creationTime>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:46:08 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:46:08 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:46:08 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:46:08 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:46:08 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:46:08 compute-0 nova_compute[182935]:         <nova:user uuid="2804acabdff34faf94b1505dceaa0282">tempest-ServersAdmin275Test-820539767-project-member</nova:user>
Jan 21 23:46:08 compute-0 nova_compute[182935]:         <nova:project uuid="be00ef0f55574d99a46f805f6d04ce41">tempest-ServersAdmin275Test-820539767</nova:project>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <system>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <entry name="serial">eae14f31-b7b2-4d6d-8b75-16323b2c7a01</entry>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <entry name="uuid">eae14f31-b7b2-4d6d-8b75-16323b2c7a01</entry>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     </system>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   <os>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   </os>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   <features>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   </features>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/console.log" append="off"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <video>
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     </video>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:46:08 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:46:08 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:46:08 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:46:08 compute-0 nova_compute[182935]: </domain>
Jan 21 23:46:08 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.137 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.138 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.139 182939 INFO nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Using config drive
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.184 182939 DEBUG nova.objects.instance [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lazy-loading 'ec2_ids' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.233 182939 DEBUG nova.objects.instance [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lazy-loading 'keypairs' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.674 182939 INFO nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Creating config drive at /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.678 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbalp592n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:08 compute-0 nova_compute[182935]: 2026-01-21 23:46:08.803 182939 DEBUG oslo_concurrency.processutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbalp592n" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:08 compute-0 systemd-machined[154182]: New machine qemu-6-instance-0000000d.
Jan 21 23:46:08 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-0000000d.
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.085 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.139 182939 DEBUG nova.network.neutron [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Activated binding for port bae5fde2-5ead-4ae5-90dd-1d6d468541ea and host compute-2.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.140 182939 DEBUG nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.141 182939 DEBUG nova.virt.libvirt.vif [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:45:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1333035319',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1333035319',id=12,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:45:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1298204af0f241dc8b63851b2046cf5c',ramdisk_id='',reservation_id='r-45mgwptl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1063342224',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1063342224-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:45:53Z,user_data=None,user_id='553fdc065acf4000a185abac43878ab4',uuid=69dceb72-db44-4bfc-9b98-cc8b39885ae7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.142 182939 DEBUG nova.network.os_vif_util [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converting VIF {"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.143 182939 DEBUG nova.network.os_vif_util [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.143 182939 DEBUG os_vif [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.146 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.147 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbae5fde2-5e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.151 182939 DEBUG nova.compute.manager [req-461d047b-5348-465c-a335-54386e0010f6 req-5904d4af-c878-472c-86c9-0379e102b253 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.152 182939 DEBUG oslo_concurrency.lockutils [req-461d047b-5348-465c-a335-54386e0010f6 req-5904d4af-c878-472c-86c9-0379e102b253 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.153 182939 DEBUG oslo_concurrency.lockutils [req-461d047b-5348-465c-a335-54386e0010f6 req-5904d4af-c878-472c-86c9-0379e102b253 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.153 182939 DEBUG oslo_concurrency.lockutils [req-461d047b-5348-465c-a335-54386e0010f6 req-5904d4af-c878-472c-86c9-0379e102b253 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.154 182939 DEBUG nova.compute.manager [req-461d047b-5348-465c-a335-54386e0010f6 req-5904d4af-c878-472c-86c9-0379e102b253 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] No waiting events found dispatching network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.154 182939 DEBUG nova.compute.manager [req-461d047b-5348-465c-a335-54386e0010f6 req-5904d4af-c878-472c-86c9-0379e102b253 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-unplugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.154 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.156 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.159 182939 INFO os_vif [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=bae5fde2-5ead-4ae5-90dd-1d6d468541ea,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae5fde2-5e')
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.160 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.160 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.160 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.161 182939 DEBUG nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.161 182939 INFO nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Deleting instance files /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7_del
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.162 182939 INFO nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Deletion of /var/lib/nova/instances/69dceb72-db44-4bfc-9b98-cc8b39885ae7_del complete
Jan 21 23:46:09 compute-0 ovn_controller[95047]: 2026-01-21T23:46:09Z|00046|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.375 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for eae14f31-b7b2-4d6d-8b75-16323b2c7a01 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.376 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039169.3751156, eae14f31-b7b2-4d6d-8b75-16323b2c7a01 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.376 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] VM Resumed (Lifecycle Event)
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.379 182939 DEBUG nova.compute.manager [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.379 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.382 182939 INFO nova.virt.libvirt.driver [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance spawned successfully.
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.383 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.436 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.443 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.446 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.447 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.447 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.447 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.448 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.448 182939 DEBUG nova.virt.libvirt.driver [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.484 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.485 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039169.3762417, eae14f31-b7b2-4d6d-8b75-16323b2c7a01 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.485 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] VM Started (Lifecycle Event)
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.524 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.528 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.556 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.654 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.677 182939 DEBUG nova.compute.manager [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.849 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.850 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.850 182939 DEBUG nova.objects.instance [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.921 182939 DEBUG nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.921 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.922 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.922 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.923 182939 DEBUG nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] No waiting events found dispatching network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.923 182939 WARNING nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received unexpected event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea for instance with vm_state active and task_state migrating.
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.923 182939 DEBUG nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.924 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.924 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.924 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.924 182939 DEBUG nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] No waiting events found dispatching network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.925 182939 WARNING nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received unexpected event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea for instance with vm_state active and task_state migrating.
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.925 182939 DEBUG nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.925 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.926 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.926 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.926 182939 DEBUG nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] No waiting events found dispatching network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.927 182939 WARNING nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received unexpected event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea for instance with vm_state active and task_state migrating.
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.927 182939 DEBUG nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.927 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.928 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.928 182939 DEBUG oslo_concurrency.lockutils [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.928 182939 DEBUG nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] No waiting events found dispatching network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:09 compute-0 nova_compute[182935]: 2026-01-21 23:46:09.929 182939 WARNING nova.compute.manager [req-61e68902-663c-4ebc-9eca-fa33eae439dc req-7913b1a1-4981-4c49-86b7-d39a3aeccf1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Received unexpected event network-vif-plugged-bae5fde2-5ead-4ae5-90dd-1d6d468541ea for instance with vm_state active and task_state migrating.
Jan 21 23:46:10 compute-0 nova_compute[182935]: 2026-01-21 23:46:10.263 182939 DEBUG oslo_concurrency.lockutils [None req-154201ce-9576-43f1-be68-919d75df3b2a ea22805b616f456e8ba7c990243061ce cf445faa22d0476f94de3bc5c149f837 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:11 compute-0 sshd-session[213177]: Received disconnect from 45.148.10.151 port 30204:11:  [preauth]
Jan 21 23:46:11 compute-0 sshd-session[213177]: Disconnected from authenticating user root 45.148.10.151 port 30204 [preauth]
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.363 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "eae14f31-b7b2-4d6d-8b75-16323b2c7a01" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.364 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "eae14f31-b7b2-4d6d-8b75-16323b2c7a01" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.365 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "eae14f31-b7b2-4d6d-8b75-16323b2c7a01-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.365 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "eae14f31-b7b2-4d6d-8b75-16323b2c7a01-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.366 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "eae14f31-b7b2-4d6d-8b75-16323b2c7a01-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.377 182939 INFO nova.compute.manager [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Terminating instance
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.386 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "refresh_cache-eae14f31-b7b2-4d6d-8b75-16323b2c7a01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.386 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquired lock "refresh_cache-eae14f31-b7b2-4d6d-8b75-16323b2c7a01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.387 182939 DEBUG nova.network.neutron [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.467 182939 DEBUG nova.network.neutron [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Updated VIF entry in instance network info cache for port bae5fde2-5ead-4ae5-90dd-1d6d468541ea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.468 182939 DEBUG nova.network.neutron [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Updating instance_info_cache with network_info: [{"id": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "address": "fa:16:3e:6f:ac:86", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae5fde2-5e", "ovs_interfaceid": "bae5fde2-5ead-4ae5-90dd-1d6d468541ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-2.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.494 182939 DEBUG oslo_concurrency.lockutils [req-fcedffaa-e278-4808-9dcc-4cd59a06977d req-01a9c120-fe70-4c24-983b-3ddbc71f43df 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-69dceb72-db44-4bfc-9b98-cc8b39885ae7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:46:11 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 23:46:11 compute-0 systemd[212975]: Activating special unit Exit the Session...
Jan 21 23:46:11 compute-0 systemd[212975]: Stopped target Main User Target.
Jan 21 23:46:11 compute-0 systemd[212975]: Stopped target Basic System.
Jan 21 23:46:11 compute-0 systemd[212975]: Stopped target Paths.
Jan 21 23:46:11 compute-0 systemd[212975]: Stopped target Sockets.
Jan 21 23:46:11 compute-0 systemd[212975]: Stopped target Timers.
Jan 21 23:46:11 compute-0 systemd[212975]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:46:11 compute-0 systemd[212975]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:46:11 compute-0 systemd[212975]: Closed D-Bus User Message Bus Socket.
Jan 21 23:46:11 compute-0 systemd[212975]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:46:11 compute-0 systemd[212975]: Removed slice User Application Slice.
Jan 21 23:46:11 compute-0 systemd[212975]: Reached target Shutdown.
Jan 21 23:46:11 compute-0 systemd[212975]: Finished Exit the Session.
Jan 21 23:46:11 compute-0 systemd[212975]: Reached target Exit the Session.
Jan 21 23:46:11 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 23:46:11 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 23:46:11 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 23:46:11 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 23:46:11 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 23:46:11 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 23:46:11 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 23:46:11 compute-0 nova_compute[182935]: 2026-01-21 23:46:11.691 182939 DEBUG nova.network.neutron [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:46:12 compute-0 nova_compute[182935]: 2026-01-21 23:46:12.584 182939 DEBUG nova.network.neutron [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:46:12 compute-0 nova_compute[182935]: 2026-01-21 23:46:12.612 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Releasing lock "refresh_cache-eae14f31-b7b2-4d6d-8b75-16323b2c7a01" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:46:12 compute-0 nova_compute[182935]: 2026-01-21 23:46:12.613 182939 DEBUG nova.compute.manager [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:46:12 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Jan 21 23:46:12 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000d.scope: Consumed 3.522s CPU time.
Jan 21 23:46:12 compute-0 systemd-machined[154182]: Machine qemu-6-instance-0000000d terminated.
Jan 21 23:46:12 compute-0 nova_compute[182935]: 2026-01-21 23:46:12.867 182939 INFO nova.virt.libvirt.driver [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance destroyed successfully.
Jan 21 23:46:12 compute-0 nova_compute[182935]: 2026-01-21 23:46:12.867 182939 DEBUG nova.objects.instance [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lazy-loading 'resources' on Instance uuid eae14f31-b7b2-4d6d-8b75-16323b2c7a01 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:12 compute-0 nova_compute[182935]: 2026-01-21 23:46:12.888 182939 INFO nova.virt.libvirt.driver [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Deleting instance files /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01_del
Jan 21 23:46:12 compute-0 nova_compute[182935]: 2026-01-21 23:46:12.888 182939 INFO nova.virt.libvirt.driver [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Deletion of /var/lib/nova/instances/eae14f31-b7b2-4d6d-8b75-16323b2c7a01_del complete
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.025 182939 INFO nova.compute.manager [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.025 182939 DEBUG oslo.service.loopingcall [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.025 182939 DEBUG nova.compute.manager [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.026 182939 DEBUG nova.network.neutron [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.081 182939 DEBUG oslo_concurrency.processutils [None req-211e6e4b-5606-42a8-886f-f3686b671de8 e0dca25563dd4b3eac6a8e55e10215b6 a93ebed23d554bf3aa85d310b0f441e4 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.111 182939 DEBUG oslo_concurrency.processutils [None req-211e6e4b-5606-42a8-886f-f3686b671de8 e0dca25563dd4b3eac6a8e55e10215b6 a93ebed23d554bf3aa85d310b0f441e4 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.316 182939 DEBUG nova.network.neutron [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.336 182939 DEBUG nova.network.neutron [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.356 182939 INFO nova.compute.manager [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Took 0.33 seconds to deallocate network for instance.
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.467 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.468 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.595 182939 DEBUG nova.compute.provider_tree [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.625 182939 DEBUG nova.scheduler.client.report [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.658 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.715 182939 INFO nova.scheduler.client.report [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Deleted allocations for instance eae14f31-b7b2-4d6d-8b75-16323b2c7a01
Jan 21 23:46:13 compute-0 nova_compute[182935]: 2026-01-21 23:46:13.833 182939 DEBUG oslo_concurrency.lockutils [None req-5d953822-1c50-43a8-87fe-cd547bef8939 2804acabdff34faf94b1505dceaa0282 be00ef0f55574d99a46f805f6d04ce41 - - default default] Lock "eae14f31-b7b2-4d6d-8b75-16323b2c7a01" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:14 compute-0 nova_compute[182935]: 2026-01-21 23:46:14.150 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:14 compute-0 nova_compute[182935]: 2026-01-21 23:46:14.656 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:15 compute-0 podman[213191]: 2026-01-21 23:46:15.708847967 +0000 UTC m=+0.064851641 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:46:15 compute-0 podman[213190]: 2026-01-21 23:46:15.773356959 +0000 UTC m=+0.138629115 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 23:46:15 compute-0 nova_compute[182935]: 2026-01-21 23:46:15.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.350 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.351 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.351 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "69dceb72-db44-4bfc-9b98-cc8b39885ae7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.388 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.389 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.389 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.389 182939 DEBUG nova.compute.resource_tracker [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.583 182939 WARNING nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.584 182939 DEBUG nova.compute.resource_tracker [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5675MB free_disk=73.34698486328125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.584 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.584 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.759 182939 DEBUG nova.compute.resource_tracker [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Migration for instance 69dceb72-db44-4bfc-9b98-cc8b39885ae7 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.805 182939 DEBUG nova.compute.resource_tracker [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.878 182939 DEBUG nova.compute.resource_tracker [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Migration d33d7270-640c-4eb6-9e9e-624ebad15196 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.878 182939 DEBUG nova.compute.resource_tracker [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.879 182939 DEBUG nova.compute.resource_tracker [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.961 182939 DEBUG nova.compute.provider_tree [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:46:17 compute-0 nova_compute[182935]: 2026-01-21 23:46:17.988 182939 DEBUG nova.scheduler.client.report [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.026 182939 DEBUG nova.compute.resource_tracker [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.026 182939 DEBUG oslo_concurrency.lockutils [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.442s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.044 182939 INFO nova.compute.manager [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Migrating instance to compute-2.ctlplane.example.com finished successfully.
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.227 182939 INFO nova.scheduler.client.report [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Deleted allocation for migration d33d7270-640c-4eb6-9e9e-624ebad15196
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.228 182939 DEBUG nova.virt.libvirt.driver [None req-86d0787f-6d68-47f6-a9a4-8fcab7e1a9b1 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.820 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.821 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.822 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.855 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.856 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.856 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:18 compute-0 nova_compute[182935]: 2026-01-21 23:46:18.857 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.089 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.090 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5676MB free_disk=73.34698486328125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.090 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.091 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.153 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.169 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.170 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.241 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.265 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.268 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.268 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:19 compute-0 nova_compute[182935]: 2026-01-21 23:46:19.658 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:20 compute-0 nova_compute[182935]: 2026-01-21 23:46:20.240 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:20 compute-0 nova_compute[182935]: 2026-01-21 23:46:20.241 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:20 compute-0 nova_compute[182935]: 2026-01-21 23:46:20.241 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:46:21 compute-0 nova_compute[182935]: 2026-01-21 23:46:21.661 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039166.6603713, 69dceb72-db44-4bfc-9b98-cc8b39885ae7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:21 compute-0 nova_compute[182935]: 2026-01-21 23:46:21.662 182939 INFO nova.compute.manager [-] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] VM Stopped (Lifecycle Event)
Jan 21 23:46:21 compute-0 nova_compute[182935]: 2026-01-21 23:46:21.694 182939 DEBUG nova.compute.manager [None req-d771af8c-db36-4d92-b1f7-6afbc482982b - - - - - -] [instance: 69dceb72-db44-4bfc-9b98-cc8b39885ae7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:21 compute-0 nova_compute[182935]: 2026-01-21 23:46:21.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:46:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:46:23 compute-0 podman[213243]: 2026-01-21 23:46:23.686407142 +0000 UTC m=+0.055229012 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:46:24 compute-0 nova_compute[182935]: 2026-01-21 23:46:24.155 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:24 compute-0 nova_compute[182935]: 2026-01-21 23:46:24.661 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:24 compute-0 nova_compute[182935]: 2026-01-21 23:46:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.177 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "d183f0c7-af09-4462-a1ff-805f65bac401" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.178 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "d183f0c7-af09-4462-a1ff-805f65bac401" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.209 182939 DEBUG nova.compute.manager [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.442 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.443 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.450 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.451 182939 INFO nova.compute.claims [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.645 182939 DEBUG nova.compute.provider_tree [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.671 182939 DEBUG nova.scheduler.client.report [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.700 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.701 182939 DEBUG nova.compute.manager [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.792 182939 DEBUG nova.compute.manager [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.793 182939 DEBUG nova.network.neutron [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.822 182939 INFO nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:46:25 compute-0 nova_compute[182935]: 2026-01-21 23:46:25.855 182939 DEBUG nova.compute.manager [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.110 182939 DEBUG nova.compute.manager [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.112 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.112 182939 INFO nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Creating image(s)
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.113 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "/var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.114 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "/var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.115 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "/var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.147 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.219 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.221 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.223 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.252 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.319 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.321 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.372 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.374 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.375 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.440 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.442 182939 DEBUG nova.virt.disk.api [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Checking if we can resize image /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.443 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.508 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.510 182939 DEBUG nova.virt.disk.api [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Cannot resize image /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.511 182939 DEBUG nova.objects.instance [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lazy-loading 'migration_context' on Instance uuid d183f0c7-af09-4462-a1ff-805f65bac401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.548 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.549 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Ensure instance console log exists: /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.549 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.550 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.551 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:26 compute-0 podman[213282]: 2026-01-21 23:46:26.693224624 +0000 UTC m=+0.063076289 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.902 182939 DEBUG nova.network.neutron [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.903 182939 DEBUG nova.compute.manager [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.905 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.911 182939 WARNING nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.917 182939 DEBUG nova.virt.libvirt.host [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.918 182939 DEBUG nova.virt.libvirt.host [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.921 182939 DEBUG nova.virt.libvirt.host [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.922 182939 DEBUG nova.virt.libvirt.host [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.924 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.925 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.925 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.926 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.926 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.927 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.927 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.928 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.929 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.929 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.930 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.930 182939 DEBUG nova.virt.hardware [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:46:26 compute-0 nova_compute[182935]: 2026-01-21 23:46:26.937 182939 DEBUG nova.objects.instance [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lazy-loading 'pci_devices' on Instance uuid d183f0c7-af09-4462-a1ff-805f65bac401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:27 compute-0 nova_compute[182935]: 2026-01-21 23:46:27.010 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:46:27 compute-0 nova_compute[182935]:   <uuid>d183f0c7-af09-4462-a1ff-805f65bac401</uuid>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   <name>instance-00000010</name>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersAdminNegativeTestJSON-server-735358358</nova:name>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:46:26</nova:creationTime>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:46:27 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:46:27 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:46:27 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:46:27 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:46:27 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:46:27 compute-0 nova_compute[182935]:         <nova:user uuid="aa25befcc85f49009cc03d3f9a7af21a">tempest-ServersAdminNegativeTestJSON-1173112993-project-member</nova:user>
Jan 21 23:46:27 compute-0 nova_compute[182935]:         <nova:project uuid="7b93a1e09d8a4019807c39b0826b8c31">tempest-ServersAdminNegativeTestJSON-1173112993</nova:project>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <system>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <entry name="serial">d183f0c7-af09-4462-a1ff-805f65bac401</entry>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <entry name="uuid">d183f0c7-af09-4462-a1ff-805f65bac401</entry>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     </system>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   <os>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   </os>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   <features>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   </features>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk.config"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/console.log" append="off"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <video>
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     </video>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:46:27 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:46:27 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:46:27 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:46:27 compute-0 nova_compute[182935]: </domain>
Jan 21 23:46:27 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:46:27 compute-0 nova_compute[182935]: 2026-01-21 23:46:27.207 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:46:27 compute-0 nova_compute[182935]: 2026-01-21 23:46:27.208 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:46:27 compute-0 nova_compute[182935]: 2026-01-21 23:46:27.208 182939 INFO nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Using config drive
Jan 21 23:46:27 compute-0 nova_compute[182935]: 2026-01-21 23:46:27.529 182939 INFO nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Creating config drive at /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk.config
Jan 21 23:46:27 compute-0 nova_compute[182935]: 2026-01-21 23:46:27.534 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9d3j7cma execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:27 compute-0 nova_compute[182935]: 2026-01-21 23:46:27.665 182939 DEBUG oslo_concurrency.processutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9d3j7cma" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:27 compute-0 systemd-machined[154182]: New machine qemu-7-instance-00000010.
Jan 21 23:46:27 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000010.
Jan 21 23:46:27 compute-0 nova_compute[182935]: 2026-01-21 23:46:27.866 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039172.8647764, eae14f31-b7b2-4d6d-8b75-16323b2c7a01 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:27 compute-0 nova_compute[182935]: 2026-01-21 23:46:27.867 182939 INFO nova.compute.manager [-] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] VM Stopped (Lifecycle Event)
Jan 21 23:46:27 compute-0 nova_compute[182935]: 2026-01-21 23:46:27.894 182939 DEBUG nova.compute.manager [None req-8f5b8644-f61f-49ad-9872-9544e0c4beb6 - - - - - -] [instance: eae14f31-b7b2-4d6d-8b75-16323b2c7a01] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.080 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039188.0799541, d183f0c7-af09-4462-a1ff-805f65bac401 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.082 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] VM Resumed (Lifecycle Event)
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.085 182939 DEBUG nova.compute.manager [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.086 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.094 182939 INFO nova.virt.libvirt.driver [-] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Instance spawned successfully.
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.095 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.129 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.140 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.145 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.145 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.146 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.146 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.147 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.147 182939 DEBUG nova.virt.libvirt.driver [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.194 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.194 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039188.0819216, d183f0c7-af09-4462-a1ff-805f65bac401 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.195 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] VM Started (Lifecycle Event)
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.215 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.220 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.249 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.261 182939 INFO nova.compute.manager [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Took 2.15 seconds to spawn the instance on the hypervisor.
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.262 182939 DEBUG nova.compute.manager [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.413 182939 INFO nova.compute.manager [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Took 3.03 seconds to build instance.
Jan 21 23:46:28 compute-0 nova_compute[182935]: 2026-01-21 23:46:28.450 182939 DEBUG oslo_concurrency.lockutils [None req-91b78a70-36ce-4b55-975b-9e6e55af03f4 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "d183f0c7-af09-4462-a1ff-805f65bac401" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:29 compute-0 nova_compute[182935]: 2026-01-21 23:46:29.157 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:29 compute-0 nova_compute[182935]: 2026-01-21 23:46:29.664 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:30 compute-0 nova_compute[182935]: 2026-01-21 23:46:30.881 182939 DEBUG nova.objects.instance [None req-1ec0f550-af9a-4118-9ea3-be48b6108f7a 46fc1e7193e04fb2b459c07f796af02b 1e04d41ab45f46589cc66f37ca912cfa - - default default] Lazy-loading 'pci_devices' on Instance uuid d183f0c7-af09-4462-a1ff-805f65bac401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:30 compute-0 nova_compute[182935]: 2026-01-21 23:46:30.919 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039190.9184601, d183f0c7-af09-4462-a1ff-805f65bac401 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:30 compute-0 nova_compute[182935]: 2026-01-21 23:46:30.920 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] VM Paused (Lifecycle Event)
Jan 21 23:46:30 compute-0 nova_compute[182935]: 2026-01-21 23:46:30.946 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:30 compute-0 nova_compute[182935]: 2026-01-21 23:46:30.951 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:30 compute-0 nova_compute[182935]: 2026-01-21 23:46:30.979 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 21 23:46:31 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 21 23:46:31 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000010.scope: Consumed 3.296s CPU time.
Jan 21 23:46:31 compute-0 systemd-machined[154182]: Machine qemu-7-instance-00000010 terminated.
Jan 21 23:46:31 compute-0 nova_compute[182935]: 2026-01-21 23:46:31.823 182939 DEBUG nova.compute.manager [None req-1ec0f550-af9a-4118-9ea3-be48b6108f7a 46fc1e7193e04fb2b459c07f796af02b 1e04d41ab45f46589cc66f37ca912cfa - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:34 compute-0 nova_compute[182935]: 2026-01-21 23:46:34.161 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:34 compute-0 nova_compute[182935]: 2026-01-21 23:46:34.667 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:35.223 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:46:35 compute-0 nova_compute[182935]: 2026-01-21 23:46:35.224 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:35.224 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:46:35 compute-0 podman[213342]: 2026-01-21 23:46:35.737401307 +0000 UTC m=+0.091080639 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 21 23:46:36 compute-0 podman[213362]: 2026-01-21 23:46:36.733672044 +0000 UTC m=+0.091269063 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.076 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "d183f0c7-af09-4462-a1ff-805f65bac401" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.077 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "d183f0c7-af09-4462-a1ff-805f65bac401" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.078 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "d183f0c7-af09-4462-a1ff-805f65bac401-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.078 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "d183f0c7-af09-4462-a1ff-805f65bac401-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.078 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "d183f0c7-af09-4462-a1ff-805f65bac401-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.093 182939 INFO nova.compute.manager [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Terminating instance
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.105 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "refresh_cache-d183f0c7-af09-4462-a1ff-805f65bac401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.106 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquired lock "refresh_cache-d183f0c7-af09-4462-a1ff-805f65bac401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.106 182939 DEBUG nova.network.neutron [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.415 182939 DEBUG nova.network.neutron [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.773 182939 DEBUG nova.network.neutron [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.803 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Releasing lock "refresh_cache-d183f0c7-af09-4462-a1ff-805f65bac401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.805 182939 DEBUG nova.compute.manager [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.815 182939 INFO nova.virt.libvirt.driver [-] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Instance destroyed successfully.
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.816 182939 DEBUG nova.objects.instance [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lazy-loading 'resources' on Instance uuid d183f0c7-af09-4462-a1ff-805f65bac401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.835 182939 INFO nova.virt.libvirt.driver [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Deleting instance files /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401_del
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.836 182939 INFO nova.virt.libvirt.driver [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Deletion of /var/lib/nova/instances/d183f0c7-af09-4462-a1ff-805f65bac401_del complete
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.967 182939 INFO nova.compute.manager [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Took 0.16 seconds to destroy the instance on the hypervisor.
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.968 182939 DEBUG oslo.service.loopingcall [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.969 182939 DEBUG nova.compute.manager [-] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:46:38 compute-0 nova_compute[182935]: 2026-01-21 23:46:38.969 182939 DEBUG nova.network.neutron [-] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.138 182939 DEBUG nova.network.neutron [-] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.158 182939 DEBUG nova.network.neutron [-] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.163 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.184 182939 INFO nova.compute.manager [-] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Took 0.21 seconds to deallocate network for instance.
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.259 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.260 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.344 182939 DEBUG nova.compute.provider_tree [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.369 182939 DEBUG nova.scheduler.client.report [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.398 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.427 182939 INFO nova.scheduler.client.report [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Deleted allocations for instance d183f0c7-af09-4462-a1ff-805f65bac401
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.586 182939 DEBUG oslo_concurrency.lockutils [None req-4ad08212-3a27-4a1b-9113-b10476a88376 aa25befcc85f49009cc03d3f9a7af21a 7b93a1e09d8a4019807c39b0826b8c31 - - default default] Lock "d183f0c7-af09-4462-a1ff-805f65bac401" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:39 compute-0 nova_compute[182935]: 2026-01-21 23:46:39.670 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:41 compute-0 sshd-session[213384]: Invalid user tomcat from 188.166.69.60 port 49142
Jan 21 23:46:41 compute-0 sshd-session[213384]: Connection closed by invalid user tomcat 188.166.69.60 port 49142 [preauth]
Jan 21 23:46:42 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:42.227 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:44 compute-0 nova_compute[182935]: 2026-01-21 23:46:44.166 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:44 compute-0 nova_compute[182935]: 2026-01-21 23:46:44.672 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:45 compute-0 nova_compute[182935]: 2026-01-21 23:46:45.172 182939 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Creating tmpfile /var/lib/nova/instances/tmp43n8huxx to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 21 23:46:45 compute-0 nova_compute[182935]: 2026-01-21 23:46:45.174 182939 DEBUG nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp43n8huxx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.089 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.090 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.120 182939 DEBUG nova.compute.manager [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.262 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.263 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.281 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.282 182939 INFO nova.compute.claims [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.456 182939 DEBUG nova.compute.provider_tree [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.475 182939 DEBUG nova.scheduler.client.report [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.502 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.503 182939 DEBUG nova.compute.manager [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.560 182939 DEBUG nova.compute.manager [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.561 182939 DEBUG nova.network.neutron [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.589 182939 INFO nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.616 182939 DEBUG nova.compute.manager [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:46:46 compute-0 podman[213387]: 2026-01-21 23:46:46.704353295 +0000 UTC m=+0.067738700 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:46:46 compute-0 podman[213386]: 2026-01-21 23:46:46.768950519 +0000 UTC m=+0.125672504 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.771 182939 DEBUG nova.compute.manager [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.773 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.773 182939 INFO nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Creating image(s)
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.774 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.775 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.775 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.794 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.823 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039191.8224537, d183f0c7-af09-4462-a1ff-805f65bac401 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.824 182939 INFO nova.compute.manager [-] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] VM Stopped (Lifecycle Event)
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.853 182939 DEBUG nova.compute.manager [None req-af2e7d59-faf8-4cb4-9dc9-1a58673317aa - - - - - -] [instance: d183f0c7-af09-4462-a1ff-805f65bac401] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.855 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.855 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.856 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.867 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.886 182939 DEBUG nova.policy [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4a6034ff39094b6486bac680b7ed5a57', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.924 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.925 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.965 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.966 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:46 compute-0 nova_compute[182935]: 2026-01-21 23:46:46.967 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.060 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.061 182939 DEBUG nova.virt.disk.api [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Checking if we can resize image /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.062 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.155 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.156 182939 DEBUG nova.virt.disk.api [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Cannot resize image /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.157 182939 DEBUG nova.objects.instance [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'migration_context' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.174 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.175 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Ensure instance console log exists: /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.175 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.176 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.176 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.875 182939 DEBUG nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp43n8huxx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1080912-4a1f-4504-ae59-a0ad89963886',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.933 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.934 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquired lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.935 182939 DEBUG nova.network.neutron [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:46:47 compute-0 nova_compute[182935]: 2026-01-21 23:46:47.945 182939 DEBUG nova.network.neutron [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Successfully created port: 25b6ea25-2c24-4a07-9772-28913505aec2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.169 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.194 182939 DEBUG nova.network.neutron [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Successfully updated port: 25b6ea25-2c24-4a07-9772-28913505aec2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.213 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "refresh_cache-a6a89006-02c9-49b1-8bfb-8640ba1b495f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.213 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquired lock "refresh_cache-a6a89006-02c9-49b1-8bfb-8640ba1b495f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.213 182939 DEBUG nova.network.neutron [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.441 182939 DEBUG nova.compute.manager [req-4847b07b-d5e7-4119-8a9d-8d5d86f4472b req-1c4ac41f-ac2a-41ad-a5c4-5da9121eec95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-changed-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.442 182939 DEBUG nova.compute.manager [req-4847b07b-d5e7-4119-8a9d-8d5d86f4472b req-1c4ac41f-ac2a-41ad-a5c4-5da9121eec95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Refreshing instance network info cache due to event network-changed-25b6ea25-2c24-4a07-9772-28913505aec2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.442 182939 DEBUG oslo_concurrency.lockutils [req-4847b07b-d5e7-4119-8a9d-8d5d86f4472b req-1c4ac41f-ac2a-41ad-a5c4-5da9121eec95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-a6a89006-02c9-49b1-8bfb-8640ba1b495f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.633 182939 DEBUG nova.network.neutron [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.675 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.798 182939 DEBUG nova.network.neutron [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Updating instance_info_cache with network_info: [{"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.842 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Releasing lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.857 182939 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp43n8huxx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1080912-4a1f-4504-ae59-a0ad89963886',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.858 182939 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Creating instance directory: /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.858 182939 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Creating disk.info with the contents: {'/var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk': 'qcow2', '/var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.859 182939 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.859 182939 DEBUG nova.objects.instance [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b1080912-4a1f-4504-ae59-a0ad89963886 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.886 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.974 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.975 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.976 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:49 compute-0 nova_compute[182935]: 2026-01-21 23:46:49.986 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.062 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.063 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.112 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.114 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.114 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.205 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.207 182939 DEBUG nova.virt.disk.api [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Checking if we can resize image /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.208 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.273 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.275 182939 DEBUG nova.virt.disk.api [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Cannot resize image /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.276 182939 DEBUG nova.objects.instance [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lazy-loading 'migration_context' on Instance uuid b1080912-4a1f-4504-ae59-a0ad89963886 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.300 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.333 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.config 485376" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.336 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.config to /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.337 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.config /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.904 182939 DEBUG oslo_concurrency.processutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886/disk.config /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886" returned: 0 in 0.567s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.906 182939 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.908 182939 DEBUG nova.virt.libvirt.vif [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:46:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1283276848',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1283276848',id=17,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:46:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1298204af0f241dc8b63851b2046cf5c',ramdisk_id='',reservation_id='r-b25ryamg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1063342224',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1063342224-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:46:39Z,user_data=None,user_id='553fdc065acf4000a185abac43878ab4',uuid=b1080912-4a1f-4504-ae59-a0ad89963886,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.909 182939 DEBUG nova.network.os_vif_util [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converting VIF {"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.912 182939 DEBUG nova.network.os_vif_util [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.913 182939 DEBUG os_vif [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.915 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.916 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.918 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.923 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.924 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc16d8d18-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.925 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc16d8d18-66, col_values=(('external_ids', {'iface-id': 'c16d8d18-6610-45c3-8172-54b8b99474ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:d5:91', 'vm-uuid': 'b1080912-4a1f-4504-ae59-a0ad89963886'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.928 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:50 compute-0 NetworkManager[55139]: <info>  [1769039210.9295] manager: (tapc16d8d18-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.934 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.941 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.943 182939 INFO os_vif [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66')
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.944 182939 DEBUG nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.944 182939 DEBUG nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp43n8huxx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1080912-4a1f-4504-ae59-a0ad89963886',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.962 182939 DEBUG nova.network.neutron [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Updating instance_info_cache with network_info: [{"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.987 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Releasing lock "refresh_cache-a6a89006-02c9-49b1-8bfb-8640ba1b495f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.988 182939 DEBUG nova.compute.manager [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance network_info: |[{"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.988 182939 DEBUG oslo_concurrency.lockutils [req-4847b07b-d5e7-4119-8a9d-8d5d86f4472b req-1c4ac41f-ac2a-41ad-a5c4-5da9121eec95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-a6a89006-02c9-49b1-8bfb-8640ba1b495f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.989 182939 DEBUG nova.network.neutron [req-4847b07b-d5e7-4119-8a9d-8d5d86f4472b req-1c4ac41f-ac2a-41ad-a5c4-5da9121eec95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Refreshing network info cache for port 25b6ea25-2c24-4a07-9772-28913505aec2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.992 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Start _get_guest_xml network_info=[{"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:46:50 compute-0 nova_compute[182935]: 2026-01-21 23:46:50.998 182939 WARNING nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.003 182939 DEBUG nova.virt.libvirt.host [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.004 182939 DEBUG nova.virt.libvirt.host [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.009 182939 DEBUG nova.virt.libvirt.host [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.010 182939 DEBUG nova.virt.libvirt.host [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.012 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.012 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.012 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.012 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.013 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.013 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.013 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.013 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.014 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.014 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.014 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.014 182939 DEBUG nova.virt.hardware [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.019 182939 DEBUG nova.virt.libvirt.vif [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:46:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-168641085',display_name='tempest-ServersAdminTestJSON-server-168641085',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-168641085',id=18,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-evbqme7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:46:46Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=a6a89006-02c9-49b1-8bfb-8640ba1b495f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.019 182939 DEBUG nova.network.os_vif_util [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.020 182939 DEBUG nova.network.os_vif_util [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.021 182939 DEBUG nova.objects.instance [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'pci_devices' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.035 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:46:51 compute-0 nova_compute[182935]:   <uuid>a6a89006-02c9-49b1-8bfb-8640ba1b495f</uuid>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   <name>instance-00000012</name>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersAdminTestJSON-server-168641085</nova:name>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:46:50</nova:creationTime>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:46:51 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:46:51 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:46:51 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:46:51 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:46:51 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:46:51 compute-0 nova_compute[182935]:         <nova:user uuid="4a6034ff39094b6486bac680b7ed5a57">tempest-ServersAdminTestJSON-1815099341-project-member</nova:user>
Jan 21 23:46:51 compute-0 nova_compute[182935]:         <nova:project uuid="4d40fc03fb534b5689415f3d8a3de1fc">tempest-ServersAdminTestJSON-1815099341</nova:project>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:46:51 compute-0 nova_compute[182935]:         <nova:port uuid="25b6ea25-2c24-4a07-9772-28913505aec2">
Jan 21 23:46:51 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <system>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <entry name="serial">a6a89006-02c9-49b1-8bfb-8640ba1b495f</entry>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <entry name="uuid">a6a89006-02c9-49b1-8bfb-8640ba1b495f</entry>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     </system>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   <os>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   </os>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   <features>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   </features>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:0d:1c:84"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <target dev="tap25b6ea25-2c"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/console.log" append="off"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <video>
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     </video>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:46:51 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:46:51 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:46:51 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:46:51 compute-0 nova_compute[182935]: </domain>
Jan 21 23:46:51 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.035 182939 DEBUG nova.compute.manager [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Preparing to wait for external event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.035 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.036 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.036 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.037 182939 DEBUG nova.virt.libvirt.vif [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:46:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-168641085',display_name='tempest-ServersAdminTestJSON-server-168641085',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-168641085',id=18,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-evbqme7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:46:46Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=a6a89006-02c9-49b1-8bfb-8640ba1b495f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.037 182939 DEBUG nova.network.os_vif_util [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.038 182939 DEBUG nova.network.os_vif_util [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.038 182939 DEBUG os_vif [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.039 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.039 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.039 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.042 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.043 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25b6ea25-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.043 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap25b6ea25-2c, col_values=(('external_ids', {'iface-id': '25b6ea25-2c24-4a07-9772-28913505aec2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:1c:84', 'vm-uuid': 'a6a89006-02c9-49b1-8bfb-8640ba1b495f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.045 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:51 compute-0 NetworkManager[55139]: <info>  [1769039211.0467] manager: (tap25b6ea25-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.047 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.057 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.058 182939 INFO os_vif [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c')
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.146 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.147 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.147 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No VIF found with MAC fa:16:3e:0d:1c:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.148 182939 INFO nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Using config drive
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.657 182939 INFO nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Creating config drive at /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.664 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6q7j9gns execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.796 182939 DEBUG oslo_concurrency.processutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6q7j9gns" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:46:51 compute-0 kernel: tap25b6ea25-2c: entered promiscuous mode
Jan 21 23:46:51 compute-0 NetworkManager[55139]: <info>  [1769039211.8927] manager: (tap25b6ea25-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.895 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:51 compute-0 ovn_controller[95047]: 2026-01-21T23:46:51Z|00047|binding|INFO|Claiming lport 25b6ea25-2c24-4a07-9772-28913505aec2 for this chassis.
Jan 21 23:46:51 compute-0 ovn_controller[95047]: 2026-01-21T23:46:51Z|00048|binding|INFO|25b6ea25-2c24-4a07-9772-28913505aec2: Claiming fa:16:3e:0d:1c:84 10.100.0.8
Jan 21 23:46:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:51.913 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:1c:84 10.100.0.8'], port_security=['fa:16:3e:0d:1c:84 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a6a89006-02c9-49b1-8bfb-8640ba1b495f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=25b6ea25-2c24-4a07-9772-28913505aec2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:46:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:51.914 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 25b6ea25-2c24-4a07-9772-28913505aec2 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 bound to our chassis
Jan 21 23:46:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:51.916 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:46:51 compute-0 systemd-udevd[213496]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:46:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:51.933 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f015852c-2ef6-4df3-b083-cacb5f28cf27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:51.935 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1530a22a-f1 in ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:46:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:51.938 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1530a22a-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:46:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:51.939 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[84054cfe-4825-49a8-ac04-9f43b001c968]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:51.941 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9a4edc-750f-49eb-9899-16257f4315fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:51 compute-0 systemd-machined[154182]: New machine qemu-8-instance-00000012.
Jan 21 23:46:51 compute-0 NetworkManager[55139]: <info>  [1769039211.9471] device (tap25b6ea25-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:46:51 compute-0 NetworkManager[55139]: <info>  [1769039211.9482] device (tap25b6ea25-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.951 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:51.957 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce13d98-b492-4f0b-8629-ac2167182177]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:51 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000012.
Jan 21 23:46:51 compute-0 ovn_controller[95047]: 2026-01-21T23:46:51Z|00049|binding|INFO|Setting lport 25b6ea25-2c24-4a07-9772-28913505aec2 ovn-installed in OVS
Jan 21 23:46:51 compute-0 ovn_controller[95047]: 2026-01-21T23:46:51Z|00050|binding|INFO|Setting lport 25b6ea25-2c24-4a07-9772-28913505aec2 up in Southbound
Jan 21 23:46:51 compute-0 nova_compute[182935]: 2026-01-21 23:46:51.964 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:51.980 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dc8613ce-104c-4879-8a12-f283f6a11955]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.021 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c7dd9d32-9bac-428d-9b66-2fb05077b712]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.031 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ec5a07-6abd-4eaf-bc86-5344f7cc8a4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 NetworkManager[55139]: <info>  [1769039212.0332] manager: (tap1530a22a-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.087 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[42371c1d-43ce-4d0b-8f2b-6656cda42301]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.092 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[e8b00fe3-dc6a-4a0a-8f7a-98bc9140af12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 NetworkManager[55139]: <info>  [1769039212.1262] device (tap1530a22a-f0): carrier: link connected
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.137 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[16df1f1b-644a-4a94-bacf-d320af18fc66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.169 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[68b1b1f1-e5bf-4732-9a6f-fd379a0af5e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371486, 'reachable_time': 18847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213535, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.194 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[90dfc9a2-298b-4de0-a677-f08b191fb696]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea6:bf13'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371486, 'tstamp': 371486}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213536, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.219 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[82486a35-057c-4f01-a2c2-6d0bf9de0f4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371486, 'reachable_time': 18847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213537, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.276 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[155b26ac-eaa8-4609-888a-f892ceefd3bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.375 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f4bab089-5bea-4823-b9c4-19c131566b82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.377 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.378 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.379 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1530a22a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:52 compute-0 NetworkManager[55139]: <info>  [1769039212.3821] manager: (tap1530a22a-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 21 23:46:52 compute-0 kernel: tap1530a22a-f0: entered promiscuous mode
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.383 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.387 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.388 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1530a22a-f0, col_values=(('external_ids', {'iface-id': '1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:46:52 compute-0 ovn_controller[95047]: 2026-01-21T23:46:52Z|00051|binding|INFO|Releasing lport 1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd from this chassis (sb_readonly=0)
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.416 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.422 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.423 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1530a22a-f758-407d-b1aa-fd922904fe07.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1530a22a-f758-407d-b1aa-fd922904fe07.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.424 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ed6a34ee-099f-4638-8c97-819e2e1fbd3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.425 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/1530a22a-f758-407d-b1aa-fd922904fe07.pid.haproxy
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:46:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:46:52.427 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'env', 'PROCESS_TAG=haproxy-1530a22a-f758-407d-b1aa-fd922904fe07', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1530a22a-f758-407d-b1aa-fd922904fe07.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.653 182939 DEBUG nova.compute.manager [req-b55bbb47-822b-4132-bcc0-72d9548d7e90 req-89bcc0ad-f5f3-4b5f-a0cb-044a103d05e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.653 182939 DEBUG oslo_concurrency.lockutils [req-b55bbb47-822b-4132-bcc0-72d9548d7e90 req-89bcc0ad-f5f3-4b5f-a0cb-044a103d05e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.654 182939 DEBUG oslo_concurrency.lockutils [req-b55bbb47-822b-4132-bcc0-72d9548d7e90 req-89bcc0ad-f5f3-4b5f-a0cb-044a103d05e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.654 182939 DEBUG oslo_concurrency.lockutils [req-b55bbb47-822b-4132-bcc0-72d9548d7e90 req-89bcc0ad-f5f3-4b5f-a0cb-044a103d05e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.654 182939 DEBUG nova.compute.manager [req-b55bbb47-822b-4132-bcc0-72d9548d7e90 req-89bcc0ad-f5f3-4b5f-a0cb-044a103d05e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Processing event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:46:52 compute-0 podman[213573]: 2026-01-21 23:46:52.849591785 +0000 UTC m=+0.055741344 container create 473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 23:46:52 compute-0 systemd[1]: Started libpod-conmon-473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c.scope.
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.904 182939 DEBUG nova.compute.manager [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.904 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039212.9032114, a6a89006-02c9-49b1-8bfb-8640ba1b495f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.904 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] VM Started (Lifecycle Event)
Jan 21 23:46:52 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.913 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:46:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59b5eab0a52fc8adb82ffc73f15219c4f5390f51b2d5b372bce629a0d4577c4c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:46:52 compute-0 podman[213573]: 2026-01-21 23:46:52.824500205 +0000 UTC m=+0.030649774 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.917 182939 INFO nova.virt.libvirt.driver [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance spawned successfully.
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.918 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.928 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:52 compute-0 podman[213573]: 2026-01-21 23:46:52.930694334 +0000 UTC m=+0.136843903 container init 473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.932 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:52 compute-0 podman[213573]: 2026-01-21 23:46:52.940157461 +0000 UTC m=+0.146307020 container start 473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.950 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.950 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.951 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.951 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.951 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.952 182939 DEBUG nova.virt.libvirt.driver [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.960 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.961 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039212.9034686, a6a89006-02c9-49b1-8bfb-8640ba1b495f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:52 compute-0 nova_compute[182935]: 2026-01-21 23:46:52.961 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] VM Paused (Lifecycle Event)
Jan 21 23:46:52 compute-0 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213595]: [NOTICE]   (213601) : New worker (213603) forked
Jan 21 23:46:52 compute-0 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213595]: [NOTICE]   (213601) : Loading success.
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.002 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.005 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039212.9123425, a6a89006-02c9-49b1-8bfb-8640ba1b495f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.005 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] VM Resumed (Lifecycle Event)
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.033 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.037 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.055 182939 INFO nova.compute.manager [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Took 6.28 seconds to spawn the instance on the hypervisor.
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.056 182939 DEBUG nova.compute.manager [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.066 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.146 182939 INFO nova.compute.manager [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Took 6.92 seconds to build instance.
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.180 182939 DEBUG oslo_concurrency.lockutils [None req-432d3320-e891-490b-b187-3aaeb317e6d3 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.182 182939 DEBUG nova.network.neutron [req-4847b07b-d5e7-4119-8a9d-8d5d86f4472b req-1c4ac41f-ac2a-41ad-a5c4-5da9121eec95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Updated VIF entry in instance network info cache for port 25b6ea25-2c24-4a07-9772-28913505aec2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.182 182939 DEBUG nova.network.neutron [req-4847b07b-d5e7-4119-8a9d-8d5d86f4472b req-1c4ac41f-ac2a-41ad-a5c4-5da9121eec95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Updating instance_info_cache with network_info: [{"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:46:53 compute-0 nova_compute[182935]: 2026-01-21 23:46:53.198 182939 DEBUG oslo_concurrency.lockutils [req-4847b07b-d5e7-4119-8a9d-8d5d86f4472b req-1c4ac41f-ac2a-41ad-a5c4-5da9121eec95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-a6a89006-02c9-49b1-8bfb-8640ba1b495f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.391 182939 DEBUG nova.network.neutron [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Port c16d8d18-6610-45c3-8172-54b8b99474ae updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.408 182939 DEBUG nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp43n8huxx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1080912-4a1f-4504-ae59-a0ad89963886',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.677 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:54 compute-0 podman[213612]: 2026-01-21 23:46:54.725504522 +0000 UTC m=+0.086304735 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.748 182939 DEBUG nova.compute.manager [req-4ff5f7e2-7126-49b6-abfd-2744f38b5c17 req-739f1c05-ac72-464f-adf2-2596551ec604 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.749 182939 DEBUG oslo_concurrency.lockutils [req-4ff5f7e2-7126-49b6-abfd-2744f38b5c17 req-739f1c05-ac72-464f-adf2-2596551ec604 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.749 182939 DEBUG oslo_concurrency.lockutils [req-4ff5f7e2-7126-49b6-abfd-2744f38b5c17 req-739f1c05-ac72-464f-adf2-2596551ec604 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.749 182939 DEBUG oslo_concurrency.lockutils [req-4ff5f7e2-7126-49b6-abfd-2744f38b5c17 req-739f1c05-ac72-464f-adf2-2596551ec604 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.750 182939 DEBUG nova.compute.manager [req-4ff5f7e2-7126-49b6-abfd-2744f38b5c17 req-739f1c05-ac72-464f-adf2-2596551ec604 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] No waiting events found dispatching network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.750 182939 WARNING nova.compute.manager [req-4ff5f7e2-7126-49b6-abfd-2744f38b5c17 req-739f1c05-ac72-464f-adf2-2596551ec604 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received unexpected event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 for instance with vm_state active and task_state None.
Jan 21 23:46:54 compute-0 NetworkManager[55139]: <info>  [1769039214.7614] manager: (tapc16d8d18-66): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Jan 21 23:46:54 compute-0 systemd-udevd[213516]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:46:54 compute-0 kernel: tapc16d8d18-66: entered promiscuous mode
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.765 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:54 compute-0 ovn_controller[95047]: 2026-01-21T23:46:54Z|00052|binding|INFO|Claiming lport c16d8d18-6610-45c3-8172-54b8b99474ae for this additional chassis.
Jan 21 23:46:54 compute-0 ovn_controller[95047]: 2026-01-21T23:46:54Z|00053|binding|INFO|c16d8d18-6610-45c3-8172-54b8b99474ae: Claiming fa:16:3e:d8:d5:91 10.100.0.4
Jan 21 23:46:54 compute-0 ovn_controller[95047]: 2026-01-21T23:46:54Z|00054|binding|INFO|Claiming lport 32683c17-e027-4757-9a64-36df76fef381 for this additional chassis.
Jan 21 23:46:54 compute-0 ovn_controller[95047]: 2026-01-21T23:46:54Z|00055|binding|INFO|32683c17-e027-4757-9a64-36df76fef381: Claiming fa:16:3e:51:cc:06 19.80.0.41
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.769 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:54 compute-0 NetworkManager[55139]: <info>  [1769039214.7883] device (tapc16d8d18-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:46:54 compute-0 NetworkManager[55139]: <info>  [1769039214.7890] device (tapc16d8d18-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.821 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.825 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:54 compute-0 ovn_controller[95047]: 2026-01-21T23:46:54Z|00056|binding|INFO|Setting lport c16d8d18-6610-45c3-8172-54b8b99474ae ovn-installed in OVS
Jan 21 23:46:54 compute-0 nova_compute[182935]: 2026-01-21 23:46:54.826 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:54 compute-0 systemd-machined[154182]: New machine qemu-9-instance-00000011.
Jan 21 23:46:54 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000011.
Jan 21 23:46:55 compute-0 nova_compute[182935]: 2026-01-21 23:46:55.547 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039215.546799, b1080912-4a1f-4504-ae59-a0ad89963886 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:55 compute-0 nova_compute[182935]: 2026-01-21 23:46:55.549 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] VM Started (Lifecycle Event)
Jan 21 23:46:55 compute-0 nova_compute[182935]: 2026-01-21 23:46:55.574 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:56 compute-0 nova_compute[182935]: 2026-01-21 23:46:56.046 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:46:56 compute-0 nova_compute[182935]: 2026-01-21 23:46:56.580 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039216.5805604, b1080912-4a1f-4504-ae59-a0ad89963886 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:46:56 compute-0 nova_compute[182935]: 2026-01-21 23:46:56.582 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] VM Resumed (Lifecycle Event)
Jan 21 23:46:56 compute-0 nova_compute[182935]: 2026-01-21 23:46:56.614 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:46:56 compute-0 nova_compute[182935]: 2026-01-21 23:46:56.618 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:46:56 compute-0 nova_compute[182935]: 2026-01-21 23:46:56.641 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 21 23:46:57 compute-0 podman[213673]: 2026-01-21 23:46:57.751762918 +0000 UTC m=+0.111898445 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:46:59 compute-0 nova_compute[182935]: 2026-01-21 23:46:59.681 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:01 compute-0 nova_compute[182935]: 2026-01-21 23:47:01.049 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:01 compute-0 ovn_controller[95047]: 2026-01-21T23:47:01Z|00057|binding|INFO|Claiming lport c16d8d18-6610-45c3-8172-54b8b99474ae for this chassis.
Jan 21 23:47:01 compute-0 ovn_controller[95047]: 2026-01-21T23:47:01Z|00058|binding|INFO|c16d8d18-6610-45c3-8172-54b8b99474ae: Claiming fa:16:3e:d8:d5:91 10.100.0.4
Jan 21 23:47:01 compute-0 ovn_controller[95047]: 2026-01-21T23:47:01Z|00059|binding|INFO|Claiming lport 32683c17-e027-4757-9a64-36df76fef381 for this chassis.
Jan 21 23:47:01 compute-0 ovn_controller[95047]: 2026-01-21T23:47:01Z|00060|binding|INFO|32683c17-e027-4757-9a64-36df76fef381: Claiming fa:16:3e:51:cc:06 19.80.0.41
Jan 21 23:47:01 compute-0 ovn_controller[95047]: 2026-01-21T23:47:01Z|00061|binding|INFO|Setting lport c16d8d18-6610-45c3-8172-54b8b99474ae up in Southbound
Jan 21 23:47:01 compute-0 ovn_controller[95047]: 2026-01-21T23:47:01Z|00062|binding|INFO|Setting lport 32683c17-e027-4757-9a64-36df76fef381 up in Southbound
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.002 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:d5:91 10.100.0.4'], port_security=['fa:16:3e:d8:d5:91 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1447043042', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b1080912-4a1f-4504-ae59-a0ad89963886', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1447043042', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '11', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c50c611d-d348-436f-bd12-bc6add278699, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c16d8d18-6610-45c3-8172-54b8b99474ae) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.004 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:cc:06 19.80.0.41'], port_security=['fa:16:3e:51:cc:06 19.80.0.41'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['c16d8d18-6610-45c3-8172-54b8b99474ae'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1528740689', 'neutron:cidrs': '19.80.0.41/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3af949ae-65f7-4e98-9b88-e75f765a8686', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1528740689', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=d251bc29-f047-44fe-b77c-1e7f2007e967, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=32683c17-e027-4757-9a64-36df76fef381) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.005 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c16d8d18-6610-45c3-8172-54b8b99474ae in datapath b7816b8e-52c1-4d60-84f7-524ebe7dfa5c bound to our chassis
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.007 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b7816b8e-52c1-4d60-84f7-524ebe7dfa5c
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.025 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[abc0c0dc-2683-4efb-8b23-a4776157d721]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.026 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb7816b8e-51 in ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.029 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb7816b8e-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.029 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3cd0d492-f490-45b0-8113-a8dd03dd2d13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.030 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[80a68d17-8947-4746-aefb-76c26cd59800]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.047 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[a0779fb3-3413-43e9-ac37-486ece0a4fb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.073 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b87ba5ff-84e3-41cb-8d53-ff3903912391]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.101 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9f854154-3b95-4168-9902-b479a7e91534]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.107 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[31803b74-c01c-4fc9-b107-d745591f3138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 NetworkManager[55139]: <info>  [1769039222.1090] manager: (tapb7816b8e-50): new Veth device (/org/freedesktop/NetworkManager/Devices/36)
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.137 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7b57a324-572b-4505-9851-f7b4ac53e929]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.139 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[896f15fb-617c-467e-872b-9e9e391ace23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 systemd-udevd[213698]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:02 compute-0 NetworkManager[55139]: <info>  [1769039222.1712] device (tapb7816b8e-50): carrier: link connected
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.177 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6cf93c-158a-4d5f-a2f8-39f3f995ea06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.194 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cbac8794-3646-4553-9c40-2112cbaa2b33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb7816b8e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:20:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372490, 'reachable_time': 44296, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213717, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.208 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e77479d4-802e-44e2-8e37-2b341143f63b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb1:20b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372490, 'tstamp': 372490}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213718, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.225 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[52ad1031-43e1-42fa-91ae-bae9758d30e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb7816b8e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b1:20:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372490, 'reachable_time': 44296, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213719, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.253 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b0731b72-81dd-4d27-bec1-d72312b7cf09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.319 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dfcc1c0f-e09b-43ef-b320-e4494f1509f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.321 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7816b8e-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.321 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.321 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7816b8e-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:02 compute-0 nova_compute[182935]: 2026-01-21 23:47:02.372 182939 INFO nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Post operation of migration started
Jan 21 23:47:02 compute-0 kernel: tapb7816b8e-50: entered promiscuous mode
Jan 21 23:47:02 compute-0 nova_compute[182935]: 2026-01-21 23:47:02.382 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:02 compute-0 NetworkManager[55139]: <info>  [1769039222.3832] manager: (tapb7816b8e-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.386 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb7816b8e-50, col_values=(('external_ids', {'iface-id': 'ecebff42-11cb-48b4-9c3d-966172998a49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:02 compute-0 ovn_controller[95047]: 2026-01-21T23:47:02Z|00063|binding|INFO|Releasing lport ecebff42-11cb-48b4-9c3d-966172998a49 from this chassis (sb_readonly=0)
Jan 21 23:47:02 compute-0 nova_compute[182935]: 2026-01-21 23:47:02.388 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.389 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.391 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bae05bc4-0847-4e41-93b1-737894cf11f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.391 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.pid.haproxy
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID b7816b8e-52c1-4d60-84f7-524ebe7dfa5c
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:47:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:02.392 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'env', 'PROCESS_TAG=haproxy-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b7816b8e-52c1-4d60-84f7-524ebe7dfa5c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:47:02 compute-0 nova_compute[182935]: 2026-01-21 23:47:02.400 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:02 compute-0 podman[213751]: 2026-01-21 23:47:02.811026647 +0000 UTC m=+0.066234874 container create 3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 23:47:02 compute-0 podman[213751]: 2026-01-21 23:47:02.772088286 +0000 UTC m=+0.027296493 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:47:02 compute-0 systemd[1]: Started libpod-conmon-3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842.scope.
Jan 21 23:47:02 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:47:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d73a0872e1456f1614d1e6de988bd5011f1887db645108a59c2e674e70dec0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:47:02 compute-0 podman[213751]: 2026-01-21 23:47:02.931358654 +0000 UTC m=+0.186566891 container init 3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:47:02 compute-0 podman[213751]: 2026-01-21 23:47:02.940045061 +0000 UTC m=+0.195253258 container start 3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 23:47:02 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213764]: [NOTICE]   (213771) : New worker (213773) forked
Jan 21 23:47:02 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213764]: [NOTICE]   (213771) : Loading success.
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.012 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 32683c17-e027-4757-9a64-36df76fef381 in datapath 3af949ae-65f7-4e98-9b88-e75f765a8686 unbound from our chassis
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.016 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3af949ae-65f7-4e98-9b88-e75f765a8686
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.034 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[98136bcd-bb77-4f78-bb26-39f6d1660920]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.036 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3af949ae-61 in ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.039 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3af949ae-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.039 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[efc366a5-c1e4-4cae-9046-e16a6d8cf29f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.040 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f95c55de-35ff-41b1-8597-4f1e0e5b07ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.062 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[7e173270-3495-4232-bc85-24658634e5a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.083 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0faceb-5b1a-4f23-882c-f5981a5b17ed]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.120 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c93cb92c-83b0-49ee-86cb-0ec3eee4eb6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 systemd-udevd[213706]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.142 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[174aa55d-2b73-4e22-83df-2844891cdd3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 NetworkManager[55139]: <info>  [1769039223.1434] manager: (tap3af949ae-60): new Veth device (/org/freedesktop/NetworkManager/Devices/38)
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.180 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[bfaad1fd-b36d-4c19-99d3-72febf7865da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.183 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.183 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.184 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.187 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8669f47f-f895-40a1-8302-ebf86be444eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 NetworkManager[55139]: <info>  [1769039223.2205] device (tap3af949ae-60): carrier: link connected
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.227 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[838f016b-04ec-4f60-9a7d-a879a38ad7ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 nova_compute[182935]: 2026-01-21 23:47:03.428 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:03 compute-0 nova_compute[182935]: 2026-01-21 23:47:03.429 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquired lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:03 compute-0 nova_compute[182935]: 2026-01-21 23:47:03.430 182939 DEBUG nova.network.neutron [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.446 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3f955828-ed1d-4bbd-9aa3-72ee5139a4f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3af949ae-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:66:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372595, 'reachable_time': 15718, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213792, 'error': None, 'target': 'ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.465 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ff96cc4b-5411-41a4-bd29-437941c1fafe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe49:668d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372595, 'tstamp': 372595}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213793, 'error': None, 'target': 'ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.490 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f3a09abc-d902-48f7-a58f-16a329c88655]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3af949ae-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:49:66:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372595, 'reachable_time': 15718, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213794, 'error': None, 'target': 'ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.531 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0efbe77e-efa7-4c6f-b584-ec3546622a29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.637 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cdfac624-6fe7-4558-a7b0-7822def3f95d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.639 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3af949ae-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.639 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.640 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3af949ae-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:03 compute-0 nova_compute[182935]: 2026-01-21 23:47:03.642 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:03 compute-0 kernel: tap3af949ae-60: entered promiscuous mode
Jan 21 23:47:03 compute-0 NetworkManager[55139]: <info>  [1769039223.6435] manager: (tap3af949ae-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 21 23:47:03 compute-0 nova_compute[182935]: 2026-01-21 23:47:03.645 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.648 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3af949ae-60, col_values=(('external_ids', {'iface-id': 'da91e802-47a1-4124-a7f9-83eac4382374'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:03 compute-0 nova_compute[182935]: 2026-01-21 23:47:03.649 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:03 compute-0 ovn_controller[95047]: 2026-01-21T23:47:03Z|00064|binding|INFO|Releasing lport da91e802-47a1-4124-a7f9-83eac4382374 from this chassis (sb_readonly=0)
Jan 21 23:47:03 compute-0 nova_compute[182935]: 2026-01-21 23:47:03.674 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.677 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3af949ae-65f7-4e98-9b88-e75f765a8686.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3af949ae-65f7-4e98-9b88-e75f765a8686.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.678 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[65283bdf-cb73-4a2a-a9b7-290169273a21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.679 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-3af949ae-65f7-4e98-9b88-e75f765a8686
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/3af949ae-65f7-4e98-9b88-e75f765a8686.pid.haproxy
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 3af949ae-65f7-4e98-9b88-e75f765a8686
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:03.681 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686', 'env', 'PROCESS_TAG=haproxy-3af949ae-65f7-4e98-9b88-e75f765a8686', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3af949ae-65f7-4e98-9b88-e75f765a8686.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:47:04 compute-0 podman[213827]: 2026-01-21 23:47:04.197696496 +0000 UTC m=+0.059448971 container create 94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 21 23:47:04 compute-0 systemd[1]: Started libpod-conmon-94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c.scope.
Jan 21 23:47:04 compute-0 podman[213827]: 2026-01-21 23:47:04.17315242 +0000 UTC m=+0.034904945 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:47:04 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:47:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5da3b4dd3ac1438b8ffd91b369373c6ab733234b145e9a15eef639085db26b60/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:47:04 compute-0 podman[213827]: 2026-01-21 23:47:04.301012917 +0000 UTC m=+0.162765412 container init 94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:47:04 compute-0 podman[213827]: 2026-01-21 23:47:04.306476557 +0000 UTC m=+0.168229032 container start 94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 21 23:47:04 compute-0 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213842]: [NOTICE]   (213846) : New worker (213848) forked
Jan 21 23:47:04 compute-0 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213842]: [NOTICE]   (213846) : Loading success.
Jan 21 23:47:04 compute-0 nova_compute[182935]: 2026-01-21 23:47:04.685 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:04 compute-0 nova_compute[182935]: 2026-01-21 23:47:04.879 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:04 compute-0 nova_compute[182935]: 2026-01-21 23:47:04.879 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:04 compute-0 nova_compute[182935]: 2026-01-21 23:47:04.902 182939 DEBUG nova.compute.manager [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.022 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.023 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.031 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.032 182939 INFO nova.compute.claims [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.477 182939 DEBUG nova.compute.provider_tree [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.502 182939 DEBUG nova.scheduler.client.report [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.529 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.506s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.530 182939 DEBUG nova.compute.manager [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.593 182939 DEBUG nova.compute.manager [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.594 182939 DEBUG nova.network.neutron [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.615 182939 INFO nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.635 182939 DEBUG nova.compute.manager [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.845 182939 DEBUG nova.compute.manager [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.846 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.846 182939 INFO nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Creating image(s)
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.847 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "/var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.847 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.848 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.863 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.962 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.963 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.963 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:05 compute-0 nova_compute[182935]: 2026-01-21 23:47:05.974 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.051 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.059 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.059 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.094 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.095 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.096 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.151 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.153 182939 DEBUG nova.virt.disk.api [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Checking if we can resize image /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.153 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.205 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.207 182939 DEBUG nova.virt.disk.api [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Cannot resize image /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.207 182939 DEBUG nova.objects.instance [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'migration_context' on Instance uuid efc683b9-a8d9-4a67-bb19-aeaabfbd5423 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.236 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.237 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Ensure instance console log exists: /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.237 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.240 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.240 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.366 182939 DEBUG nova.network.neutron [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Updating instance_info_cache with network_info: [{"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.387 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Releasing lock "refresh_cache-b1080912-4a1f-4504-ae59-a0ad89963886" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.434 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.435 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.435 182939 DEBUG oslo_concurrency.lockutils [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.440 182939 INFO nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 21 23:47:06 compute-0 virtqemud[182477]: Domain id=9 name='instance-00000011' uuid=b1080912-4a1f-4504-ae59-a0ad89963886 is tainted: custom-monitor
Jan 21 23:47:06 compute-0 nova_compute[182935]: 2026-01-21 23:47:06.680 182939 DEBUG nova.policy [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4a6034ff39094b6486bac680b7ed5a57', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:47:06 compute-0 podman[213888]: 2026-01-21 23:47:06.759795077 +0000 UTC m=+0.108187817 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:47:06 compute-0 podman[213909]: 2026-01-21 23:47:06.903924033 +0000 UTC m=+0.090696250 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Jan 21 23:47:07 compute-0 nova_compute[182935]: 2026-01-21 23:47:07.452 182939 INFO nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 21 23:47:07 compute-0 nova_compute[182935]: 2026-01-21 23:47:07.471 182939 DEBUG nova.network.neutron [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Successfully created port: 07de181e-ac7b-4c3f-826a-3b63c1bdb993 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:47:07 compute-0 ovn_controller[95047]: 2026-01-21T23:47:07Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0d:1c:84 10.100.0.8
Jan 21 23:47:07 compute-0 ovn_controller[95047]: 2026-01-21T23:47:07Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0d:1c:84 10.100.0.8
Jan 21 23:47:08 compute-0 nova_compute[182935]: 2026-01-21 23:47:08.439 182939 DEBUG nova.network.neutron [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Successfully updated port: 07de181e-ac7b-4c3f-826a-3b63c1bdb993 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:47:08 compute-0 nova_compute[182935]: 2026-01-21 23:47:08.459 182939 INFO nova.virt.libvirt.driver [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 21 23:47:08 compute-0 nova_compute[182935]: 2026-01-21 23:47:08.473 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "refresh_cache-efc683b9-a8d9-4a67-bb19-aeaabfbd5423" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:08 compute-0 nova_compute[182935]: 2026-01-21 23:47:08.474 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquired lock "refresh_cache-efc683b9-a8d9-4a67-bb19-aeaabfbd5423" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:08 compute-0 nova_compute[182935]: 2026-01-21 23:47:08.474 182939 DEBUG nova.network.neutron [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:47:08 compute-0 nova_compute[182935]: 2026-01-21 23:47:08.616 182939 DEBUG nova.compute.manager [req-02c6d0c7-f362-4b65-91a8-3479347e01f8 req-672ef10b-3eda-48d0-9c2d-8768907945fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received event network-changed-07de181e-ac7b-4c3f-826a-3b63c1bdb993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:08 compute-0 nova_compute[182935]: 2026-01-21 23:47:08.616 182939 DEBUG nova.compute.manager [req-02c6d0c7-f362-4b65-91a8-3479347e01f8 req-672ef10b-3eda-48d0-9c2d-8768907945fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Refreshing instance network info cache due to event network-changed-07de181e-ac7b-4c3f-826a-3b63c1bdb993. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:47:08 compute-0 nova_compute[182935]: 2026-01-21 23:47:08.617 182939 DEBUG oslo_concurrency.lockutils [req-02c6d0c7-f362-4b65-91a8-3479347e01f8 req-672ef10b-3eda-48d0-9c2d-8768907945fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-efc683b9-a8d9-4a67-bb19-aeaabfbd5423" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:08 compute-0 nova_compute[182935]: 2026-01-21 23:47:08.709 182939 DEBUG nova.network.neutron [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:47:08 compute-0 nova_compute[182935]: 2026-01-21 23:47:08.865 182939 DEBUG nova.compute.manager [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:08 compute-0 nova_compute[182935]: 2026-01-21 23:47:08.888 182939 DEBUG nova.objects.instance [None req-eb06dbd1-2873-4271-8d09-c8c9ae32bda6 3f3d277fcbf24af7b1dc18c283ed60b3 94733cdd1eb94db1ae000cbc4537c9b6 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.686 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.906 182939 DEBUG nova.network.neutron [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Updating instance_info_cache with network_info: [{"id": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "address": "fa:16:3e:69:70:45", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07de181e-ac", "ovs_interfaceid": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.934 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Releasing lock "refresh_cache-efc683b9-a8d9-4a67-bb19-aeaabfbd5423" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.935 182939 DEBUG nova.compute.manager [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Instance network_info: |[{"id": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "address": "fa:16:3e:69:70:45", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07de181e-ac", "ovs_interfaceid": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.936 182939 DEBUG oslo_concurrency.lockutils [req-02c6d0c7-f362-4b65-91a8-3479347e01f8 req-672ef10b-3eda-48d0-9c2d-8768907945fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-efc683b9-a8d9-4a67-bb19-aeaabfbd5423" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.936 182939 DEBUG nova.network.neutron [req-02c6d0c7-f362-4b65-91a8-3479347e01f8 req-672ef10b-3eda-48d0-9c2d-8768907945fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Refreshing network info cache for port 07de181e-ac7b-4c3f-826a-3b63c1bdb993 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.940 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Start _get_guest_xml network_info=[{"id": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "address": "fa:16:3e:69:70:45", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07de181e-ac", "ovs_interfaceid": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.945 182939 WARNING nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.952 182939 DEBUG nova.virt.libvirt.host [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.952 182939 DEBUG nova.virt.libvirt.host [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.961 182939 DEBUG nova.virt.libvirt.host [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.962 182939 DEBUG nova.virt.libvirt.host [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.964 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.965 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.965 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.966 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.966 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.967 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.967 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.968 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.968 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.969 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.969 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.969 182939 DEBUG nova.virt.hardware [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.974 182939 DEBUG nova.virt.libvirt.vif [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:47:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-466710194',display_name='tempest-ServersAdminTestJSON-server-466710194',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-466710194',id=20,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-85hksn2t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:05Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=efc683b9-a8d9-4a67-bb19-aeaabfbd5423,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "address": "fa:16:3e:69:70:45", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07de181e-ac", "ovs_interfaceid": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.975 182939 DEBUG nova.network.os_vif_util [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "address": "fa:16:3e:69:70:45", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07de181e-ac", "ovs_interfaceid": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.976 182939 DEBUG nova.network.os_vif_util [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:70:45,bridge_name='br-int',has_traffic_filtering=True,id=07de181e-ac7b-4c3f-826a-3b63c1bdb993,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07de181e-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.977 182939 DEBUG nova.objects.instance [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'pci_devices' on Instance uuid efc683b9-a8d9-4a67-bb19-aeaabfbd5423 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.994 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:47:09 compute-0 nova_compute[182935]:   <uuid>efc683b9-a8d9-4a67-bb19-aeaabfbd5423</uuid>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   <name>instance-00000014</name>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersAdminTestJSON-server-466710194</nova:name>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:47:09</nova:creationTime>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:47:09 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:47:09 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:47:09 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:47:09 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:47:09 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:47:09 compute-0 nova_compute[182935]:         <nova:user uuid="4a6034ff39094b6486bac680b7ed5a57">tempest-ServersAdminTestJSON-1815099341-project-member</nova:user>
Jan 21 23:47:09 compute-0 nova_compute[182935]:         <nova:project uuid="4d40fc03fb534b5689415f3d8a3de1fc">tempest-ServersAdminTestJSON-1815099341</nova:project>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:47:09 compute-0 nova_compute[182935]:         <nova:port uuid="07de181e-ac7b-4c3f-826a-3b63c1bdb993">
Jan 21 23:47:09 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <system>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <entry name="serial">efc683b9-a8d9-4a67-bb19-aeaabfbd5423</entry>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <entry name="uuid">efc683b9-a8d9-4a67-bb19-aeaabfbd5423</entry>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     </system>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   <os>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   </os>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   <features>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   </features>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk.config"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:69:70:45"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <target dev="tap07de181e-ac"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/console.log" append="off"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <video>
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     </video>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:47:09 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:47:09 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:47:09 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:47:09 compute-0 nova_compute[182935]: </domain>
Jan 21 23:47:09 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.996 182939 DEBUG nova.compute.manager [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Preparing to wait for external event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.996 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.997 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:09 compute-0 nova_compute[182935]: 2026-01-21 23:47:09.997 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.000 182939 DEBUG nova.virt.libvirt.vif [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:47:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-466710194',display_name='tempest-ServersAdminTestJSON-server-466710194',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-466710194',id=20,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-85hksn2t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:05Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=efc683b9-a8d9-4a67-bb19-aeaabfbd5423,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "address": "fa:16:3e:69:70:45", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07de181e-ac", "ovs_interfaceid": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.000 182939 DEBUG nova.network.os_vif_util [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "address": "fa:16:3e:69:70:45", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07de181e-ac", "ovs_interfaceid": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.001 182939 DEBUG nova.network.os_vif_util [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:70:45,bridge_name='br-int',has_traffic_filtering=True,id=07de181e-ac7b-4c3f-826a-3b63c1bdb993,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07de181e-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.002 182939 DEBUG os_vif [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:70:45,bridge_name='br-int',has_traffic_filtering=True,id=07de181e-ac7b-4c3f-826a-3b63c1bdb993,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07de181e-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.003 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.004 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.004 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.009 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.009 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap07de181e-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.010 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap07de181e-ac, col_values=(('external_ids', {'iface-id': '07de181e-ac7b-4c3f-826a-3b63c1bdb993', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:69:70:45', 'vm-uuid': 'efc683b9-a8d9-4a67-bb19-aeaabfbd5423'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.012 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:10 compute-0 NetworkManager[55139]: <info>  [1769039230.0140] manager: (tap07de181e-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.016 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.026 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.028 182939 INFO os_vif [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:70:45,bridge_name='br-int',has_traffic_filtering=True,id=07de181e-ac7b-4c3f-826a-3b63c1bdb993,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07de181e-ac')
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.123 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.123 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.124 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No VIF found with MAC fa:16:3e:69:70:45, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:47:10 compute-0 nova_compute[182935]: 2026-01-21 23:47:10.125 182939 INFO nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Using config drive
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.211 182939 INFO nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Creating config drive at /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk.config
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.216 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoo5wd57u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.348 182939 DEBUG oslo_concurrency.processutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoo5wd57u" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:11 compute-0 kernel: tap07de181e-ac: entered promiscuous mode
Jan 21 23:47:11 compute-0 NetworkManager[55139]: <info>  [1769039231.4144] manager: (tap07de181e-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.416 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:11 compute-0 ovn_controller[95047]: 2026-01-21T23:47:11Z|00065|binding|INFO|Claiming lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 for this chassis.
Jan 21 23:47:11 compute-0 ovn_controller[95047]: 2026-01-21T23:47:11Z|00066|binding|INFO|07de181e-ac7b-4c3f-826a-3b63c1bdb993: Claiming fa:16:3e:69:70:45 10.100.0.5
Jan 21 23:47:11 compute-0 ovn_controller[95047]: 2026-01-21T23:47:11Z|00067|binding|INFO|Setting lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 ovn-installed in OVS
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.432 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:11 compute-0 ovn_controller[95047]: 2026-01-21T23:47:11Z|00068|binding|INFO|Setting lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 up in Southbound
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.437 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:70:45 10.100.0.5'], port_security=['fa:16:3e:69:70:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'efc683b9-a8d9-4a67-bb19-aeaabfbd5423', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=07de181e-ac7b-4c3f-826a-3b63c1bdb993) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.438 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.440 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 07de181e-ac7b-4c3f-826a-3b63c1bdb993 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 bound to our chassis
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.443 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:47:11 compute-0 systemd-udevd[213950]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:11 compute-0 systemd-machined[154182]: New machine qemu-10-instance-00000014.
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.467 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1333323c-efce-448a-95cb-2d1d6e47ba1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:11 compute-0 NetworkManager[55139]: <info>  [1769039231.4716] device (tap07de181e-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:47:11 compute-0 NetworkManager[55139]: <info>  [1769039231.4724] device (tap07de181e-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:47:11 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-00000014.
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.501 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[5f01db97-d975-4b00-8322-5703e02b8ee0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.505 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[36c77842-5dab-4bc1-9538-f5db1ed41507]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.539 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a80630-69cd-4220-ab7c-161e5324dfb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.558 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[da31b445-131a-41d3-afb8-8d5bd97c1b8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371486, 'reachable_time': 18847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213964, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.581 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8604d890-0ef7-49d1-ad84-3b1c57747706]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371506, 'tstamp': 371506}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213966, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371510, 'tstamp': 371510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213966, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.583 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.604 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.606 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1530a22a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.606 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.606 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.606 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1530a22a-f0, col_values=(('external_ids', {'iface-id': '1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:11.607 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.951 182939 DEBUG nova.compute.manager [req-1ea2fd20-710d-4b29-b3e1-89dea72fc736 req-adcb0bb4-b530-4249-aa0f-b9c4111b9fb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.952 182939 DEBUG oslo_concurrency.lockutils [req-1ea2fd20-710d-4b29-b3e1-89dea72fc736 req-adcb0bb4-b530-4249-aa0f-b9c4111b9fb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.953 182939 DEBUG oslo_concurrency.lockutils [req-1ea2fd20-710d-4b29-b3e1-89dea72fc736 req-adcb0bb4-b530-4249-aa0f-b9c4111b9fb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.953 182939 DEBUG oslo_concurrency.lockutils [req-1ea2fd20-710d-4b29-b3e1-89dea72fc736 req-adcb0bb4-b530-4249-aa0f-b9c4111b9fb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:11 compute-0 nova_compute[182935]: 2026-01-21 23:47:11.953 182939 DEBUG nova.compute.manager [req-1ea2fd20-710d-4b29-b3e1-89dea72fc736 req-adcb0bb4-b530-4249-aa0f-b9c4111b9fb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Processing event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.128 182939 DEBUG nova.compute.manager [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.130 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039232.1279175, efc683b9-a8d9-4a67-bb19-aeaabfbd5423 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.130 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] VM Started (Lifecycle Event)
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.135 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.139 182939 INFO nova.virt.libvirt.driver [-] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Instance spawned successfully.
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.140 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.156 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.162 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.165 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.166 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.166 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.166 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.167 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.167 182939 DEBUG nova.virt.libvirt.driver [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.193 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.194 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039232.134063, efc683b9-a8d9-4a67-bb19-aeaabfbd5423 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.194 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] VM Paused (Lifecycle Event)
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.222 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.227 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039232.1358986, efc683b9-a8d9-4a67-bb19-aeaabfbd5423 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.227 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] VM Resumed (Lifecycle Event)
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.242 182939 INFO nova.compute.manager [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Took 6.40 seconds to spawn the instance on the hypervisor.
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.243 182939 DEBUG nova.compute.manager [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.249 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.251 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.293 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.352 182939 INFO nova.compute.manager [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Took 7.38 seconds to build instance.
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.372 182939 DEBUG oslo_concurrency.lockutils [None req-794c96e4-4b07-44ec-a5d6-a10086d19abe 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:12 compute-0 nova_compute[182935]: 2026-01-21 23:47:12.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.538 182939 DEBUG nova.network.neutron [req-02c6d0c7-f362-4b65-91a8-3479347e01f8 req-672ef10b-3eda-48d0-9c2d-8768907945fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Updated VIF entry in instance network info cache for port 07de181e-ac7b-4c3f-826a-3b63c1bdb993. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.539 182939 DEBUG nova.network.neutron [req-02c6d0c7-f362-4b65-91a8-3479347e01f8 req-672ef10b-3eda-48d0-9c2d-8768907945fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Updating instance_info_cache with network_info: [{"id": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "address": "fa:16:3e:69:70:45", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07de181e-ac", "ovs_interfaceid": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.566 182939 DEBUG oslo_concurrency.lockutils [req-02c6d0c7-f362-4b65-91a8-3479347e01f8 req-672ef10b-3eda-48d0-9c2d-8768907945fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-efc683b9-a8d9-4a67-bb19-aeaabfbd5423" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.724 182939 DEBUG oslo_concurrency.lockutils [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.724 182939 DEBUG oslo_concurrency.lockutils [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.725 182939 DEBUG oslo_concurrency.lockutils [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.725 182939 DEBUG oslo_concurrency.lockutils [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.725 182939 DEBUG oslo_concurrency.lockutils [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.740 182939 INFO nova.compute.manager [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Terminating instance
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.752 182939 DEBUG nova.compute.manager [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:47:13 compute-0 kernel: tapc16d8d18-66 (unregistering): left promiscuous mode
Jan 21 23:47:13 compute-0 NetworkManager[55139]: <info>  [1769039233.7823] device (tapc16d8d18-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.796 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:13 compute-0 ovn_controller[95047]: 2026-01-21T23:47:13Z|00069|binding|INFO|Releasing lport c16d8d18-6610-45c3-8172-54b8b99474ae from this chassis (sb_readonly=0)
Jan 21 23:47:13 compute-0 ovn_controller[95047]: 2026-01-21T23:47:13Z|00070|binding|INFO|Setting lport c16d8d18-6610-45c3-8172-54b8b99474ae down in Southbound
Jan 21 23:47:13 compute-0 ovn_controller[95047]: 2026-01-21T23:47:13Z|00071|binding|INFO|Releasing lport 32683c17-e027-4757-9a64-36df76fef381 from this chassis (sb_readonly=0)
Jan 21 23:47:13 compute-0 ovn_controller[95047]: 2026-01-21T23:47:13Z|00072|binding|INFO|Setting lport 32683c17-e027-4757-9a64-36df76fef381 down in Southbound
Jan 21 23:47:13 compute-0 ovn_controller[95047]: 2026-01-21T23:47:13Z|00073|binding|INFO|Removing iface tapc16d8d18-66 ovn-installed in OVS
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.810 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:13 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:13.812 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:d5:91 10.100.0.4'], port_security=['fa:16:3e:d8:d5:91 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1447043042', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b1080912-4a1f-4504-ae59-a0ad89963886', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1447043042', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '13', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c50c611d-d348-436f-bd12-bc6add278699, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c16d8d18-6610-45c3-8172-54b8b99474ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.813 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:13 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:13.816 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:cc:06 19.80.0.41'], port_security=['fa:16:3e:51:cc:06 19.80.0.41'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['c16d8d18-6610-45c3-8172-54b8b99474ae'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1528740689', 'neutron:cidrs': '19.80.0.41/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3af949ae-65f7-4e98-9b88-e75f765a8686', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1528740689', 'neutron:project_id': '1298204af0f241dc8b63851b2046cf5c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '4fca0662-11c4-4183-96b8-546eae3304ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=d251bc29-f047-44fe-b77c-1e7f2007e967, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=32683c17-e027-4757-9a64-36df76fef381) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:13 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:13.820 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c16d8d18-6610-45c3-8172-54b8b99474ae in datapath b7816b8e-52c1-4d60-84f7-524ebe7dfa5c unbound from our chassis
Jan 21 23:47:13 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:13.824 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:47:13 compute-0 nova_compute[182935]: 2026-01-21 23:47:13.826 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:13 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:13.827 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c115b7-e706-42a7-929a-ec4e6896aa48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:13 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:13.828 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c namespace which is not needed anymore
Jan 21 23:47:13 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000011.scope: Deactivated successfully.
Jan 21 23:47:13 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000011.scope: Consumed 2.178s CPU time.
Jan 21 23:47:13 compute-0 systemd-machined[154182]: Machine qemu-9-instance-00000011 terminated.
Jan 21 23:47:14 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213764]: [NOTICE]   (213771) : haproxy version is 2.8.14-c23fe91
Jan 21 23:47:14 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213764]: [NOTICE]   (213771) : path to executable is /usr/sbin/haproxy
Jan 21 23:47:14 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213764]: [WARNING]  (213771) : Exiting Master process...
Jan 21 23:47:14 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213764]: [WARNING]  (213771) : Exiting Master process...
Jan 21 23:47:14 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213764]: [ALERT]    (213771) : Current worker (213773) exited with code 143 (Terminated)
Jan 21 23:47:14 compute-0 neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c[213764]: [WARNING]  (213771) : All workers exited. Exiting... (0)
Jan 21 23:47:14 compute-0 systemd[1]: libpod-3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842.scope: Deactivated successfully.
Jan 21 23:47:14 compute-0 podman[213995]: 2026-01-21 23:47:14.029232673 +0000 UTC m=+0.074678616 container died 3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.032 182939 INFO nova.virt.libvirt.driver [-] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Instance destroyed successfully.
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.033 182939 DEBUG nova.objects.instance [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lazy-loading 'resources' on Instance uuid b1080912-4a1f-4504-ae59-a0ad89963886 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.056 182939 DEBUG nova.virt.libvirt.vif [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:46:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1283276848',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1283276848',id=17,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:46:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1298204af0f241dc8b63851b2046cf5c',ramdisk_id='',reservation_id='r-b25ryamg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1'
,image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1063342224',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1063342224-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:47:08Z,user_data=None,user_id='553fdc065acf4000a185abac43878ab4',uuid=b1080912-4a1f-4504-ae59-a0ad89963886,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.057 182939 DEBUG nova.network.os_vif_util [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Converting VIF {"id": "c16d8d18-6610-45c3-8172-54b8b99474ae", "address": "fa:16:3e:d8:d5:91", "network": {"id": "b7816b8e-52c1-4d60-84f7-524ebe7dfa5c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1363967197-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1298204af0f241dc8b63851b2046cf5c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc16d8d18-66", "ovs_interfaceid": "c16d8d18-6610-45c3-8172-54b8b99474ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842-userdata-shm.mount: Deactivated successfully.
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.058 182939 DEBUG nova.network.os_vif_util [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.058 182939 DEBUG os_vif [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:47:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-41d73a0872e1456f1614d1e6de988bd5011f1887db645108a59c2e674e70dec0-merged.mount: Deactivated successfully.
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.061 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.061 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc16d8d18-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.069 182939 DEBUG nova.compute.manager [req-882d4d06-a1d2-499a-89d1-6c14a849b43e req-a05472be-0640-453c-bf51-3aee20f363d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.070 182939 DEBUG oslo_concurrency.lockutils [req-882d4d06-a1d2-499a-89d1-6c14a849b43e req-a05472be-0640-453c-bf51-3aee20f363d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.070 182939 DEBUG oslo_concurrency.lockutils [req-882d4d06-a1d2-499a-89d1-6c14a849b43e req-a05472be-0640-453c-bf51-3aee20f363d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.070 182939 DEBUG oslo_concurrency.lockutils [req-882d4d06-a1d2-499a-89d1-6c14a849b43e req-a05472be-0640-453c-bf51-3aee20f363d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.071 182939 DEBUG nova.compute.manager [req-882d4d06-a1d2-499a-89d1-6c14a849b43e req-a05472be-0640-453c-bf51-3aee20f363d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] No waiting events found dispatching network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.072 182939 WARNING nova.compute.manager [req-882d4d06-a1d2-499a-89d1-6c14a849b43e req-a05472be-0640-453c-bf51-3aee20f363d9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received unexpected event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 for instance with vm_state active and task_state None.
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.109 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.112 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:14 compute-0 podman[213995]: 2026-01-21 23:47:14.112588966 +0000 UTC m=+0.158034899 container cleanup 3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.115 182939 INFO os_vif [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:d5:91,bridge_name='br-int',has_traffic_filtering=True,id=c16d8d18-6610-45c3-8172-54b8b99474ae,network=Network(b7816b8e-52c1-4d60-84f7-524ebe7dfa5c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc16d8d18-66')
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.116 182939 INFO nova.virt.libvirt.driver [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Deleting instance files /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886_del
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.117 182939 INFO nova.virt.libvirt.driver [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Deletion of /var/lib/nova/instances/b1080912-4a1f-4504-ae59-a0ad89963886_del complete
Jan 21 23:47:14 compute-0 systemd[1]: libpod-conmon-3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842.scope: Deactivated successfully.
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.185 182939 INFO nova.compute.manager [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.186 182939 DEBUG oslo.service.loopingcall [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.186 182939 DEBUG nova.compute.manager [-] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.186 182939 DEBUG nova.network.neutron [-] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:47:14 compute-0 podman[214037]: 2026-01-21 23:47:14.20271094 +0000 UTC m=+0.057199318 container remove 3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.208 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[462d5ee4-7a4c-4972-a80d-fef9374b53f2]: (4, ('Wed Jan 21 11:47:13 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c (3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842)\n3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842\nWed Jan 21 11:47:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c (3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842)\n3ce4baa40c0494816f852622138c56122842a1f49ee4f390cf0f144a0f3fa842\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.210 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e25cf8cf-3e85-4eaa-8f91-73e4d92ba10a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.211 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7816b8e-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.213 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:14 compute-0 kernel: tapb7816b8e-50: left promiscuous mode
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.220 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.224 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0d4f65ac-2e49-4211-a8ea-5b1a3fceb773]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.232 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.245 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a84799bd-214c-4828-8b94-e9c6ba56e3f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.246 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ed952b5e-366a-45c3-9d10-2ff419b66c67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.267 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[675d4d24-7323-4066-ab0d-94954db04cd3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372483, 'reachable_time': 40136, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214053, 'error': None, 'target': 'ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.271 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b7816b8e-52c1-4d60-84f7-524ebe7dfa5c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.271 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[46a8963c-812f-43c8-8d13-ca7f52d782a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 systemd[1]: run-netns-ovnmeta\x2db7816b8e\x2d52c1\x2d4d60\x2d84f7\x2d524ebe7dfa5c.mount: Deactivated successfully.
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.273 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 32683c17-e027-4757-9a64-36df76fef381 in datapath 3af949ae-65f7-4e98-9b88-e75f765a8686 unbound from our chassis
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.276 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3af949ae-65f7-4e98-9b88-e75f765a8686, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.278 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[098a29ce-f9e9-4ad3-8fc5-994267d36762]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.278 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686 namespace which is not needed anymore
Jan 21 23:47:14 compute-0 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213842]: [NOTICE]   (213846) : haproxy version is 2.8.14-c23fe91
Jan 21 23:47:14 compute-0 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213842]: [NOTICE]   (213846) : path to executable is /usr/sbin/haproxy
Jan 21 23:47:14 compute-0 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213842]: [WARNING]  (213846) : Exiting Master process...
Jan 21 23:47:14 compute-0 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213842]: [ALERT]    (213846) : Current worker (213848) exited with code 143 (Terminated)
Jan 21 23:47:14 compute-0 neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686[213842]: [WARNING]  (213846) : All workers exited. Exiting... (0)
Jan 21 23:47:14 compute-0 systemd[1]: libpod-94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c.scope: Deactivated successfully.
Jan 21 23:47:14 compute-0 podman[214072]: 2026-01-21 23:47:14.426398258 +0000 UTC m=+0.053982992 container died 94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:47:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c-userdata-shm.mount: Deactivated successfully.
Jan 21 23:47:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-5da3b4dd3ac1438b8ffd91b369373c6ab733234b145e9a15eef639085db26b60-merged.mount: Deactivated successfully.
Jan 21 23:47:14 compute-0 podman[214072]: 2026-01-21 23:47:14.482654392 +0000 UTC m=+0.110239116 container cleanup 94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 23:47:14 compute-0 systemd[1]: libpod-conmon-94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c.scope: Deactivated successfully.
Jan 21 23:47:14 compute-0 podman[214102]: 2026-01-21 23:47:14.565570044 +0000 UTC m=+0.054618266 container remove 94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.575 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[98edeb8f-a63b-4162-9ae7-7717e4612089]: (4, ('Wed Jan 21 11:47:14 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686 (94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c)\n94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c\nWed Jan 21 11:47:14 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686 (94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c)\n94621e6eb78a2fbd40817b6f4d5e4d2bb938b128607963374aff7ca970a86d1c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.577 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7c6cc0bd-9444-4e1f-b96f-a856ba75fc96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.578 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3af949ae-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.581 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:14 compute-0 kernel: tap3af949ae-60: left promiscuous mode
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.607 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.612 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[697a45bd-8a2f-4730-8b24-30c4fb08f205]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.627 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf1eb4d-1134-4023-9600-b0b1b7cc9c30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.629 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7d1e0642-1ef0-4e00-83df-f5ff3eaa02d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.653 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc1454f-c71f-40db-96e9-d54f102b349e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372585, 'reachable_time': 21093, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214115, 'error': None, 'target': 'ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.656 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3af949ae-65f7-4e98-9b88-e75f765a8686 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:47:14 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:14.656 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[23353de3-5c07-4763-a7e6-81eeb9e069d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.688 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:14 compute-0 nova_compute[182935]: 2026-01-21 23:47:14.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 23:47:15 compute-0 systemd[1]: run-netns-ovnmeta\x2d3af949ae\x2d65f7\x2d4e98\x2d9b88\x2de75f765a8686.mount: Deactivated successfully.
Jan 21 23:47:15 compute-0 nova_compute[182935]: 2026-01-21 23:47:15.548 182939 DEBUG nova.compute.manager [req-c21746e2-f306-481e-b562-c2e1a8ffc112 req-5889985c-50dd-468a-980b-f961dfda473e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:15 compute-0 nova_compute[182935]: 2026-01-21 23:47:15.548 182939 DEBUG oslo_concurrency.lockutils [req-c21746e2-f306-481e-b562-c2e1a8ffc112 req-5889985c-50dd-468a-980b-f961dfda473e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:15 compute-0 nova_compute[182935]: 2026-01-21 23:47:15.549 182939 DEBUG oslo_concurrency.lockutils [req-c21746e2-f306-481e-b562-c2e1a8ffc112 req-5889985c-50dd-468a-980b-f961dfda473e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:15 compute-0 nova_compute[182935]: 2026-01-21 23:47:15.549 182939 DEBUG oslo_concurrency.lockutils [req-c21746e2-f306-481e-b562-c2e1a8ffc112 req-5889985c-50dd-468a-980b-f961dfda473e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:15 compute-0 nova_compute[182935]: 2026-01-21 23:47:15.550 182939 DEBUG nova.compute.manager [req-c21746e2-f306-481e-b562-c2e1a8ffc112 req-5889985c-50dd-468a-980b-f961dfda473e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] No waiting events found dispatching network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:15 compute-0 nova_compute[182935]: 2026-01-21 23:47:15.550 182939 DEBUG nova.compute.manager [req-c21746e2-f306-481e-b562-c2e1a8ffc112 req-5889985c-50dd-468a-980b-f961dfda473e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-unplugged-c16d8d18-6610-45c3-8172-54b8b99474ae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:47:16 compute-0 nova_compute[182935]: 2026-01-21 23:47:16.866 182939 DEBUG nova.network.neutron [-] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:16 compute-0 nova_compute[182935]: 2026-01-21 23:47:16.901 182939 INFO nova.compute.manager [-] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Took 2.71 seconds to deallocate network for instance.
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.035 182939 DEBUG oslo_concurrency.lockutils [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.036 182939 DEBUG oslo_concurrency.lockutils [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.042 182939 DEBUG oslo_concurrency.lockutils [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.247 182939 INFO nova.scheduler.client.report [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Deleted allocations for instance b1080912-4a1f-4504-ae59-a0ad89963886
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.455 182939 DEBUG oslo_concurrency.lockutils [None req-7f288855-ce2b-407b-bed9-b2781f2b8753 553fdc065acf4000a185abac43878ab4 1298204af0f241dc8b63851b2046cf5c - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.703 182939 DEBUG nova.compute.manager [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.704 182939 DEBUG oslo_concurrency.lockutils [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.704 182939 DEBUG oslo_concurrency.lockutils [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.704 182939 DEBUG oslo_concurrency.lockutils [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1080912-4a1f-4504-ae59-a0ad89963886-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.704 182939 DEBUG nova.compute.manager [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] No waiting events found dispatching network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.704 182939 WARNING nova.compute.manager [req-da943313-3a3e-4fdd-b1bc-19bc801689ce req-480baee5-1be4-414c-aef2-9e68aedf961d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Received unexpected event network-vif-plugged-c16d8d18-6610-45c3-8172-54b8b99474ae for instance with vm_state deleted and task_state None.
Jan 21 23:47:17 compute-0 podman[214122]: 2026-01-21 23:47:17.719180006 +0000 UTC m=+0.073022757 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:47:17 compute-0 podman[214121]: 2026-01-21 23:47:17.763771132 +0000 UTC m=+0.123371671 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 21 23:47:17 compute-0 nova_compute[182935]: 2026-01-21 23:47:17.816 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:18 compute-0 nova_compute[182935]: 2026-01-21 23:47:18.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:18 compute-0 nova_compute[182935]: 2026-01-21 23:47:18.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:47:18 compute-0 nova_compute[182935]: 2026-01-21 23:47:18.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:47:19 compute-0 nova_compute[182935]: 2026-01-21 23:47:19.110 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:19 compute-0 nova_compute[182935]: 2026-01-21 23:47:19.141 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-a6a89006-02c9-49b1-8bfb-8640ba1b495f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:19 compute-0 nova_compute[182935]: 2026-01-21 23:47:19.142 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-a6a89006-02c9-49b1-8bfb-8640ba1b495f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:19 compute-0 nova_compute[182935]: 2026-01-21 23:47:19.142 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:47:19 compute-0 nova_compute[182935]: 2026-01-21 23:47:19.142 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:19 compute-0 nova_compute[182935]: 2026-01-21 23:47:19.753 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:21 compute-0 nova_compute[182935]: 2026-01-21 23:47:21.325 182939 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Creating tmpfile /var/lib/nova/instances/tmp9ku_13i8 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 21 23:47:21 compute-0 nova_compute[182935]: 2026-01-21 23:47:21.326 182939 DEBUG nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ku_13i8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 21 23:47:22 compute-0 ovn_controller[95047]: 2026-01-21T23:47:22Z|00074|binding|INFO|Releasing lport 1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd from this chassis (sb_readonly=0)
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.223 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.326 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Updating instance_info_cache with network_info: [{"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.358 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-a6a89006-02c9-49b1-8bfb-8640ba1b495f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.358 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.359 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.359 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.359 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.359 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.360 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.360 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.380 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.380 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.380 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.380 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.473 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.554 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.556 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.623 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.631 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.690 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.691 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.755 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.936 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.938 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5300MB free_disk=73.31591796875GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.938 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:22 compute-0 nova_compute[182935]: 2026-01-21 23:47:22.938 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.097 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance a6a89006-02c9-49b1-8bfb-8640ba1b495f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.097 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance efc683b9-a8d9-4a67-bb19-aeaabfbd5423 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.127 182939 WARNING nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 5bdecf5d-9113-4584-ac23-44d59770eade has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.127 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.127 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.326 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.351 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.378 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.379 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.440s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.811 182939 DEBUG nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ku_13i8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.856 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.857 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquired lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:23 compute-0 nova_compute[182935]: 2026-01-21 23:47:23.857 182939 DEBUG nova.network.neutron [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:47:24 compute-0 nova_compute[182935]: 2026-01-21 23:47:24.113 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:24 compute-0 nova_compute[182935]: 2026-01-21 23:47:24.374 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:24 compute-0 nova_compute[182935]: 2026-01-21 23:47:24.754 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:24 compute-0 nova_compute[182935]: 2026-01-21 23:47:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:24 compute-0 nova_compute[182935]: 2026-01-21 23:47:24.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:24 compute-0 nova_compute[182935]: 2026-01-21 23:47:24.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 23:47:24 compute-0 nova_compute[182935]: 2026-01-21 23:47:24.820 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 23:47:25 compute-0 sshd-session[214188]: Invalid user tomcat from 188.166.69.60 port 49438
Jan 21 23:47:25 compute-0 podman[214202]: 2026-01-21 23:47:25.090791443 +0000 UTC m=+0.067351991 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:47:25 compute-0 sshd-session[214188]: Connection closed by invalid user tomcat 188.166.69.60 port 49438 [preauth]
Jan 21 23:47:25 compute-0 ovn_controller[95047]: 2026-01-21T23:47:25Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:69:70:45 10.100.0.5
Jan 21 23:47:25 compute-0 ovn_controller[95047]: 2026-01-21T23:47:25Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:69:70:45 10.100.0.5
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.794 182939 DEBUG nova.network.neutron [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating instance_info_cache with network_info: [{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.820 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Releasing lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.835 182939 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ku_13i8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.836 182939 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Creating instance directory: /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.837 182939 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Creating disk.info with the contents: {'/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk': 'qcow2', '/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.837 182939 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.838 182939 DEBUG nova.objects.instance [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5bdecf5d-9113-4584-ac23-44d59770eade obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.865 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.933 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.934 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.935 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:26 compute-0 nova_compute[182935]: 2026-01-21 23:47:26.945 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.002 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.003 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.043 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.044 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.044 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.109 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.110 182939 DEBUG nova.virt.disk.api [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Checking if we can resize image /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.110 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.179 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.180 182939 DEBUG nova.virt.disk.api [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Cannot resize image /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.180 182939 DEBUG nova.objects.instance [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lazy-loading 'migration_context' on Instance uuid 5bdecf5d-9113-4584-ac23-44d59770eade obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.196 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.223 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config 485376" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.225 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config to /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.225 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.739 182939 DEBUG oslo_concurrency.processutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk.config /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.740 182939 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.742 182939 DEBUG nova.virt.libvirt.vif [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-821021372',display_name='tempest-LiveMigrationTest-server-821021372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-821021372',id=21,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-xa5gd7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:17Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=5bdecf5d-9113-4584-ac23-44d59770eade,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.742 182939 DEBUG nova.network.os_vif_util [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converting VIF {"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.744 182939 DEBUG nova.network.os_vif_util [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.744 182939 DEBUG os_vif [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.745 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.746 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.746 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.750 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.750 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf9aa099-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.751 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf9aa099-aa, col_values=(('external_ids', {'iface-id': 'df9aa099-aa41-4111-b46c-c8a593762a53', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:4f:85', 'vm-uuid': '5bdecf5d-9113-4584-ac23-44d59770eade'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.752 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:27 compute-0 NetworkManager[55139]: <info>  [1769039247.7536] manager: (tapdf9aa099-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.755 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.761 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.765 182939 INFO os_vif [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa')
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.765 182939 DEBUG nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 21 23:47:27 compute-0 nova_compute[182935]: 2026-01-21 23:47:27.766 182939 DEBUG nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ku_13i8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 21 23:47:28 compute-0 podman[214248]: 2026-01-21 23:47:28.698063542 +0000 UTC m=+0.064133605 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:47:28 compute-0 nova_compute[182935]: 2026-01-21 23:47:28.933 182939 DEBUG nova.network.neutron [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Port df9aa099-aa41-4111-b46c-c8a593762a53 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 21 23:47:28 compute-0 nova_compute[182935]: 2026-01-21 23:47:28.950 182939 DEBUG nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ku_13i8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 21 23:47:29 compute-0 nova_compute[182935]: 2026-01-21 23:47:29.029 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039234.0285006, b1080912-4a1f-4504-ae59-a0ad89963886 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:29 compute-0 nova_compute[182935]: 2026-01-21 23:47:29.030 182939 INFO nova.compute.manager [-] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] VM Stopped (Lifecycle Event)
Jan 21 23:47:29 compute-0 nova_compute[182935]: 2026-01-21 23:47:29.059 182939 DEBUG nova.compute.manager [None req-ae6f1d63-f31a-45f5-b404-711c288b30c1 - - - - - -] [instance: b1080912-4a1f-4504-ae59-a0ad89963886] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:29 compute-0 kernel: tapdf9aa099-aa: entered promiscuous mode
Jan 21 23:47:29 compute-0 NetworkManager[55139]: <info>  [1769039249.2267] manager: (tapdf9aa099-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Jan 21 23:47:29 compute-0 ovn_controller[95047]: 2026-01-21T23:47:29Z|00075|binding|INFO|Claiming lport df9aa099-aa41-4111-b46c-c8a593762a53 for this additional chassis.
Jan 21 23:47:29 compute-0 ovn_controller[95047]: 2026-01-21T23:47:29Z|00076|binding|INFO|df9aa099-aa41-4111-b46c-c8a593762a53: Claiming fa:16:3e:8f:4f:85 10.100.0.6
Jan 21 23:47:29 compute-0 nova_compute[182935]: 2026-01-21 23:47:29.230 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:29 compute-0 systemd-udevd[214282]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:29 compute-0 NetworkManager[55139]: <info>  [1769039249.2696] device (tapdf9aa099-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:47:29 compute-0 NetworkManager[55139]: <info>  [1769039249.2703] device (tapdf9aa099-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:47:29 compute-0 systemd-machined[154182]: New machine qemu-11-instance-00000015.
Jan 21 23:47:29 compute-0 nova_compute[182935]: 2026-01-21 23:47:29.319 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:29 compute-0 nova_compute[182935]: 2026-01-21 23:47:29.323 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:29 compute-0 ovn_controller[95047]: 2026-01-21T23:47:29Z|00077|binding|INFO|Setting lport df9aa099-aa41-4111-b46c-c8a593762a53 ovn-installed in OVS
Jan 21 23:47:29 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-00000015.
Jan 21 23:47:29 compute-0 nova_compute[182935]: 2026-01-21 23:47:29.324 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:29 compute-0 nova_compute[182935]: 2026-01-21 23:47:29.758 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:30 compute-0 nova_compute[182935]: 2026-01-21 23:47:30.334 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039250.3327045, 5bdecf5d-9113-4584-ac23-44d59770eade => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:30 compute-0 nova_compute[182935]: 2026-01-21 23:47:30.334 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Started (Lifecycle Event)
Jan 21 23:47:30 compute-0 nova_compute[182935]: 2026-01-21 23:47:30.504 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:31 compute-0 nova_compute[182935]: 2026-01-21 23:47:31.927 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039251.9267743, 5bdecf5d-9113-4584-ac23-44d59770eade => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:31 compute-0 nova_compute[182935]: 2026-01-21 23:47:31.928 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Resumed (Lifecycle Event)
Jan 21 23:47:31 compute-0 nova_compute[182935]: 2026-01-21 23:47:31.962 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:31 compute-0 nova_compute[182935]: 2026-01-21 23:47:31.966 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:31 compute-0 nova_compute[182935]: 2026-01-21 23:47:31.996 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 21 23:47:32 compute-0 nova_compute[182935]: 2026-01-21 23:47:32.755 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:33 compute-0 ovn_controller[95047]: 2026-01-21T23:47:33Z|00078|binding|INFO|Claiming lport df9aa099-aa41-4111-b46c-c8a593762a53 for this chassis.
Jan 21 23:47:33 compute-0 ovn_controller[95047]: 2026-01-21T23:47:33Z|00079|binding|INFO|df9aa099-aa41-4111-b46c-c8a593762a53: Claiming fa:16:3e:8f:4f:85 10.100.0.6
Jan 21 23:47:33 compute-0 ovn_controller[95047]: 2026-01-21T23:47:33Z|00080|binding|INFO|Setting lport df9aa099-aa41-4111-b46c-c8a593762a53 up in Southbound
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.150 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:4f:85 10.100.0.6'], port_security=['fa:16:3e:8f:4f:85 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=df9aa099-aa41-4111-b46c-c8a593762a53) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.152 104408 INFO neutron.agent.ovn.metadata.agent [-] Port df9aa099-aa41-4111-b46c-c8a593762a53 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee bound to our chassis
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.155 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.178 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c910cd37-2eef-429b-a064-438123d8e5d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.180 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2df233d-b1 in ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.185 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2df233d-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.185 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[86ceff44-2a35-4ecf-9f61-437605f9dfc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.187 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[175c9383-7b5f-4a4b-bdba-13c410a90544]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.201 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[3706383e-59f3-4e44-af4f-8fc0860c81f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.223 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[844e53cf-f0b9-4008-9692-c479d013b31a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.254 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[1205467b-023f-4d17-bb16-2dc1a32177d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.260 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[75bec328-ff18-4147-bc29-e3e7b2f9c75a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 NetworkManager[55139]: <info>  [1769039253.2619] manager: (tapb2df233d-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/44)
Jan 21 23:47:33 compute-0 systemd-udevd[214314]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.296 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[99728ca4-4673-417d-ac26-8b23d3887aa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.302 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[19da1b2a-5e59-437c-84c3-5e26291b18c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 NetworkManager[55139]: <info>  [1769039253.3336] device (tapb2df233d-b0): carrier: link connected
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.337 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[544d5c30-016b-4cc1-a7a6-c0d21f5ae351]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.358 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f36bcd-ce49-4287-be2f-b306a2057261]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2df233d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:e6:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375607, 'reachable_time': 21594, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214333, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.382 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[910f28a0-04e2-4735-9401-7735d1739693]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:e636'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375607, 'tstamp': 375607}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214334, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.398 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3f9005da-6696-4f7f-a893-a19cc9adddb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2df233d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:e6:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375607, 'reachable_time': 21594, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214335, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.442 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7e6dc51c-e466-4ac4-8553-00a72ea25b71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.518 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[634561c0-c8c9-4821-8790-857625656b48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.520 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2df233d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.520 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.520 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2df233d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:33 compute-0 NetworkManager[55139]: <info>  [1769039253.5236] manager: (tapb2df233d-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 21 23:47:33 compute-0 nova_compute[182935]: 2026-01-21 23:47:33.523 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:33 compute-0 kernel: tapb2df233d-b0: entered promiscuous mode
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.527 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2df233d-b0, col_values=(('external_ids', {'iface-id': '75454af0-da31-4238-b248-a6678c575f51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:33 compute-0 nova_compute[182935]: 2026-01-21 23:47:33.529 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:33 compute-0 ovn_controller[95047]: 2026-01-21T23:47:33Z|00081|binding|INFO|Releasing lport 75454af0-da31-4238-b248-a6678c575f51 from this chassis (sb_readonly=0)
Jan 21 23:47:33 compute-0 nova_compute[182935]: 2026-01-21 23:47:33.530 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.530 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.531 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[404a7b18-3b02-44b4-9196-5b29b8544c9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.532 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:47:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:33.533 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'env', 'PROCESS_TAG=haproxy-b2df233d-b255-4dda-925c-3ccab3a032ee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2df233d-b255-4dda-925c-3ccab3a032ee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:47:33 compute-0 nova_compute[182935]: 2026-01-21 23:47:33.541 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:33 compute-0 nova_compute[182935]: 2026-01-21 23:47:33.667 182939 INFO nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Post operation of migration started
Jan 21 23:47:34 compute-0 podman[214368]: 2026-01-21 23:47:33.940628391 +0000 UTC m=+0.026593717 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:47:34 compute-0 podman[214368]: 2026-01-21 23:47:34.039771522 +0000 UTC m=+0.125736828 container create c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:47:34 compute-0 systemd[1]: Started libpod-conmon-c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b.scope.
Jan 21 23:47:34 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:47:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14d01be49af245df849a37f6abc28f95ab740675bbc6f7a9463cb000671d1772/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:47:34 compute-0 podman[214368]: 2026-01-21 23:47:34.151022351 +0000 UTC m=+0.236987677 container init c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 21 23:47:34 compute-0 podman[214368]: 2026-01-21 23:47:34.163168021 +0000 UTC m=+0.249133377 container start c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:47:34 compute-0 nova_compute[182935]: 2026-01-21 23:47:34.167 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:34 compute-0 nova_compute[182935]: 2026-01-21 23:47:34.168 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquired lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:34 compute-0 nova_compute[182935]: 2026-01-21 23:47:34.169 182939 DEBUG nova.network.neutron [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:47:34 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214383]: [NOTICE]   (214387) : New worker (214389) forked
Jan 21 23:47:34 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214383]: [NOTICE]   (214387) : Loading success.
Jan 21 23:47:34 compute-0 nova_compute[182935]: 2026-01-21 23:47:34.759 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:35 compute-0 nova_compute[182935]: 2026-01-21 23:47:35.822 182939 DEBUG nova.network.neutron [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating instance_info_cache with network_info: [{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:35 compute-0 nova_compute[182935]: 2026-01-21 23:47:35.849 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Releasing lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:35 compute-0 nova_compute[182935]: 2026-01-21 23:47:35.883 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:35 compute-0 nova_compute[182935]: 2026-01-21 23:47:35.884 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:35 compute-0 nova_compute[182935]: 2026-01-21 23:47:35.884 182939 DEBUG oslo_concurrency.lockutils [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:35 compute-0 nova_compute[182935]: 2026-01-21 23:47:35.891 182939 INFO nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 21 23:47:35 compute-0 virtqemud[182477]: Domain id=11 name='instance-00000015' uuid=5bdecf5d-9113-4584-ac23-44d59770eade is tainted: custom-monitor
Jan 21 23:47:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:36.175 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:36 compute-0 nova_compute[182935]: 2026-01-21 23:47:36.175 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:36.178 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:47:36 compute-0 nova_compute[182935]: 2026-01-21 23:47:36.903 182939 INFO nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 21 23:47:37 compute-0 nova_compute[182935]: 2026-01-21 23:47:37.758 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:37 compute-0 podman[214398]: 2026-01-21 23:47:37.773484431 +0000 UTC m=+0.117443719 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, architecture=x86_64)
Jan 21 23:47:37 compute-0 podman[214399]: 2026-01-21 23:47:37.798907039 +0000 UTC m=+0.136512395 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Jan 21 23:47:37 compute-0 nova_compute[182935]: 2026-01-21 23:47:37.911 182939 INFO nova.virt.libvirt.driver [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 21 23:47:37 compute-0 nova_compute[182935]: 2026-01-21 23:47:37.917 182939 DEBUG nova.compute.manager [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:37 compute-0 nova_compute[182935]: 2026-01-21 23:47:37.945 182939 DEBUG nova.objects.instance [None req-c15d4748-f712-4aac-9314-66c8c072b909 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:47:38 compute-0 nova_compute[182935]: 2026-01-21 23:47:38.293 182939 INFO nova.compute.manager [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Rebuilding instance
Jan 21 23:47:38 compute-0 nova_compute[182935]: 2026-01-21 23:47:38.585 182939 DEBUG nova.compute.manager [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:38 compute-0 nova_compute[182935]: 2026-01-21 23:47:38.658 182939 DEBUG nova.objects.instance [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'pci_requests' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:38 compute-0 nova_compute[182935]: 2026-01-21 23:47:38.675 182939 DEBUG nova.objects.instance [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'pci_devices' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:38 compute-0 nova_compute[182935]: 2026-01-21 23:47:38.688 182939 DEBUG nova.objects.instance [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'resources' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:38 compute-0 nova_compute[182935]: 2026-01-21 23:47:38.700 182939 DEBUG nova.objects.instance [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'migration_context' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:38 compute-0 nova_compute[182935]: 2026-01-21 23:47:38.718 182939 DEBUG nova.objects.instance [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:47:38 compute-0 nova_compute[182935]: 2026-01-21 23:47:38.723 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:47:39 compute-0 nova_compute[182935]: 2026-01-21 23:47:39.762 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.091 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.123 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Triggering sync for uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.123 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Triggering sync for uuid efc683b9-a8d9-4a67-bb19-aeaabfbd5423 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.124 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Triggering sync for uuid 5bdecf5d-9113-4584-ac23-44d59770eade _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.124 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.125 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.125 182939 INFO nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] During sync_power_state the instance has a pending task (rebuilding). Skip.
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.126 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.126 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.127 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.127 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.128 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "5bdecf5d-9113-4584-ac23-44d59770eade" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.164 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:40 compute-0 nova_compute[182935]: 2026-01-21 23:47:40.166 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "5bdecf5d-9113-4584-ac23-44d59770eade" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:41 compute-0 kernel: tap25b6ea25-2c (unregistering): left promiscuous mode
Jan 21 23:47:41 compute-0 NetworkManager[55139]: <info>  [1769039261.0099] device (tap25b6ea25-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:47:41 compute-0 ovn_controller[95047]: 2026-01-21T23:47:41Z|00082|binding|INFO|Releasing lport 25b6ea25-2c24-4a07-9772-28913505aec2 from this chassis (sb_readonly=0)
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.018 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:41 compute-0 ovn_controller[95047]: 2026-01-21T23:47:41Z|00083|binding|INFO|Setting lport 25b6ea25-2c24-4a07-9772-28913505aec2 down in Southbound
Jan 21 23:47:41 compute-0 ovn_controller[95047]: 2026-01-21T23:47:41Z|00084|binding|INFO|Removing iface tap25b6ea25-2c ovn-installed in OVS
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.020 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.028 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:1c:84 10.100.0.8'], port_security=['fa:16:3e:0d:1c:84 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a6a89006-02c9-49b1-8bfb-8640ba1b495f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=25b6ea25-2c24-4a07-9772-28913505aec2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.029 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 25b6ea25-2c24-4a07-9772-28913505aec2 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 unbound from our chassis
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.031 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.036 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.048 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a7c99afd-1146-4ed7-b122-3d04f5f6b8b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:41 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 21 23:47:41 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000012.scope: Consumed 17.323s CPU time.
Jan 21 23:47:41 compute-0 systemd-machined[154182]: Machine qemu-8-instance-00000012 terminated.
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.091 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8e176380-b9c2-4e65-a5db-e234b123db5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.095 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[eb049867-a8f9-423f-a604-d21e6893cb83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.139 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[774f528a-abd2-48bb-911d-0d55c733361d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.170 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[73112e7e-a23f-42b0-ae01-5b5ac14b320d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371486, 'reachable_time': 18847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214460, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.195 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e1110c08-3b52-41f5-b42f-8eab387a7de5]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371506, 'tstamp': 371506}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214461, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371510, 'tstamp': 371510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214461, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.198 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.200 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.206 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.208 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1530a22a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.209 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.210 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1530a22a-f0, col_values=(('external_ids', {'iface-id': '1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:41.211 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.746 182939 INFO nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance shutdown successfully after 3 seconds.
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.753 182939 INFO nova.virt.libvirt.driver [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance destroyed successfully.
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.760 182939 INFO nova.virt.libvirt.driver [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance destroyed successfully.
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.762 182939 DEBUG nova.virt.libvirt.vif [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:46:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-168641085',display_name='tempest-ServersAdminTestJSON-server-168641085',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-168641085',id=18,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:46:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-evbqme7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=<?>,task_s
tate='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:37Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=a6a89006-02c9-49b1-8bfb-8640ba1b495f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.763 182939 DEBUG nova.network.os_vif_util [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.764 182939 DEBUG nova.network.os_vif_util [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.765 182939 DEBUG os_vif [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.768 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.769 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25b6ea25-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.775 182939 DEBUG nova.compute.manager [req-e3461d57-a582-4ada-b913-72b2cbb1def5 req-74d75ec7-0142-4a58-84b6-0ae312d6de8f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-unplugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.775 182939 DEBUG oslo_concurrency.lockutils [req-e3461d57-a582-4ada-b913-72b2cbb1def5 req-74d75ec7-0142-4a58-84b6-0ae312d6de8f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.775 182939 DEBUG oslo_concurrency.lockutils [req-e3461d57-a582-4ada-b913-72b2cbb1def5 req-74d75ec7-0142-4a58-84b6-0ae312d6de8f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.775 182939 DEBUG oslo_concurrency.lockutils [req-e3461d57-a582-4ada-b913-72b2cbb1def5 req-74d75ec7-0142-4a58-84b6-0ae312d6de8f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.776 182939 DEBUG nova.compute.manager [req-e3461d57-a582-4ada-b913-72b2cbb1def5 req-74d75ec7-0142-4a58-84b6-0ae312d6de8f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] No waiting events found dispatching network-vif-unplugged-25b6ea25-2c24-4a07-9772-28913505aec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.776 182939 WARNING nova.compute.manager [req-e3461d57-a582-4ada-b913-72b2cbb1def5 req-74d75ec7-0142-4a58-84b6-0ae312d6de8f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received unexpected event network-vif-unplugged-25b6ea25-2c24-4a07-9772-28913505aec2 for instance with vm_state error and task_state rebuilding.
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.776 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.777 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.781 182939 INFO os_vif [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c')
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.781 182939 INFO nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Deleting instance files /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f_del
Jan 21 23:47:41 compute-0 nova_compute[182935]: 2026-01-21 23:47:41.782 182939 INFO nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Deletion of /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f_del complete
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.037 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.038 182939 INFO nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Creating image(s)
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.039 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.039 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.040 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.055 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.140 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.141 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.142 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.153 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:42 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:42.181 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.219 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.220 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.257 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.259 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.260 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.333 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.334 182939 DEBUG nova.virt.disk.api [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Checking if we can resize image /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.335 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.431 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.433 182939 DEBUG nova.virt.disk.api [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Cannot resize image /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.433 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.434 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Ensure instance console log exists: /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.434 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.434 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.435 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.437 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Start _get_guest_xml network_info=[{"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.442 182939 WARNING nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.452 182939 DEBUG nova.virt.libvirt.host [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.453 182939 DEBUG nova.virt.libvirt.host [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.456 182939 DEBUG nova.virt.libvirt.host [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.457 182939 DEBUG nova.virt.libvirt.host [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.459 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.459 182939 DEBUG nova.virt.hardware [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.460 182939 DEBUG nova.virt.hardware [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.460 182939 DEBUG nova.virt.hardware [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.460 182939 DEBUG nova.virt.hardware [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.460 182939 DEBUG nova.virt.hardware [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.460 182939 DEBUG nova.virt.hardware [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.461 182939 DEBUG nova.virt.hardware [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.461 182939 DEBUG nova.virt.hardware [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.461 182939 DEBUG nova.virt.hardware [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.461 182939 DEBUG nova.virt.hardware [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.462 182939 DEBUG nova.virt.hardware [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.462 182939 DEBUG nova.objects.instance [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'vcpu_model' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.483 182939 DEBUG nova.virt.libvirt.vif [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:46:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-168641085',display_name='tempest-ServersAdminTestJSON-server-168641085',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-168641085',id=18,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:46:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-evbqme7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJS
ON-1815099341-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:41Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=a6a89006-02c9-49b1-8bfb-8640ba1b495f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.484 182939 DEBUG nova.network.os_vif_util [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.485 182939 DEBUG nova.network.os_vif_util [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.486 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:47:42 compute-0 nova_compute[182935]:   <uuid>a6a89006-02c9-49b1-8bfb-8640ba1b495f</uuid>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   <name>instance-00000012</name>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersAdminTestJSON-server-168641085</nova:name>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:47:42</nova:creationTime>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:47:42 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:47:42 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:47:42 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:47:42 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:47:42 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:47:42 compute-0 nova_compute[182935]:         <nova:user uuid="4a6034ff39094b6486bac680b7ed5a57">tempest-ServersAdminTestJSON-1815099341-project-member</nova:user>
Jan 21 23:47:42 compute-0 nova_compute[182935]:         <nova:project uuid="4d40fc03fb534b5689415f3d8a3de1fc">tempest-ServersAdminTestJSON-1815099341</nova:project>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="3e1dda74-3c6a-4d29-8792-32134d1c36c5"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:47:42 compute-0 nova_compute[182935]:         <nova:port uuid="25b6ea25-2c24-4a07-9772-28913505aec2">
Jan 21 23:47:42 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <system>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <entry name="serial">a6a89006-02c9-49b1-8bfb-8640ba1b495f</entry>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <entry name="uuid">a6a89006-02c9-49b1-8bfb-8640ba1b495f</entry>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     </system>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   <os>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   </os>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   <features>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   </features>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:0d:1c:84"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <target dev="tap25b6ea25-2c"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/console.log" append="off"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <video>
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     </video>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:47:42 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:47:42 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:47:42 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:47:42 compute-0 nova_compute[182935]: </domain>
Jan 21 23:47:42 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.489 182939 DEBUG nova.virt.libvirt.vif [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:46:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-168641085',display_name='tempest-ServersAdminTestJSON-server-168641085',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-168641085',id=18,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:46:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-evbqme7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:41Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=a6a89006-02c9-49b1-8bfb-8640ba1b495f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.490 182939 DEBUG nova.network.os_vif_util [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.492 182939 DEBUG nova.network.os_vif_util [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.493 182939 DEBUG os_vif [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.494 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.495 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.496 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.500 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.501 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25b6ea25-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.502 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap25b6ea25-2c, col_values=(('external_ids', {'iface-id': '25b6ea25-2c24-4a07-9772-28913505aec2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:1c:84', 'vm-uuid': 'a6a89006-02c9-49b1-8bfb-8640ba1b495f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.504 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:42 compute-0 NetworkManager[55139]: <info>  [1769039262.5050] manager: (tap25b6ea25-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.506 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.514 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.516 182939 INFO os_vif [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c')
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.568 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.569 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.569 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No VIF found with MAC fa:16:3e:0d:1c:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.570 182939 INFO nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Using config drive
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.594 182939 DEBUG nova.objects.instance [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'ec2_ids' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:42 compute-0 nova_compute[182935]: 2026-01-21 23:47:42.627 182939 DEBUG nova.objects.instance [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'keypairs' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.171 182939 INFO nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Creating config drive at /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.178 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppwoo2lur execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.319 182939 DEBUG oslo_concurrency.processutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppwoo2lur" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:43 compute-0 kernel: tap25b6ea25-2c: entered promiscuous mode
Jan 21 23:47:43 compute-0 NetworkManager[55139]: <info>  [1769039263.3878] manager: (tap25b6ea25-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.387 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:43 compute-0 ovn_controller[95047]: 2026-01-21T23:47:43Z|00085|binding|INFO|Claiming lport 25b6ea25-2c24-4a07-9772-28913505aec2 for this chassis.
Jan 21 23:47:43 compute-0 ovn_controller[95047]: 2026-01-21T23:47:43Z|00086|binding|INFO|25b6ea25-2c24-4a07-9772-28913505aec2: Claiming fa:16:3e:0d:1c:84 10.100.0.8
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.391 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:43 compute-0 systemd-udevd[214451]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.399 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:1c:84 10.100.0.8'], port_security=['fa:16:3e:0d:1c:84 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a6a89006-02c9-49b1-8bfb-8640ba1b495f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '5', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=25b6ea25-2c24-4a07-9772-28913505aec2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.400 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 25b6ea25-2c24-4a07-9772-28913505aec2 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 bound to our chassis
Jan 21 23:47:43 compute-0 ovn_controller[95047]: 2026-01-21T23:47:43Z|00087|binding|INFO|Setting lport 25b6ea25-2c24-4a07-9772-28913505aec2 ovn-installed in OVS
Jan 21 23:47:43 compute-0 ovn_controller[95047]: 2026-01-21T23:47:43Z|00088|binding|INFO|Setting lport 25b6ea25-2c24-4a07-9772-28913505aec2 up in Southbound
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.401 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.402 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:43 compute-0 ovn_controller[95047]: 2026-01-21T23:47:43Z|00089|binding|INFO|Releasing lport 75454af0-da31-4238-b248-a6678c575f51 from this chassis (sb_readonly=0)
Jan 21 23:47:43 compute-0 ovn_controller[95047]: 2026-01-21T23:47:43Z|00090|binding|INFO|Releasing lport 1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd from this chassis (sb_readonly=0)
Jan 21 23:47:43 compute-0 NetworkManager[55139]: <info>  [1769039263.4183] device (tap25b6ea25-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:47:43 compute-0 NetworkManager[55139]: <info>  [1769039263.4200] device (tap25b6ea25-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.427 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6956f6e9-d7cb-4e94-88ad-d54fce84c369]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:43 compute-0 systemd-machined[154182]: New machine qemu-12-instance-00000012.
Jan 21 23:47:43 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000012.
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.468 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f54946b1-690f-4e0d-987f-0ac46935429b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.472 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b19684fa-4421-4e4e-bbc1-4089d4058a4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.503 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[2de4be8c-0209-4c06-89db-114dbe722809]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.525 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b38cb475-8b13-4dd7-961b-f43bffbb9424]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371486, 'reachable_time': 18847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214528, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.550 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[646d4b85-4878-4d92-a5ee-d810cae5dda9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371506, 'tstamp': 371506}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214530, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371510, 'tstamp': 371510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214530, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.552 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.554 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.555 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.555 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1530a22a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.555 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.556 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1530a22a-f0, col_values=(('external_ids', {'iface-id': '1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:43.556 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.860 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for a6a89006-02c9-49b1-8bfb-8640ba1b495f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.861 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039263.8601944, a6a89006-02c9-49b1-8bfb-8640ba1b495f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.862 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] VM Resumed (Lifecycle Event)
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.865 182939 DEBUG nova.compute.manager [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.865 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.870 182939 INFO nova.virt.libvirt.driver [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance spawned successfully.
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.870 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.885 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.893 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.898 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.899 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.900 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.900 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.901 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.901 182939 DEBUG nova.virt.libvirt.driver [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.929 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.930 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039263.8648481, a6a89006-02c9-49b1-8bfb-8640ba1b495f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.930 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] VM Started (Lifecycle Event)
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.969 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:43 compute-0 nova_compute[182935]: 2026-01-21 23:47:43.974 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.005 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.008 182939 DEBUG nova.compute.manager [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.008 182939 DEBUG oslo_concurrency.lockutils [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.008 182939 DEBUG oslo_concurrency.lockutils [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.009 182939 DEBUG oslo_concurrency.lockutils [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.009 182939 DEBUG nova.compute.manager [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] No waiting events found dispatching network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.009 182939 WARNING nova.compute.manager [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received unexpected event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 for instance with vm_state error and task_state rebuild_spawning.
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.009 182939 DEBUG nova.compute.manager [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.010 182939 DEBUG oslo_concurrency.lockutils [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.010 182939 DEBUG oslo_concurrency.lockutils [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.010 182939 DEBUG oslo_concurrency.lockutils [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.010 182939 DEBUG nova.compute.manager [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] No waiting events found dispatching network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.011 182939 WARNING nova.compute.manager [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received unexpected event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 for instance with vm_state error and task_state rebuild_spawning.
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.011 182939 DEBUG nova.compute.manager [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.011 182939 DEBUG oslo_concurrency.lockutils [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.011 182939 DEBUG oslo_concurrency.lockutils [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.012 182939 DEBUG oslo_concurrency.lockutils [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.012 182939 DEBUG nova.compute.manager [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] No waiting events found dispatching network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.012 182939 WARNING nova.compute.manager [req-ada4081c-fa40-4062-b4da-754a583a7b46 req-06404ff8-ae05-4c3c-beac-3eb03cef248b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received unexpected event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 for instance with vm_state error and task_state rebuild_spawning.
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.018 182939 DEBUG nova.compute.manager [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.117 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.118 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.118 182939 DEBUG nova.objects.instance [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.202 182939 DEBUG oslo_concurrency.lockutils [None req-bb893aa6-3f9b-4411-8a83-3eed3af7350e 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.205 182939 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Check if temp file /var/lib/nova/instances/tmp34j3k16n exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.210 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.304 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.305 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.401 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.403 182939 DEBUG nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp34j3k16n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 21 23:47:44 compute-0 nova_compute[182935]: 2026-01-21 23:47:44.765 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:45 compute-0 nova_compute[182935]: 2026-01-21 23:47:45.556 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:45 compute-0 nova_compute[182935]: 2026-01-21 23:47:45.628 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:45 compute-0 nova_compute[182935]: 2026-01-21 23:47:45.630 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:47:45 compute-0 nova_compute[182935]: 2026-01-21 23:47:45.690 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:47:47 compute-0 nova_compute[182935]: 2026-01-21 23:47:47.483 182939 INFO nova.compute.manager [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Rebuilding instance
Jan 21 23:47:47 compute-0 nova_compute[182935]: 2026-01-21 23:47:47.506 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:47 compute-0 nova_compute[182935]: 2026-01-21 23:47:47.794 182939 DEBUG nova.compute.manager [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:47 compute-0 nova_compute[182935]: 2026-01-21 23:47:47.904 182939 DEBUG nova.objects.instance [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'pci_requests' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:47 compute-0 nova_compute[182935]: 2026-01-21 23:47:47.920 182939 DEBUG nova.objects.instance [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'pci_devices' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:47 compute-0 nova_compute[182935]: 2026-01-21 23:47:47.934 182939 DEBUG nova.objects.instance [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'resources' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:47 compute-0 nova_compute[182935]: 2026-01-21 23:47:47.947 182939 DEBUG nova.objects.instance [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'migration_context' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:47 compute-0 nova_compute[182935]: 2026-01-21 23:47:47.959 182939 DEBUG nova.objects.instance [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:47:47 compute-0 nova_compute[182935]: 2026-01-21 23:47:47.963 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:47:48 compute-0 sshd-session[214550]: Accepted publickey for nova from 192.168.122.101 port 54434 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:47:48 compute-0 systemd-logind[784]: New session 28 of user nova.
Jan 21 23:47:48 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 23:47:48 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 23:47:48 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 23:47:48 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 23:47:48 compute-0 systemd[214579]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:47:48 compute-0 podman[214554]: 2026-01-21 23:47:48.680168169 +0000 UTC m=+0.107103651 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:47:48 compute-0 podman[214552]: 2026-01-21 23:47:48.700616798 +0000 UTC m=+0.135591923 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 21 23:47:48 compute-0 systemd[214579]: Queued start job for default target Main User Target.
Jan 21 23:47:48 compute-0 systemd[214579]: Created slice User Application Slice.
Jan 21 23:47:48 compute-0 systemd[214579]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:47:48 compute-0 systemd[214579]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:47:48 compute-0 systemd[214579]: Reached target Paths.
Jan 21 23:47:48 compute-0 systemd[214579]: Reached target Timers.
Jan 21 23:47:48 compute-0 systemd[214579]: Starting D-Bus User Message Bus Socket...
Jan 21 23:47:48 compute-0 systemd[214579]: Starting Create User's Volatile Files and Directories...
Jan 21 23:47:48 compute-0 systemd[214579]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:47:48 compute-0 systemd[214579]: Reached target Sockets.
Jan 21 23:47:48 compute-0 systemd[214579]: Finished Create User's Volatile Files and Directories.
Jan 21 23:47:48 compute-0 systemd[214579]: Reached target Basic System.
Jan 21 23:47:48 compute-0 systemd[214579]: Reached target Main User Target.
Jan 21 23:47:48 compute-0 systemd[214579]: Startup finished in 154ms.
Jan 21 23:47:48 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 23:47:48 compute-0 systemd[1]: Started Session 28 of User nova.
Jan 21 23:47:48 compute-0 sshd-session[214550]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:47:48 compute-0 sshd-session[214617]: Received disconnect from 192.168.122.101 port 54434:11: disconnected by user
Jan 21 23:47:48 compute-0 sshd-session[214617]: Disconnected from user nova 192.168.122.101 port 54434
Jan 21 23:47:48 compute-0 sshd-session[214550]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:47:48 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Jan 21 23:47:48 compute-0 systemd-logind[784]: Session 28 logged out. Waiting for processes to exit.
Jan 21 23:47:48 compute-0 systemd-logind[784]: Removed session 28.
Jan 21 23:47:49 compute-0 nova_compute[182935]: 2026-01-21 23:47:49.768 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:50 compute-0 nova_compute[182935]: 2026-01-21 23:47:50.228 182939 DEBUG nova.compute.manager [req-7f978936-c6ac-48ef-823c-8cea496659ea req-9c026b7b-cf8f-4253-83e2-7818bbbe753f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:50 compute-0 nova_compute[182935]: 2026-01-21 23:47:50.229 182939 DEBUG oslo_concurrency.lockutils [req-7f978936-c6ac-48ef-823c-8cea496659ea req-9c026b7b-cf8f-4253-83e2-7818bbbe753f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:50 compute-0 nova_compute[182935]: 2026-01-21 23:47:50.229 182939 DEBUG oslo_concurrency.lockutils [req-7f978936-c6ac-48ef-823c-8cea496659ea req-9c026b7b-cf8f-4253-83e2-7818bbbe753f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:50 compute-0 nova_compute[182935]: 2026-01-21 23:47:50.230 182939 DEBUG oslo_concurrency.lockutils [req-7f978936-c6ac-48ef-823c-8cea496659ea req-9c026b7b-cf8f-4253-83e2-7818bbbe753f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:50 compute-0 nova_compute[182935]: 2026-01-21 23:47:50.230 182939 DEBUG nova.compute.manager [req-7f978936-c6ac-48ef-823c-8cea496659ea req-9c026b7b-cf8f-4253-83e2-7818bbbe753f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:50 compute-0 nova_compute[182935]: 2026-01-21 23:47:50.230 182939 DEBUG nova.compute.manager [req-7f978936-c6ac-48ef-823c-8cea496659ea req-9c026b7b-cf8f-4253-83e2-7818bbbe753f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.215 182939 INFO nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Took 5.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.217 182939 DEBUG nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.234 182939 DEBUG nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp34j3k16n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5bdecf5d-9113-4584-ac23-44d59770eade',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(a95800ca-3e2b-439a-bf8d-a9372550d7ea),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.255 182939 DEBUG nova.objects.instance [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lazy-loading 'migration_context' on Instance uuid 5bdecf5d-9113-4584-ac23-44d59770eade obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.257 182939 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.259 182939 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.260 182939 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.283 182939 DEBUG nova.virt.libvirt.vif [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-821021372',display_name='tempest-LiveMigrationTest-server-821021372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-821021372',id=21,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-xa5gd7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:47:38Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=5bdecf5d-9113-4584-ac23-44d59770eade,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.284 182939 DEBUG nova.network.os_vif_util [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converting VIF {"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.285 182939 DEBUG nova.network.os_vif_util [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.286 182939 DEBUG nova.virt.libvirt.migration [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating guest XML with vif config: <interface type="ethernet">
Jan 21 23:47:51 compute-0 nova_compute[182935]:   <mac address="fa:16:3e:8f:4f:85"/>
Jan 21 23:47:51 compute-0 nova_compute[182935]:   <model type="virtio"/>
Jan 21 23:47:51 compute-0 nova_compute[182935]:   <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:47:51 compute-0 nova_compute[182935]:   <mtu size="1442"/>
Jan 21 23:47:51 compute-0 nova_compute[182935]:   <target dev="tapdf9aa099-aa"/>
Jan 21 23:47:51 compute-0 nova_compute[182935]: </interface>
Jan 21 23:47:51 compute-0 nova_compute[182935]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.287 182939 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.763 182939 DEBUG nova.virt.libvirt.migration [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.764 182939 INFO nova.virt.libvirt.migration [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 21 23:47:51 compute-0 nova_compute[182935]: 2026-01-21 23:47:51.854 182939 INFO nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.338 182939 DEBUG nova.compute.manager [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.339 182939 DEBUG oslo_concurrency.lockutils [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.339 182939 DEBUG oslo_concurrency.lockutils [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.340 182939 DEBUG oslo_concurrency.lockutils [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.340 182939 DEBUG nova.compute.manager [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.340 182939 WARNING nova.compute.manager [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received unexpected event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with vm_state active and task_state migrating.
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.341 182939 DEBUG nova.compute.manager [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-changed-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.341 182939 DEBUG nova.compute.manager [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Refreshing instance network info cache due to event network-changed-df9aa099-aa41-4111-b46c-c8a593762a53. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.341 182939 DEBUG oslo_concurrency.lockutils [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.341 182939 DEBUG oslo_concurrency.lockutils [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.342 182939 DEBUG nova.network.neutron [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Refreshing network info cache for port df9aa099-aa41-4111-b46c-c8a593762a53 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.436 182939 DEBUG nova.virt.libvirt.migration [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.437 182939 DEBUG nova.virt.libvirt.migration [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.510 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.941 182939 DEBUG nova.virt.libvirt.migration [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 21 23:47:52 compute-0 nova_compute[182935]: 2026-01-21 23:47:52.942 182939 DEBUG nova.virt.libvirt.migration [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.009 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039273.0091803, 5bdecf5d-9113-4584-ac23-44d59770eade => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.010 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Paused (Lifecycle Event)
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.030 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.034 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.062 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 21 23:47:53 compute-0 kernel: tapdf9aa099-aa (unregistering): left promiscuous mode
Jan 21 23:47:53 compute-0 NetworkManager[55139]: <info>  [1769039273.1923] device (tapdf9aa099-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:47:53 compute-0 ovn_controller[95047]: 2026-01-21T23:47:53Z|00091|binding|INFO|Releasing lport df9aa099-aa41-4111-b46c-c8a593762a53 from this chassis (sb_readonly=0)
Jan 21 23:47:53 compute-0 ovn_controller[95047]: 2026-01-21T23:47:53Z|00092|binding|INFO|Setting lport df9aa099-aa41-4111-b46c-c8a593762a53 down in Southbound
Jan 21 23:47:53 compute-0 ovn_controller[95047]: 2026-01-21T23:47:53Z|00093|binding|INFO|Removing iface tapdf9aa099-aa ovn-installed in OVS
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.221 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.226 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.233 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.235 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:4f:85 10.100.0.6'], port_security=['fa:16:3e:8f:4f:85 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '74526b6d-b1ca-423f-9094-b845f8b97526'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5bdecf5d-9113-4584-ac23-44d59770eade', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '18', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=df9aa099-aa41-4111-b46c-c8a593762a53) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.240 104408 INFO neutron.agent.ovn.metadata.agent [-] Port df9aa099-aa41-4111-b46c-c8a593762a53 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee unbound from our chassis
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.242 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2df233d-b255-4dda-925c-3ccab3a032ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.246 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[19a1ace4-5ed2-4b27-8006-56a221056c97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.247 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee namespace which is not needed anymore
Jan 21 23:47:53 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 21 23:47:53 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000015.scope: Consumed 3.722s CPU time.
Jan 21 23:47:53 compute-0 systemd-machined[154182]: Machine qemu-11-instance-00000015 terminated.
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.444 182939 DEBUG nova.virt.libvirt.guest [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.447 182939 INFO nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migration operation has completed
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.448 182939 INFO nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] _post_live_migration() is started..
Jan 21 23:47:53 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214383]: [NOTICE]   (214387) : haproxy version is 2.8.14-c23fe91
Jan 21 23:47:53 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214383]: [NOTICE]   (214387) : path to executable is /usr/sbin/haproxy
Jan 21 23:47:53 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214383]: [WARNING]  (214387) : Exiting Master process...
Jan 21 23:47:53 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214383]: [ALERT]    (214387) : Current worker (214389) exited with code 143 (Terminated)
Jan 21 23:47:53 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[214383]: [WARNING]  (214387) : All workers exited. Exiting... (0)
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.455 182939 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 21 23:47:53 compute-0 systemd[1]: libpod-c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b.scope: Deactivated successfully.
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.455 182939 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.456 182939 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 21 23:47:53 compute-0 podman[214654]: 2026-01-21 23:47:53.470549548 +0000 UTC m=+0.091122129 container died c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 21 23:47:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-14d01be49af245df849a37f6abc28f95ab740675bbc6f7a9463cb000671d1772-merged.mount: Deactivated successfully.
Jan 21 23:47:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b-userdata-shm.mount: Deactivated successfully.
Jan 21 23:47:53 compute-0 podman[214654]: 2026-01-21 23:47:53.508971446 +0000 UTC m=+0.129544027 container cleanup c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 21 23:47:53 compute-0 systemd[1]: libpod-conmon-c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b.scope: Deactivated successfully.
Jan 21 23:47:53 compute-0 podman[214701]: 2026-01-21 23:47:53.587485193 +0000 UTC m=+0.047974838 container remove c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.595 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ca4c19fb-1f0b-4557-af54-bdb2c54fb3f6]: (4, ('Wed Jan 21 11:47:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee (c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b)\nc089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b\nWed Jan 21 11:47:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee (c089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b)\nc089aeff25a606d063bb8d4b63ef4d7be1f1ad59e5111abd82ac2376d8b6568b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.598 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a69a85-8af1-4652-8ccd-2cc25c97da47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.599 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2df233d-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.602 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:53 compute-0 kernel: tapb2df233d-b0: left promiscuous mode
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.620 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.623 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4de96185-cf4a-4f0a-8fe9-07b2a858a1f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.643 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2953465d-de0d-4284-9104-d4e3a6652bc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.644 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a00efbe6-9566-4a41-9b04-4cf661695e20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.665 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fd2463b7-f997-4ca4-a9f4-93b90089d6c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375598, 'reachable_time': 24834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214719, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:53 compute-0 systemd[1]: run-netns-ovnmeta\x2db2df233d\x2db255\x2d4dda\x2d925c\x2d3ccab3a032ee.mount: Deactivated successfully.
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.673 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:47:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:47:53.674 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd4c6bf-cf71-481b-b4e7-e372ca6a32b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.963 182939 DEBUG nova.network.neutron [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updated VIF entry in instance network info cache for port df9aa099-aa41-4111-b46c-c8a593762a53. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.964 182939 DEBUG nova.network.neutron [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Updating instance_info_cache with network_info: [{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:47:53 compute-0 nova_compute[182935]: 2026-01-21 23:47:53.994 182939 DEBUG oslo_concurrency.lockutils [req-311dbb7e-c4be-41f5-96c7-0f230ffeaba8 req-94d4c00a-9491-4b3e-b72c-793c84431540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-5bdecf5d-9113-4584-ac23-44d59770eade" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.113 182939 DEBUG nova.compute.manager [req-5e71a840-400f-4eb4-82f9-bb5bcc84c271 req-01c07a34-d4a0-4e23-afa5-a69f27de885d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.114 182939 DEBUG oslo_concurrency.lockutils [req-5e71a840-400f-4eb4-82f9-bb5bcc84c271 req-01c07a34-d4a0-4e23-afa5-a69f27de885d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.115 182939 DEBUG oslo_concurrency.lockutils [req-5e71a840-400f-4eb4-82f9-bb5bcc84c271 req-01c07a34-d4a0-4e23-afa5-a69f27de885d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.115 182939 DEBUG oslo_concurrency.lockutils [req-5e71a840-400f-4eb4-82f9-bb5bcc84c271 req-01c07a34-d4a0-4e23-afa5-a69f27de885d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.116 182939 DEBUG nova.compute.manager [req-5e71a840-400f-4eb4-82f9-bb5bcc84c271 req-01c07a34-d4a0-4e23-afa5-a69f27de885d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.117 182939 DEBUG nova.compute.manager [req-5e71a840-400f-4eb4-82f9-bb5bcc84c271 req-01c07a34-d4a0-4e23-afa5-a69f27de885d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-unplugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.341 182939 DEBUG nova.compute.manager [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.486 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.487 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.528 182939 DEBUG nova.objects.instance [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'pci_requests' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.551 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.552 182939 INFO nova.compute.claims [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.552 182939 DEBUG nova.objects.instance [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'resources' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.566 182939 DEBUG nova.objects.instance [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.581 182939 DEBUG nova.objects.instance [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.642 182939 INFO nova.compute.resource_tracker [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Updating resource usage from migration 78e371bb-27b6-4b83-90af-e79567818d7b
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.643 182939 DEBUG nova.compute.resource_tracker [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Starting to track incoming migration 78e371bb-27b6-4b83-90af-e79567818d7b with flavor c3389c03-89c4-4ff5-9e03-1a99d41713d4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.758 182939 DEBUG nova.compute.provider_tree [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.771 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.776 182939 DEBUG nova.scheduler.client.report [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.800 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:54 compute-0 nova_compute[182935]: 2026-01-21 23:47:54.801 182939 INFO nova.compute.manager [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Migrating
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.070 182939 DEBUG nova.network.neutron [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Activated binding for port df9aa099-aa41-4111-b46c-c8a593762a53 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.071 182939 DEBUG nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.072 182939 DEBUG nova.virt.libvirt.vif [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:47:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-821021372',display_name='tempest-LiveMigrationTest-server-821021372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-821021372',id=21,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-xa5gd7vg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:47:43Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=5bdecf5d-9113-4584-ac23-44d59770eade,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.072 182939 DEBUG nova.network.os_vif_util [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converting VIF {"id": "df9aa099-aa41-4111-b46c-c8a593762a53", "address": "fa:16:3e:8f:4f:85", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf9aa099-aa", "ovs_interfaceid": "df9aa099-aa41-4111-b46c-c8a593762a53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.073 182939 DEBUG nova.network.os_vif_util [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.074 182939 DEBUG os_vif [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.075 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.076 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf9aa099-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.078 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.079 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.080 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.083 182939 INFO os_vif [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:4f:85,bridge_name='br-int',has_traffic_filtering=True,id=df9aa099-aa41-4111-b46c-c8a593762a53,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf9aa099-aa')
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.083 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.084 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.084 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.084 182939 DEBUG nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.085 182939 INFO nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Deleting instance files /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade_del
Jan 21 23:47:55 compute-0 nova_compute[182935]: 2026-01-21 23:47:55.085 182939 INFO nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Deletion of /var/lib/nova/instances/5bdecf5d-9113-4584-ac23-44d59770eade_del complete
Jan 21 23:47:55 compute-0 sshd-session[214722]: Accepted publickey for nova from 192.168.122.101 port 37302 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:47:55 compute-0 systemd-logind[784]: New session 30 of user nova.
Jan 21 23:47:55 compute-0 systemd[1]: Started Session 30 of User nova.
Jan 21 23:47:55 compute-0 sshd-session[214722]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:47:55 compute-0 podman[214721]: 2026-01-21 23:47:55.846587249 +0000 UTC m=+0.087734079 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:47:55 compute-0 sshd-session[214753]: Received disconnect from 192.168.122.101 port 37302:11: disconnected by user
Jan 21 23:47:55 compute-0 sshd-session[214753]: Disconnected from user nova 192.168.122.101 port 37302
Jan 21 23:47:55 compute-0 sshd-session[214722]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:47:55 compute-0 systemd-logind[784]: Session 30 logged out. Waiting for processes to exit.
Jan 21 23:47:55 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Jan 21 23:47:55 compute-0 systemd-logind[784]: Removed session 30.
Jan 21 23:47:56 compute-0 sshd-session[214762]: Accepted publickey for nova from 192.168.122.101 port 37316 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:47:56 compute-0 systemd-logind[784]: New session 31 of user nova.
Jan 21 23:47:56 compute-0 systemd[1]: Started Session 31 of User nova.
Jan 21 23:47:56 compute-0 sshd-session[214762]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:47:56 compute-0 sshd-session[214765]: Received disconnect from 192.168.122.101 port 37316:11: disconnected by user
Jan 21 23:47:56 compute-0 sshd-session[214765]: Disconnected from user nova 192.168.122.101 port 37316
Jan 21 23:47:56 compute-0 sshd-session[214762]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:47:56 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Jan 21 23:47:56 compute-0 systemd-logind[784]: Session 31 logged out. Waiting for processes to exit.
Jan 21 23:47:56 compute-0 systemd-logind[784]: Removed session 31.
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.239 182939 DEBUG nova.compute.manager [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.239 182939 DEBUG oslo_concurrency.lockutils [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.239 182939 DEBUG oslo_concurrency.lockutils [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.240 182939 DEBUG oslo_concurrency.lockutils [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.240 182939 DEBUG nova.compute.manager [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.240 182939 WARNING nova.compute.manager [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received unexpected event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with vm_state active and task_state migrating.
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.240 182939 DEBUG nova.compute.manager [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.240 182939 DEBUG oslo_concurrency.lockutils [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.241 182939 DEBUG oslo_concurrency.lockutils [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.241 182939 DEBUG oslo_concurrency.lockutils [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.241 182939 DEBUG nova.compute.manager [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.241 182939 WARNING nova.compute.manager [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received unexpected event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with vm_state active and task_state migrating.
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.241 182939 DEBUG nova.compute.manager [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.241 182939 DEBUG oslo_concurrency.lockutils [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.242 182939 DEBUG oslo_concurrency.lockutils [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.242 182939 DEBUG oslo_concurrency.lockutils [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.242 182939 DEBUG nova.compute.manager [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] No waiting events found dispatching network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:47:56 compute-0 nova_compute[182935]: 2026-01-21 23:47:56.242 182939 WARNING nova.compute.manager [req-379813d8-350f-4e75-8f13-0abaf20a06d6 req-2216e3f6-685a-47fe-a1be-9c319ad94c5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Received unexpected event network-vif-plugged-df9aa099-aa41-4111-b46c-c8a593762a53 for instance with vm_state active and task_state migrating.
Jan 21 23:47:57 compute-0 ovn_controller[95047]: 2026-01-21T23:47:57Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0d:1c:84 10.100.0.8
Jan 21 23:47:57 compute-0 ovn_controller[95047]: 2026-01-21T23:47:57Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0d:1c:84 10.100.0.8
Jan 21 23:47:58 compute-0 nova_compute[182935]: 2026-01-21 23:47:58.021 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 23:47:59 compute-0 podman[214767]: 2026-01-21 23:47:59.697556758 +0000 UTC m=+0.068292581 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 21 23:47:59 compute-0 nova_compute[182935]: 2026-01-21 23:47:59.774 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.078 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:00 compute-0 kernel: tap25b6ea25-2c (unregistering): left promiscuous mode
Jan 21 23:48:00 compute-0 NetworkManager[55139]: <info>  [1769039280.2136] device (tap25b6ea25-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:48:00 compute-0 ovn_controller[95047]: 2026-01-21T23:48:00Z|00094|binding|INFO|Releasing lport 25b6ea25-2c24-4a07-9772-28913505aec2 from this chassis (sb_readonly=0)
Jan 21 23:48:00 compute-0 ovn_controller[95047]: 2026-01-21T23:48:00Z|00095|binding|INFO|Setting lport 25b6ea25-2c24-4a07-9772-28913505aec2 down in Southbound
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.226 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:00 compute-0 ovn_controller[95047]: 2026-01-21T23:48:00Z|00096|binding|INFO|Removing iface tap25b6ea25-2c ovn-installed in OVS
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.229 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.237 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:1c:84 10.100.0.8'], port_security=['fa:16:3e:0d:1c:84 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a6a89006-02c9-49b1-8bfb-8640ba1b495f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '6', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=25b6ea25-2c24-4a07-9772-28913505aec2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.239 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 25b6ea25-2c24-4a07-9772-28913505aec2 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 unbound from our chassis
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.240 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.240 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.261 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ecbc6e0d-6c24-447d-a5a3-d7fc3155ad8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:00 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 21 23:48:00 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000012.scope: Consumed 13.685s CPU time.
Jan 21 23:48:00 compute-0 systemd-machined[154182]: Machine qemu-12-instance-00000012 terminated.
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.295 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[23f94c98-0146-46b6-830a-b9a4c389125f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.297 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[90a65f1d-c302-4383-9874-ccf4e4e4b178]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.328 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[71485102-0d72-4ad6-831e-9d3ed75dc700]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.351 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[22493876-5924-41eb-b883-6193a62b4889]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371486, 'reachable_time': 18847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214798, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.369 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[09ad11bc-7e5d-43a4-9c02-170ca6bf8fbb]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371506, 'tstamp': 371506}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214799, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371510, 'tstamp': 371510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214799, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.371 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.373 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.381 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.381 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1530a22a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.381 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.381 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1530a22a-f0, col_values=(('external_ids', {'iface-id': '1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:00.381 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.421 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.422 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.422 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "5bdecf5d-9113-4584-ac23-44d59770eade-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.450 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.455 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.460 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.460 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.461 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.461 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.533 182939 DEBUG nova.compute.manager [req-b47fe198-5c6c-42f4-b307-96b9aaa80fd3 req-8bea574b-3f05-4b03-8430-38ec82896e73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-unplugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.533 182939 DEBUG oslo_concurrency.lockutils [req-b47fe198-5c6c-42f4-b307-96b9aaa80fd3 req-8bea574b-3f05-4b03-8430-38ec82896e73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.534 182939 DEBUG oslo_concurrency.lockutils [req-b47fe198-5c6c-42f4-b307-96b9aaa80fd3 req-8bea574b-3f05-4b03-8430-38ec82896e73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.534 182939 DEBUG oslo_concurrency.lockutils [req-b47fe198-5c6c-42f4-b307-96b9aaa80fd3 req-8bea574b-3f05-4b03-8430-38ec82896e73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.534 182939 DEBUG nova.compute.manager [req-b47fe198-5c6c-42f4-b307-96b9aaa80fd3 req-8bea574b-3f05-4b03-8430-38ec82896e73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] No waiting events found dispatching network-vif-unplugged-25b6ea25-2c24-4a07-9772-28913505aec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.534 182939 WARNING nova.compute.manager [req-b47fe198-5c6c-42f4-b307-96b9aaa80fd3 req-8bea574b-3f05-4b03-8430-38ec82896e73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received unexpected event network-vif-unplugged-25b6ea25-2c24-4a07-9772-28913505aec2 for instance with vm_state active and task_state rebuilding.
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.568 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.658 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.660 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.720 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.727 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.785 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.787 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:00 compute-0 nova_compute[182935]: 2026-01-21 23:48:00.846 182939 DEBUG oslo_concurrency.processutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.041 182939 INFO nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance shutdown successfully after 13 seconds.
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.050 182939 INFO nova.virt.libvirt.driver [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance destroyed successfully.
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.057 182939 INFO nova.virt.libvirt.driver [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance destroyed successfully.
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.058 182939 DEBUG nova.virt.libvirt.vif [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:46:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-168641085',display_name='tempest-ServersAdminTestJSON-server-168641085',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-168641085',id=18,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-evbqme7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member
'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:47:46Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=a6a89006-02c9-49b1-8bfb-8640ba1b495f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.058 182939 DEBUG nova.network.os_vif_util [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.059 182939 DEBUG nova.network.os_vif_util [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.059 182939 DEBUG os_vif [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.063 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.063 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25b6ea25-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.064 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.066 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.067 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.070 182939 INFO os_vif [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c')
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.070 182939 INFO nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Deleting instance files /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f_del
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.071 182939 INFO nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Deletion of /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f_del complete
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.080 182939 WARNING nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.081 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5427MB free_disk=73.28854751586914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": 
"0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.082 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.082 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.188 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Migration for instance 2977f489-9f9d-43f7-a617-7556b7df5171 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.189 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Migration for instance 5bdecf5d-9113-4584-ac23-44d59770eade refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.265 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.266 182939 INFO nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Updating resource usage from migration 78e371bb-27b6-4b83-90af-e79567818d7b
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.266 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Starting to track incoming migration 78e371bb-27b6-4b83-90af-e79567818d7b with flavor c3389c03-89c4-4ff5-9e03-1a99d41713d4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.314 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Instance a6a89006-02c9-49b1-8bfb-8640ba1b495f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.314 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Instance efc683b9-a8d9-4a67-bb19-aeaabfbd5423 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.315 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Migration a95800ca-3e2b-439a-bf8d-a9372550d7ea is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.347 182939 WARNING nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Instance 2977f489-9f9d-43f7-a617-7556b7df5171 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.348 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.348 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.402 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.403 182939 INFO nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Creating image(s)
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.405 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.405 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.407 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.435 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.487 182939 DEBUG nova.compute.provider_tree [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.502 182939 DEBUG nova.scheduler.client.report [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.511 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.512 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.513 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.529 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.558 182939 DEBUG nova.compute.resource_tracker [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.559 182939 DEBUG oslo_concurrency.lockutils [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.478s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.581 182939 INFO nova.compute.manager [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.605 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.606 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.640 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.641 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.641 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.699 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.700 182939 DEBUG nova.virt.disk.api [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Checking if we can resize image /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.700 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.724 182939 INFO nova.scheduler.client.report [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Deleted allocation for migration a95800ca-3e2b-439a-bf8d-a9372550d7ea
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.725 182939 DEBUG nova.virt.libvirt.driver [None req-bd2b8775-609d-4145-8ee6-76c7bc9dac71 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.774 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.775 182939 DEBUG nova.virt.disk.api [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Cannot resize image /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.776 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.777 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Ensure instance console log exists: /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.778 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.779 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.779 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.783 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Start _get_guest_xml network_info=[{"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.789 182939 WARNING nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.797 182939 DEBUG nova.virt.libvirt.host [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.797 182939 DEBUG nova.virt.libvirt.host [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.801 182939 DEBUG nova.virt.libvirt.host [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.802 182939 DEBUG nova.virt.libvirt.host [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.804 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.805 182939 DEBUG nova.virt.hardware [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.805 182939 DEBUG nova.virt.hardware [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.806 182939 DEBUG nova.virt.hardware [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.806 182939 DEBUG nova.virt.hardware [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.807 182939 DEBUG nova.virt.hardware [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.807 182939 DEBUG nova.virt.hardware [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.807 182939 DEBUG nova.virt.hardware [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.808 182939 DEBUG nova.virt.hardware [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.808 182939 DEBUG nova.virt.hardware [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.809 182939 DEBUG nova.virt.hardware [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.809 182939 DEBUG nova.virt.hardware [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.810 182939 DEBUG nova.objects.instance [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'vcpu_model' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.839 182939 DEBUG nova.virt.libvirt.vif [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:46:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-168641085',display_name='tempest-ServersAdminTestJSON-server-168641085',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-168641085',id=18,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-evbqme7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJS
ON-1815099341-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:48:01Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=a6a89006-02c9-49b1-8bfb-8640ba1b495f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.840 182939 DEBUG nova.network.os_vif_util [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.841 182939 DEBUG nova.network.os_vif_util [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.843 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:48:01 compute-0 nova_compute[182935]:   <uuid>a6a89006-02c9-49b1-8bfb-8640ba1b495f</uuid>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   <name>instance-00000012</name>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersAdminTestJSON-server-168641085</nova:name>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:48:01</nova:creationTime>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:48:01 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:48:01 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:48:01 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:48:01 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:48:01 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:48:01 compute-0 nova_compute[182935]:         <nova:user uuid="4a6034ff39094b6486bac680b7ed5a57">tempest-ServersAdminTestJSON-1815099341-project-member</nova:user>
Jan 21 23:48:01 compute-0 nova_compute[182935]:         <nova:project uuid="4d40fc03fb534b5689415f3d8a3de1fc">tempest-ServersAdminTestJSON-1815099341</nova:project>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:48:01 compute-0 nova_compute[182935]:         <nova:port uuid="25b6ea25-2c24-4a07-9772-28913505aec2">
Jan 21 23:48:01 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <system>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <entry name="serial">a6a89006-02c9-49b1-8bfb-8640ba1b495f</entry>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <entry name="uuid">a6a89006-02c9-49b1-8bfb-8640ba1b495f</entry>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     </system>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   <os>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   </os>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   <features>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   </features>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:0d:1c:84"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <target dev="tap25b6ea25-2c"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/console.log" append="off"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <video>
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     </video>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:48:01 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:48:01 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:48:01 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:48:01 compute-0 nova_compute[182935]: </domain>
Jan 21 23:48:01 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.844 182939 DEBUG nova.virt.libvirt.vif [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:46:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-168641085',display_name='tempest-ServersAdminTestJSON-server-168641085',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-168641085',id=18,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-evbqme7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJS
ON-1815099341-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:48:01Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=a6a89006-02c9-49b1-8bfb-8640ba1b495f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.845 182939 DEBUG nova.network.os_vif_util [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.845 182939 DEBUG nova.network.os_vif_util [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.846 182939 DEBUG os_vif [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.846 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.847 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.848 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.851 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.852 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25b6ea25-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.852 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap25b6ea25-2c, col_values=(('external_ids', {'iface-id': '25b6ea25-2c24-4a07-9772-28913505aec2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:1c:84', 'vm-uuid': 'a6a89006-02c9-49b1-8bfb-8640ba1b495f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.854 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:01 compute-0 NetworkManager[55139]: <info>  [1769039281.8551] manager: (tap25b6ea25-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.856 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.859 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.860 182939 INFO os_vif [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c')
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.939 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.940 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.940 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] No VIF found with MAC fa:16:3e:0d:1c:84, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.940 182939 INFO nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Using config drive
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.955 182939 DEBUG nova.objects.instance [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'ec2_ids' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:01 compute-0 nova_compute[182935]: 2026-01-21 23:48:01.982 182939 DEBUG nova.objects.instance [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'keypairs' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.343 182939 INFO nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Creating config drive at /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.349 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprasfugdy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.494 182939 DEBUG oslo_concurrency.processutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprasfugdy" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:02 compute-0 kernel: tap25b6ea25-2c: entered promiscuous mode
Jan 21 23:48:02 compute-0 NetworkManager[55139]: <info>  [1769039282.5760] manager: (tap25b6ea25-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Jan 21 23:48:02 compute-0 systemd-udevd[214791]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:48:02 compute-0 NetworkManager[55139]: <info>  [1769039282.5958] device (tap25b6ea25-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:48:02 compute-0 NetworkManager[55139]: <info>  [1769039282.5964] device (tap25b6ea25-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:48:02 compute-0 ovn_controller[95047]: 2026-01-21T23:48:02Z|00097|binding|INFO|Claiming lport 25b6ea25-2c24-4a07-9772-28913505aec2 for this chassis.
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.618 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:02 compute-0 ovn_controller[95047]: 2026-01-21T23:48:02Z|00098|binding|INFO|25b6ea25-2c24-4a07-9772-28913505aec2: Claiming fa:16:3e:0d:1c:84 10.100.0.8
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.627 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:1c:84 10.100.0.8'], port_security=['fa:16:3e:0d:1c:84 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a6a89006-02c9-49b1-8bfb-8640ba1b495f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '7', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=25b6ea25-2c24-4a07-9772-28913505aec2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.628 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 25b6ea25-2c24-4a07-9772-28913505aec2 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 bound to our chassis
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.630 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:48:02 compute-0 ovn_controller[95047]: 2026-01-21T23:48:02Z|00099|binding|INFO|Setting lport 25b6ea25-2c24-4a07-9772-28913505aec2 ovn-installed in OVS
Jan 21 23:48:02 compute-0 ovn_controller[95047]: 2026-01-21T23:48:02Z|00100|binding|INFO|Setting lport 25b6ea25-2c24-4a07-9772-28913505aec2 up in Southbound
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.644 182939 DEBUG nova.compute.manager [req-1c9f1eca-9711-4ed4-b992-467554b37c1a req-ac7d7cce-eeea-42cb-a46a-115a5a5a22f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.644 182939 DEBUG oslo_concurrency.lockutils [req-1c9f1eca-9711-4ed4-b992-467554b37c1a req-ac7d7cce-eeea-42cb-a46a-115a5a5a22f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.645 182939 DEBUG oslo_concurrency.lockutils [req-1c9f1eca-9711-4ed4-b992-467554b37c1a req-ac7d7cce-eeea-42cb-a46a-115a5a5a22f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.645 182939 DEBUG oslo_concurrency.lockutils [req-1c9f1eca-9711-4ed4-b992-467554b37c1a req-ac7d7cce-eeea-42cb-a46a-115a5a5a22f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.645 182939 DEBUG nova.compute.manager [req-1c9f1eca-9711-4ed4-b992-467554b37c1a req-ac7d7cce-eeea-42cb-a46a-115a5a5a22f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] No waiting events found dispatching network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.645 182939 WARNING nova.compute.manager [req-1c9f1eca-9711-4ed4-b992-467554b37c1a req-ac7d7cce-eeea-42cb-a46a-115a5a5a22f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received unexpected event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 for instance with vm_state active and task_state rebuild_spawning.
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.645 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:02 compute-0 systemd-machined[154182]: New machine qemu-13-instance-00000012.
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.657 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[eaf07371-4f6b-4e0c-8e7f-2bf809bcfbfa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:02 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000012.
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.696 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c0f6b6-6ce0-40b7-8e46-23fbb9cd9dbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.699 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[92d5aed3-7517-4c3e-957c-2eb8045b8539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.742 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[ff554a4b-b497-401b-86f7-d0f91f84f410]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.766 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[18dd805e-4e2e-4618-b840-ff765d35e97b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371486, 'reachable_time': 18847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214877, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.792 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cb42154d-e62e-49f7-a639-bbe6d9548082]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371506, 'tstamp': 371506}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214878, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371510, 'tstamp': 371510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214878, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.795 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:02 compute-0 nova_compute[182935]: 2026-01-21 23:48:02.797 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.798 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1530a22a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.798 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.799 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1530a22a-f0, col_values=(('external_ids', {'iface-id': '1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:02.799 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:03.182 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:03.182 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:03.183 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.540 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for a6a89006-02c9-49b1-8bfb-8640ba1b495f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.542 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039283.5399623, a6a89006-02c9-49b1-8bfb-8640ba1b495f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.543 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] VM Resumed (Lifecycle Event)
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.549 182939 DEBUG nova.compute.manager [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.551 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.556 182939 INFO nova.virt.libvirt.driver [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance spawned successfully.
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.557 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.825 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.831 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.841 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.841 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.842 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.842 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.843 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.843 182939 DEBUG nova.virt.libvirt.driver [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.873 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.874 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039283.5439217, a6a89006-02c9-49b1-8bfb-8640ba1b495f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.874 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] VM Started (Lifecycle Event)
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.921 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.925 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.962 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:48:03 compute-0 nova_compute[182935]: 2026-01-21 23:48:03.972 182939 DEBUG nova.compute.manager [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:04 compute-0 nova_compute[182935]: 2026-01-21 23:48:04.061 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:04 compute-0 nova_compute[182935]: 2026-01-21 23:48:04.062 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:04 compute-0 nova_compute[182935]: 2026-01-21 23:48:04.062 182939 DEBUG nova.objects.instance [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:48:04 compute-0 nova_compute[182935]: 2026-01-21 23:48:04.140 182939 DEBUG oslo_concurrency.lockutils [None req-9e30cf75-00f0-4211-91de-fe2f870d80f1 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:04 compute-0 nova_compute[182935]: 2026-01-21 23:48:04.778 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:04 compute-0 nova_compute[182935]: 2026-01-21 23:48:04.807 182939 DEBUG nova.compute.manager [req-9878517e-b4e4-437b-9e95-2afad15f758e req-4eb4f0a9-d9ab-4d20-ba42-00f41fcb877d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:04 compute-0 nova_compute[182935]: 2026-01-21 23:48:04.808 182939 DEBUG oslo_concurrency.lockutils [req-9878517e-b4e4-437b-9e95-2afad15f758e req-4eb4f0a9-d9ab-4d20-ba42-00f41fcb877d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:04 compute-0 nova_compute[182935]: 2026-01-21 23:48:04.808 182939 DEBUG oslo_concurrency.lockutils [req-9878517e-b4e4-437b-9e95-2afad15f758e req-4eb4f0a9-d9ab-4d20-ba42-00f41fcb877d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:04 compute-0 nova_compute[182935]: 2026-01-21 23:48:04.808 182939 DEBUG oslo_concurrency.lockutils [req-9878517e-b4e4-437b-9e95-2afad15f758e req-4eb4f0a9-d9ab-4d20-ba42-00f41fcb877d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:04 compute-0 nova_compute[182935]: 2026-01-21 23:48:04.808 182939 DEBUG nova.compute.manager [req-9878517e-b4e4-437b-9e95-2afad15f758e req-4eb4f0a9-d9ab-4d20-ba42-00f41fcb877d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] No waiting events found dispatching network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:04 compute-0 nova_compute[182935]: 2026-01-21 23:48:04.809 182939 WARNING nova.compute.manager [req-9878517e-b4e4-437b-9e95-2afad15f758e req-4eb4f0a9-d9ab-4d20-ba42-00f41fcb877d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received unexpected event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 for instance with vm_state active and task_state None.
Jan 21 23:48:06 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 23:48:06 compute-0 systemd[214579]: Activating special unit Exit the Session...
Jan 21 23:48:06 compute-0 systemd[214579]: Stopped target Main User Target.
Jan 21 23:48:06 compute-0 systemd[214579]: Stopped target Basic System.
Jan 21 23:48:06 compute-0 systemd[214579]: Stopped target Paths.
Jan 21 23:48:06 compute-0 systemd[214579]: Stopped target Sockets.
Jan 21 23:48:06 compute-0 systemd[214579]: Stopped target Timers.
Jan 21 23:48:06 compute-0 systemd[214579]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:48:06 compute-0 systemd[214579]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:48:06 compute-0 systemd[214579]: Closed D-Bus User Message Bus Socket.
Jan 21 23:48:06 compute-0 systemd[214579]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:48:06 compute-0 systemd[214579]: Removed slice User Application Slice.
Jan 21 23:48:06 compute-0 systemd[214579]: Reached target Shutdown.
Jan 21 23:48:06 compute-0 systemd[214579]: Finished Exit the Session.
Jan 21 23:48:06 compute-0 systemd[214579]: Reached target Exit the Session.
Jan 21 23:48:06 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 23:48:06 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 23:48:06 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 23:48:06 compute-0 sshd-session[214887]: Invalid user tomcat from 188.166.69.60 port 35422
Jan 21 23:48:06 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 23:48:06 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 23:48:06 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 23:48:06 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 23:48:06 compute-0 sshd-session[214887]: Connection closed by invalid user tomcat 188.166.69.60 port 35422 [preauth]
Jan 21 23:48:06 compute-0 nova_compute[182935]: 2026-01-21 23:48:06.857 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:06 compute-0 nova_compute[182935]: 2026-01-21 23:48:06.904 182939 DEBUG nova.compute.manager [req-e3616c36-bd9b-4540-9152-ed6d5ead35b4 req-8d70ccb5-a232-42f0-b5c0-6eacdbc10317 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:06 compute-0 nova_compute[182935]: 2026-01-21 23:48:06.904 182939 DEBUG oslo_concurrency.lockutils [req-e3616c36-bd9b-4540-9152-ed6d5ead35b4 req-8d70ccb5-a232-42f0-b5c0-6eacdbc10317 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:06 compute-0 nova_compute[182935]: 2026-01-21 23:48:06.905 182939 DEBUG oslo_concurrency.lockutils [req-e3616c36-bd9b-4540-9152-ed6d5ead35b4 req-8d70ccb5-a232-42f0-b5c0-6eacdbc10317 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:06 compute-0 nova_compute[182935]: 2026-01-21 23:48:06.905 182939 DEBUG oslo_concurrency.lockutils [req-e3616c36-bd9b-4540-9152-ed6d5ead35b4 req-8d70ccb5-a232-42f0-b5c0-6eacdbc10317 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:06 compute-0 nova_compute[182935]: 2026-01-21 23:48:06.905 182939 DEBUG nova.compute.manager [req-e3616c36-bd9b-4540-9152-ed6d5ead35b4 req-8d70ccb5-a232-42f0-b5c0-6eacdbc10317 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] No waiting events found dispatching network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:06 compute-0 nova_compute[182935]: 2026-01-21 23:48:06.906 182939 WARNING nova.compute.manager [req-e3616c36-bd9b-4540-9152-ed6d5ead35b4 req-8d70ccb5-a232-42f0-b5c0-6eacdbc10317 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received unexpected event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 for instance with vm_state error and task_state None.
Jan 21 23:48:08 compute-0 nova_compute[182935]: 2026-01-21 23:48:08.456 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039273.4441307, 5bdecf5d-9113-4584-ac23-44d59770eade => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:08 compute-0 nova_compute[182935]: 2026-01-21 23:48:08.456 182939 INFO nova.compute.manager [-] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] VM Stopped (Lifecycle Event)
Jan 21 23:48:08 compute-0 nova_compute[182935]: 2026-01-21 23:48:08.485 182939 DEBUG nova.compute.manager [None req-98460701-9f31-474a-9777-f6838d4f697c - - - - - -] [instance: 5bdecf5d-9113-4584-ac23-44d59770eade] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:08 compute-0 podman[214891]: 2026-01-21 23:48:08.698340992 +0000 UTC m=+0.065345032 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:48:08 compute-0 podman[214890]: 2026-01-21 23:48:08.702967661 +0000 UTC m=+0.071368844 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 23:48:09 compute-0 sshd-session[214930]: Accepted publickey for nova from 192.168.122.101 port 55608 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:48:09 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 23:48:09 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 23:48:09 compute-0 systemd-logind[784]: New session 32 of user nova.
Jan 21 23:48:09 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 23:48:09 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 23:48:09 compute-0 systemd[214934]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:09 compute-0 nova_compute[182935]: 2026-01-21 23:48:09.780 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:09 compute-0 systemd[214934]: Queued start job for default target Main User Target.
Jan 21 23:48:09 compute-0 systemd[214934]: Created slice User Application Slice.
Jan 21 23:48:09 compute-0 systemd[214934]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:48:09 compute-0 systemd[214934]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:48:09 compute-0 systemd[214934]: Reached target Paths.
Jan 21 23:48:09 compute-0 systemd[214934]: Reached target Timers.
Jan 21 23:48:09 compute-0 systemd[214934]: Starting D-Bus User Message Bus Socket...
Jan 21 23:48:09 compute-0 systemd[214934]: Starting Create User's Volatile Files and Directories...
Jan 21 23:48:09 compute-0 systemd[214934]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:48:09 compute-0 systemd[214934]: Reached target Sockets.
Jan 21 23:48:09 compute-0 systemd[214934]: Finished Create User's Volatile Files and Directories.
Jan 21 23:48:09 compute-0 systemd[214934]: Reached target Basic System.
Jan 21 23:48:09 compute-0 systemd[214934]: Reached target Main User Target.
Jan 21 23:48:09 compute-0 systemd[214934]: Startup finished in 140ms.
Jan 21 23:48:09 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 23:48:09 compute-0 systemd[1]: Started Session 32 of User nova.
Jan 21 23:48:09 compute-0 sshd-session[214930]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:10 compute-0 sshd-session[214949]: Received disconnect from 192.168.122.101 port 55608:11: disconnected by user
Jan 21 23:48:10 compute-0 sshd-session[214949]: Disconnected from user nova 192.168.122.101 port 55608
Jan 21 23:48:10 compute-0 sshd-session[214930]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:48:10 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Jan 21 23:48:10 compute-0 systemd-logind[784]: Session 32 logged out. Waiting for processes to exit.
Jan 21 23:48:10 compute-0 systemd-logind[784]: Removed session 32.
Jan 21 23:48:10 compute-0 sshd-session[214951]: Accepted publickey for nova from 192.168.122.101 port 38934 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:48:10 compute-0 systemd-logind[784]: New session 34 of user nova.
Jan 21 23:48:10 compute-0 systemd[1]: Started Session 34 of User nova.
Jan 21 23:48:10 compute-0 sshd-session[214951]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:10 compute-0 sshd-session[214954]: Received disconnect from 192.168.122.101 port 38934:11: disconnected by user
Jan 21 23:48:10 compute-0 sshd-session[214954]: Disconnected from user nova 192.168.122.101 port 38934
Jan 21 23:48:10 compute-0 sshd-session[214951]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:48:10 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Jan 21 23:48:10 compute-0 systemd-logind[784]: Session 34 logged out. Waiting for processes to exit.
Jan 21 23:48:10 compute-0 systemd-logind[784]: Removed session 34.
Jan 21 23:48:10 compute-0 sshd-session[214956]: Accepted publickey for nova from 192.168.122.101 port 38946 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:48:10 compute-0 systemd-logind[784]: New session 35 of user nova.
Jan 21 23:48:10 compute-0 systemd[1]: Started Session 35 of User nova.
Jan 21 23:48:10 compute-0 sshd-session[214956]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:10 compute-0 sshd-session[214959]: Received disconnect from 192.168.122.101 port 38946:11: disconnected by user
Jan 21 23:48:10 compute-0 sshd-session[214959]: Disconnected from user nova 192.168.122.101 port 38946
Jan 21 23:48:10 compute-0 sshd-session[214956]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:48:10 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Jan 21 23:48:10 compute-0 systemd-logind[784]: Session 35 logged out. Waiting for processes to exit.
Jan 21 23:48:10 compute-0 systemd-logind[784]: Removed session 35.
Jan 21 23:48:11 compute-0 nova_compute[182935]: 2026-01-21 23:48:11.696 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:48:11 compute-0 nova_compute[182935]: 2026-01-21 23:48:11.699 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquired lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:48:11 compute-0 nova_compute[182935]: 2026-01-21 23:48:11.699 182939 DEBUG nova.network.neutron [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:48:11 compute-0 nova_compute[182935]: 2026-01-21 23:48:11.862 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:12 compute-0 nova_compute[182935]: 2026-01-21 23:48:12.047 182939 DEBUG nova.network.neutron [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:48:12 compute-0 nova_compute[182935]: 2026-01-21 23:48:12.906 182939 DEBUG nova.network.neutron [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:12 compute-0 nova_compute[182935]: 2026-01-21 23:48:12.933 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Releasing lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.071 182939 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.073 182939 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.073 182939 INFO nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Creating image(s)
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.074 182939 DEBUG nova.objects.instance [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.103 182939 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.163 182939 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.165 182939 DEBUG nova.virt.disk.api [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Checking if we can resize image /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.165 182939 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.221 182939 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.223 182939 DEBUG nova.virt.disk.api [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Cannot resize image /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.249 182939 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.249 182939 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Ensure instance console log exists: /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.250 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.251 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.251 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.253 182939 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.259 182939 WARNING nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.264 182939 DEBUG nova.virt.libvirt.host [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.265 182939 DEBUG nova.virt.libvirt.host [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.273 182939 DEBUG nova.virt.libvirt.host [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.273 182939 DEBUG nova.virt.libvirt.host [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.275 182939 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.275 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.275 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.276 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.276 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.276 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.276 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.277 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.277 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.278 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.278 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.278 182939 DEBUG nova.virt.hardware [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.278 182939 DEBUG nova.objects.instance [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.317 182939 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.372 182939 DEBUG oslo_concurrency.processutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.config --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.373 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Acquiring lock "/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.373 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.374 182939 DEBUG oslo_concurrency.lockutils [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] Lock "/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.377 182939 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:48:13 compute-0 nova_compute[182935]:   <uuid>2977f489-9f9d-43f7-a617-7556b7df5171</uuid>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   <name>instance-00000017</name>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <nova:name>tempest-MigrationsAdminTest-server-529809703</nova:name>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:48:13</nova:creationTime>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:48:13 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:48:13 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:48:13 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:48:13 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:48:13 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:48:13 compute-0 nova_compute[182935]:         <nova:user uuid="36d71830ce70436e97fbc17b6da8d3c6">tempest-MigrationsAdminTest-1559502816-project-member</nova:user>
Jan 21 23:48:13 compute-0 nova_compute[182935]:         <nova:project uuid="95574103d0094883861c58d01690e5a3">tempest-MigrationsAdminTest-1559502816</nova:project>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <system>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <entry name="serial">2977f489-9f9d-43f7-a617-7556b7df5171</entry>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <entry name="uuid">2977f489-9f9d-43f7-a617-7556b7df5171</entry>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     </system>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   <os>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   </os>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   <features>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   </features>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk.config"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/console.log" append="off"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <video>
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     </video>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:48:13 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:48:13 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:48:13 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:48:13 compute-0 nova_compute[182935]: </domain>
Jan 21 23:48:13 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.448 182939 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.449 182939 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.449 182939 INFO nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Using config drive
Jan 21 23:48:13 compute-0 systemd-machined[154182]: New machine qemu-14-instance-00000017.
Jan 21 23:48:13 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000017.
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.950 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039293.9480276, 2977f489-9f9d-43f7-a617-7556b7df5171 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.950 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] VM Resumed (Lifecycle Event)
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.953 182939 DEBUG nova.compute.manager [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.957 182939 INFO nova.virt.libvirt.driver [-] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance running successfully.
Jan 21 23:48:13 compute-0 virtqemud[182477]: argument unsupported: QEMU guest agent is not configured
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.961 182939 DEBUG nova.virt.libvirt.guest [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 21 23:48:13 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.961 182939 DEBUG nova.virt.libvirt.driver [None req-56bcb8a2-6be4-47c0-9ec7-758300a28f39 9677535ac83f4274b374a03b9ade1cf8 fd13708789ca41ca9ed376ba39d87279 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:13.999 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.008 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.054 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.055 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039293.949416, 2977f489-9f9d-43f7-a617-7556b7df5171 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.055 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] VM Started (Lifecycle Event)
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.082 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.085 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.781 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.950 182939 DEBUG oslo_concurrency.lockutils [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.950 182939 DEBUG oslo_concurrency.lockutils [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.950 182939 DEBUG oslo_concurrency.lockutils [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.951 182939 DEBUG oslo_concurrency.lockutils [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.951 182939 DEBUG oslo_concurrency.lockutils [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.961 182939 INFO nova.compute.manager [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Terminating instance
Jan 21 23:48:14 compute-0 nova_compute[182935]: 2026-01-21 23:48:14.973 182939 DEBUG nova.compute.manager [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:48:14 compute-0 kernel: tap07de181e-ac (unregistering): left promiscuous mode
Jan 21 23:48:15 compute-0 NetworkManager[55139]: <info>  [1769039295.0019] device (tap07de181e-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00101|binding|INFO|Releasing lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 from this chassis (sb_readonly=0)
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00102|binding|INFO|Setting lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 down in Southbound
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00103|binding|INFO|Removing iface tap07de181e-ac ovn-installed in OVS
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.011 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.013 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.023 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:70:45 10.100.0.5'], port_security=['fa:16:3e:69:70:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'efc683b9-a8d9-4a67-bb19-aeaabfbd5423', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=07de181e-ac7b-4c3f-826a-3b63c1bdb993) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.024 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 07de181e-ac7b-4c3f-826a-3b63c1bdb993 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 unbound from our chassis
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.026 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.045 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.064 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[33deb2e4-1aad-4178-a57e-b3590250e6f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000014.scope: Deactivated successfully.
Jan 21 23:48:15 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000014.scope: Consumed 16.140s CPU time.
Jan 21 23:48:15 compute-0 systemd-machined[154182]: Machine qemu-10-instance-00000014 terminated.
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.099 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7b4f7fa8-853c-4d77-9cd4-7ea23324cf5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.103 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b6046b88-bf62-4e87-b546-10b1c3d5ca00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.131 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[6871361a-f8c5-4923-88dd-70231e71aa7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.146 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3f614215-b02b-4393-b65b-977616b91d7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371486, 'reachable_time': 18847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215013, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.163 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[75d58c82-cc68-4c13-80e3-ffacf9cddbb7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371506, 'tstamp': 371506}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215014, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371510, 'tstamp': 371510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215014, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.164 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.166 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.170 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.171 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1530a22a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.171 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.171 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1530a22a-f0, col_values=(('external_ids', {'iface-id': '1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.172 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:15 compute-0 kernel: tap07de181e-ac: entered promiscuous mode
Jan 21 23:48:15 compute-0 NetworkManager[55139]: <info>  [1769039295.1951] manager: (tap07de181e-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.197 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00104|binding|INFO|Claiming lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 for this chassis.
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00105|binding|INFO|07de181e-ac7b-4c3f-826a-3b63c1bdb993: Claiming fa:16:3e:69:70:45 10.100.0.5
Jan 21 23:48:15 compute-0 kernel: tap07de181e-ac (unregistering): left promiscuous mode
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.205 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:70:45 10.100.0.5'], port_security=['fa:16:3e:69:70:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'efc683b9-a8d9-4a67-bb19-aeaabfbd5423', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=07de181e-ac7b-4c3f-826a-3b63c1bdb993) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.206 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 07de181e-ac7b-4c3f-826a-3b63c1bdb993 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 bound to our chassis
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.217 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.227 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00106|binding|INFO|Setting lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 ovn-installed in OVS
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00107|binding|INFO|Setting lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 up in Southbound
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00108|binding|INFO|Releasing lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 from this chassis (sb_readonly=1)
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00109|if_status|INFO|Not setting lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 down as sb is readonly
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00110|binding|INFO|Removing iface tap07de181e-ac ovn-installed in OVS
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00111|binding|INFO|Releasing lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 from this chassis (sb_readonly=0)
Jan 21 23:48:15 compute-0 ovn_controller[95047]: 2026-01-21T23:48:15Z|00112|binding|INFO|Setting lport 07de181e-ac7b-4c3f-826a-3b63c1bdb993 down in Southbound
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.243 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[50b70685-92f5-4e6e-9a22-419ce4410983]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.244 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.247 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:70:45 10.100.0.5'], port_security=['fa:16:3e:69:70:45 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'efc683b9-a8d9-4a67-bb19-aeaabfbd5423', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=07de181e-ac7b-4c3f-826a-3b63c1bdb993) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.253 182939 INFO nova.virt.libvirt.driver [-] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Instance destroyed successfully.
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.254 182939 DEBUG nova.objects.instance [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'resources' on Instance uuid efc683b9-a8d9-4a67-bb19-aeaabfbd5423 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.277 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[47ca2cc0-9cc6-4cb2-b255-ff1db7d3dccb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.278 182939 DEBUG nova.virt.libvirt.vif [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:47:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-466710194',display_name='tempest-ServersAdminTestJSON-server-466710194',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-466710194',id=20,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:47:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-85hksn2t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:47:12Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=efc683b9-a8d9-4a67-bb19-aeaabfbd5423,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "address": "fa:16:3e:69:70:45", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07de181e-ac", "ovs_interfaceid": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.279 182939 DEBUG nova.network.os_vif_util [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "address": "fa:16:3e:69:70:45", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07de181e-ac", "ovs_interfaceid": "07de181e-ac7b-4c3f-826a-3b63c1bdb993", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.280 182939 DEBUG nova.network.os_vif_util [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:70:45,bridge_name='br-int',has_traffic_filtering=True,id=07de181e-ac7b-4c3f-826a-3b63c1bdb993,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07de181e-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.280 182939 DEBUG os_vif [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:70:45,bridge_name='br-int',has_traffic_filtering=True,id=07de181e-ac7b-4c3f-826a-3b63c1bdb993,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07de181e-ac') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.282 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[84b2beed-1a8e-4cd8-8908-32bedbf92adb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.282 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.283 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07de181e-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.284 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.287 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.291 182939 INFO os_vif [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:70:45,bridge_name='br-int',has_traffic_filtering=True,id=07de181e-ac7b-4c3f-826a-3b63c1bdb993,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07de181e-ac')
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.291 182939 INFO nova.virt.libvirt.driver [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Deleting instance files /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423_del
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.292 182939 INFO nova.virt.libvirt.driver [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Deletion of /var/lib/nova/instances/efc683b9-a8d9-4a67-bb19-aeaabfbd5423_del complete
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.318 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[001f5a5a-a614-4fb3-8dce-fcfaae9a7fc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.338 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[44db6435-8e3b-4fb8-92b0-68f283a18f26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 17, 'rx_bytes': 700, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 17, 'rx_bytes': 700, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371486, 'reachable_time': 18847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215031, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.357 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e21f865d-5110-40c2-99fe-0aab55234808]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371506, 'tstamp': 371506}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215033, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371510, 'tstamp': 371510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215033, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.359 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.361 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.362 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.363 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1530a22a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.363 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.363 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1530a22a-f0, col_values=(('external_ids', {'iface-id': '1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.363 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.364 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 07de181e-ac7b-4c3f-826a-3b63c1bdb993 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 unbound from our chassis
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.366 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1530a22a-f758-407d-b1aa-fd922904fe07
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.382 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[97db6b9c-8059-4915-b727-244a5e1d200a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.404 182939 INFO nova.compute.manager [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.405 182939 DEBUG oslo.service.loopingcall [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.405 182939 DEBUG nova.compute.manager [-] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.406 182939 DEBUG nova.network.neutron [-] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.421 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[5de8d512-eccf-463c-8928-f292dd4a08dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.426 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[dc39a7bc-327d-478a-8fbc-46c372c75d94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.472 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b1ff1cb0-1bf8-4777-9348-4a9b62a00edc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.497 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[499187e9-cfd1-418a-87d5-a5fd0536a8a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1530a22a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:bf:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 19, 'rx_bytes': 700, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 19, 'rx_bytes': 700, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371486, 'reachable_time': 18847, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215043, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.518 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[71e4cc51-f9d5-472b-963f-f3328174d633]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371506, 'tstamp': 371506}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215044, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap1530a22a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371510, 'tstamp': 371510}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215044, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.520 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.522 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.523 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.523 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1530a22a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.523 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.524 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1530a22a-f0, col_values=(('external_ids', {'iface-id': '1e43acd7-fb26-4f78-8f65-2a3b2d4a2acd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:15 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:15.524 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.605 182939 DEBUG nova.compute.manager [req-7acf141a-3feb-4dde-bf69-1630ede942e3 req-5caabfd4-3a32-4db1-b2f0-97a7a308d424 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received event network-vif-unplugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.605 182939 DEBUG oslo_concurrency.lockutils [req-7acf141a-3feb-4dde-bf69-1630ede942e3 req-5caabfd4-3a32-4db1-b2f0-97a7a308d424 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.605 182939 DEBUG oslo_concurrency.lockutils [req-7acf141a-3feb-4dde-bf69-1630ede942e3 req-5caabfd4-3a32-4db1-b2f0-97a7a308d424 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.606 182939 DEBUG oslo_concurrency.lockutils [req-7acf141a-3feb-4dde-bf69-1630ede942e3 req-5caabfd4-3a32-4db1-b2f0-97a7a308d424 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.606 182939 DEBUG nova.compute.manager [req-7acf141a-3feb-4dde-bf69-1630ede942e3 req-5caabfd4-3a32-4db1-b2f0-97a7a308d424 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] No waiting events found dispatching network-vif-unplugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:15 compute-0 nova_compute[182935]: 2026-01-21 23:48:15.606 182939 DEBUG nova.compute.manager [req-7acf141a-3feb-4dde-bf69-1630ede942e3 req-5caabfd4-3a32-4db1-b2f0-97a7a308d424 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received event network-vif-unplugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:48:16 compute-0 nova_compute[182935]: 2026-01-21 23:48:16.821 182939 DEBUG nova.network.neutron [-] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:16 compute-0 nova_compute[182935]: 2026-01-21 23:48:16.843 182939 INFO nova.compute.manager [-] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Took 1.44 seconds to deallocate network for instance.
Jan 21 23:48:16 compute-0 nova_compute[182935]: 2026-01-21 23:48:16.942 182939 DEBUG oslo_concurrency.lockutils [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:16 compute-0 nova_compute[182935]: 2026-01-21 23:48:16.942 182939 DEBUG oslo_concurrency.lockutils [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.060 182939 DEBUG nova.compute.provider_tree [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.077 182939 DEBUG nova.scheduler.client.report [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.102 182939 DEBUG oslo_concurrency.lockutils [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.139 182939 INFO nova.scheduler.client.report [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Deleted allocations for instance efc683b9-a8d9-4a67-bb19-aeaabfbd5423
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.208 182939 DEBUG oslo_concurrency.lockutils [None req-5ac00682-0c2a-4136-9cac-e34d93618f80 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.708 182939 DEBUG nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.708 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.709 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.709 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.709 182939 DEBUG nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] No waiting events found dispatching network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.709 182939 WARNING nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received unexpected event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 for instance with vm_state deleted and task_state None.
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.709 182939 DEBUG nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.709 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.710 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.710 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.710 182939 DEBUG nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] No waiting events found dispatching network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.710 182939 WARNING nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received unexpected event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 for instance with vm_state deleted and task_state None.
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.710 182939 DEBUG nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.710 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.710 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.711 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.711 182939 DEBUG nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] No waiting events found dispatching network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.711 182939 WARNING nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received unexpected event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 for instance with vm_state deleted and task_state None.
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.711 182939 DEBUG nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received event network-vif-unplugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.711 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.711 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.711 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.712 182939 DEBUG nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] No waiting events found dispatching network-vif-unplugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.712 182939 WARNING nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received unexpected event network-vif-unplugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 for instance with vm_state deleted and task_state None.
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.712 182939 DEBUG nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.712 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.712 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.712 182939 DEBUG oslo_concurrency.lockutils [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "efc683b9-a8d9-4a67-bb19-aeaabfbd5423-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.712 182939 DEBUG nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] No waiting events found dispatching network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:17 compute-0 nova_compute[182935]: 2026-01-21 23:48:17.712 182939 WARNING nova.compute.manager [req-f6a615d8-267b-4fd5-891c-44d214f17f77 req-e51839a9-ae74-4efd-91d4-7a4c60e9e60d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received unexpected event network-vif-plugged-07de181e-ac7b-4c3f-826a-3b63c1bdb993 for instance with vm_state deleted and task_state None.
Jan 21 23:48:17 compute-0 ovn_controller[95047]: 2026-01-21T23:48:17Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0d:1c:84 10.100.0.8
Jan 21 23:48:17 compute-0 ovn_controller[95047]: 2026-01-21T23:48:17Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0d:1c:84 10.100.0.8
Jan 21 23:48:18 compute-0 nova_compute[182935]: 2026-01-21 23:48:18.034 182939 DEBUG nova.compute.manager [req-fdb0cdf8-27c2-46a1-a39c-dea2ad2caeea req-54c88cf5-f2e3-450f-8148-5c471cef7098 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Received event network-vif-deleted-07de181e-ac7b-4c3f-826a-3b63c1bdb993 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:18 compute-0 nova_compute[182935]: 2026-01-21 23:48:18.830 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:18 compute-0 nova_compute[182935]: 2026-01-21 23:48:18.830 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:48:18 compute-0 nova_compute[182935]: 2026-01-21 23:48:18.852 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:48:18 compute-0 nova_compute[182935]: 2026-01-21 23:48:18.852 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:18 compute-0 nova_compute[182935]: 2026-01-21 23:48:18.852 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:19 compute-0 podman[215064]: 2026-01-21 23:48:19.718082954 +0000 UTC m=+0.070004701 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:48:19 compute-0 podman[215063]: 2026-01-21 23:48:19.752864954 +0000 UTC m=+0.104670878 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 21 23:48:19 compute-0 nova_compute[182935]: 2026-01-21 23:48:19.782 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-0 nova_compute[182935]: 2026-01-21 23:48:20.285 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:20 compute-0 nova_compute[182935]: 2026-01-21 23:48:20.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:20 compute-0 nova_compute[182935]: 2026-01-21 23:48:20.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:20 compute-0 nova_compute[182935]: 2026-01-21 23:48:20.835 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:20 compute-0 nova_compute[182935]: 2026-01-21 23:48:20.836 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:20 compute-0 nova_compute[182935]: 2026-01-21 23:48:20.836 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:20 compute-0 nova_compute[182935]: 2026-01-21 23:48:20.836 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:48:20 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 23:48:20 compute-0 systemd[214934]: Activating special unit Exit the Session...
Jan 21 23:48:20 compute-0 systemd[214934]: Stopped target Main User Target.
Jan 21 23:48:20 compute-0 systemd[214934]: Stopped target Basic System.
Jan 21 23:48:20 compute-0 systemd[214934]: Stopped target Paths.
Jan 21 23:48:20 compute-0 systemd[214934]: Stopped target Sockets.
Jan 21 23:48:20 compute-0 systemd[214934]: Stopped target Timers.
Jan 21 23:48:20 compute-0 systemd[214934]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:48:20 compute-0 systemd[214934]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:48:20 compute-0 systemd[214934]: Closed D-Bus User Message Bus Socket.
Jan 21 23:48:20 compute-0 systemd[214934]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:48:20 compute-0 systemd[214934]: Removed slice User Application Slice.
Jan 21 23:48:20 compute-0 systemd[214934]: Reached target Shutdown.
Jan 21 23:48:20 compute-0 systemd[214934]: Finished Exit the Session.
Jan 21 23:48:20 compute-0 systemd[214934]: Reached target Exit the Session.
Jan 21 23:48:20 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 23:48:20 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 23:48:20 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 23:48:20 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 23:48:20 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 23:48:20 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 23:48:20 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 23:48:20 compute-0 nova_compute[182935]: 2026-01-21 23:48:20.983 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.046 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.048 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.112 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.120 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.181 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.182 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.274 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.445 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.446 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5315MB free_disk=73.28876876831055GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.446 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.447 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.794 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance a6a89006-02c9-49b1-8bfb-8640ba1b495f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.794 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 2977f489-9f9d-43f7-a617-7556b7df5171 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.795 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.795 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.876 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.902 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.941 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:48:21 compute-0 nova_compute[182935]: 2026-01-21 23:48:21.942 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.630 182939 DEBUG oslo_concurrency.lockutils [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.630 182939 DEBUG oslo_concurrency.lockutils [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.631 182939 DEBUG oslo_concurrency.lockutils [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.631 182939 DEBUG oslo_concurrency.lockutils [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.632 182939 DEBUG oslo_concurrency.lockutils [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.647 182939 INFO nova.compute.manager [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Terminating instance
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.661 182939 DEBUG nova.compute.manager [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:48:22 compute-0 kernel: tap25b6ea25-2c (unregistering): left promiscuous mode
Jan 21 23:48:22 compute-0 NetworkManager[55139]: <info>  [1769039302.6849] device (tap25b6ea25-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.694 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:22 compute-0 ovn_controller[95047]: 2026-01-21T23:48:22Z|00113|binding|INFO|Releasing lport 25b6ea25-2c24-4a07-9772-28913505aec2 from this chassis (sb_readonly=0)
Jan 21 23:48:22 compute-0 ovn_controller[95047]: 2026-01-21T23:48:22Z|00114|binding|INFO|Setting lport 25b6ea25-2c24-4a07-9772-28913505aec2 down in Southbound
Jan 21 23:48:22 compute-0 ovn_controller[95047]: 2026-01-21T23:48:22Z|00115|binding|INFO|Removing iface tap25b6ea25-2c ovn-installed in OVS
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.698 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:22 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:22.710 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:1c:84 10.100.0.8'], port_security=['fa:16:3e:0d:1c:84 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a6a89006-02c9-49b1-8bfb-8640ba1b495f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1530a22a-f758-407d-b1aa-fd922904fe07', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d40fc03fb534b5689415f3d8a3de1fc', 'neutron:revision_number': '8', 'neutron:security_group_ids': '53586897-09f0-4175-b34b-334e99525efe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b41d97fa-58c9-4587-b2cb-1c83c11100c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=25b6ea25-2c24-4a07-9772-28913505aec2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:22 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:22.712 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 25b6ea25-2c24-4a07-9772-28913505aec2 in datapath 1530a22a-f758-407d-b1aa-fd922904fe07 unbound from our chassis
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.713 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:22 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:22.714 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1530a22a-f758-407d-b1aa-fd922904fe07, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:48:22 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:22.715 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[135b0f7a-4482-43b7-97bb-f2e33ca7ea89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:22 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:22.716 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 namespace which is not needed anymore
Jan 21 23:48:22 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 21 23:48:22 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000012.scope: Consumed 14.031s CPU time.
Jan 21 23:48:22 compute-0 systemd-machined[154182]: Machine qemu-13-instance-00000012 terminated.
Jan 21 23:48:22 compute-0 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213595]: [NOTICE]   (213601) : haproxy version is 2.8.14-c23fe91
Jan 21 23:48:22 compute-0 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213595]: [NOTICE]   (213601) : path to executable is /usr/sbin/haproxy
Jan 21 23:48:22 compute-0 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213595]: [WARNING]  (213601) : Exiting Master process...
Jan 21 23:48:22 compute-0 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213595]: [WARNING]  (213601) : Exiting Master process...
Jan 21 23:48:22 compute-0 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213595]: [ALERT]    (213601) : Current worker (213603) exited with code 143 (Terminated)
Jan 21 23:48:22 compute-0 neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07[213595]: [WARNING]  (213601) : All workers exited. Exiting... (0)
Jan 21 23:48:22 compute-0 systemd[1]: libpod-473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c.scope: Deactivated successfully.
Jan 21 23:48:22 compute-0 podman[215150]: 2026-01-21 23:48:22.885623895 +0000 UTC m=+0.061254944 container died 473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.885 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.891 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c-userdata-shm.mount: Deactivated successfully.
Jan 21 23:48:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-59b5eab0a52fc8adb82ffc73f15219c4f5390f51b2d5b372bce629a0d4577c4c-merged.mount: Deactivated successfully.
Jan 21 23:48:22 compute-0 podman[215150]: 2026-01-21 23:48:22.9240037 +0000 UTC m=+0.099634729 container cleanup 473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.937 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.938 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.938 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.938 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.944 182939 INFO nova.virt.libvirt.driver [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Instance destroyed successfully.
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.944 182939 DEBUG nova.objects.instance [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lazy-loading 'resources' on Instance uuid a6a89006-02c9-49b1-8bfb-8640ba1b495f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:22 compute-0 systemd[1]: libpod-conmon-473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c.scope: Deactivated successfully.
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.956 182939 DEBUG nova.virt.libvirt.vif [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:46:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-168641085',display_name='tempest-ServersAdminTestJSON-server-168641085',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-168641085',id=18,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:48:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4d40fc03fb534b5689415f3d8a3de1fc',ramdisk_id='',reservation_id='r-evbqme7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1815099341',owner_user_name='tempest-ServersAdminTestJSON-1815099341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:48:08Z,user_data=None,user_id='4a6034ff39094b6486bac680b7ed5a57',uuid=a6a89006-02c9-49b1-8bfb-8640ba1b495f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.957 182939 DEBUG nova.network.os_vif_util [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converting VIF {"id": "25b6ea25-2c24-4a07-9772-28913505aec2", "address": "fa:16:3e:0d:1c:84", "network": {"id": "1530a22a-f758-407d-b1aa-fd922904fe07", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1431691179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4d40fc03fb534b5689415f3d8a3de1fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25b6ea25-2c", "ovs_interfaceid": "25b6ea25-2c24-4a07-9772-28913505aec2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.957 182939 DEBUG nova.network.os_vif_util [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.958 182939 DEBUG os_vif [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.959 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.959 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25b6ea25-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.961 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.964 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.966 182939 INFO os_vif [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:1c:84,bridge_name='br-int',has_traffic_filtering=True,id=25b6ea25-2c24-4a07-9772-28913505aec2,network=Network(1530a22a-f758-407d-b1aa-fd922904fe07),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25b6ea25-2c')
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.966 182939 INFO nova.virt.libvirt.driver [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Deleting instance files /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f_del
Jan 21 23:48:22 compute-0 nova_compute[182935]: 2026-01-21 23:48:22.967 182939 INFO nova.virt.libvirt.driver [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Deletion of /var/lib/nova/instances/a6a89006-02c9-49b1-8bfb-8640ba1b495f_del complete
Jan 21 23:48:23 compute-0 podman[215196]: 2026-01-21 23:48:23.007905488 +0000 UTC m=+0.043268911 container remove 473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.014 182939 DEBUG nova.compute.manager [req-0bdbd448-06c5-4b4d-9282-04311332da77 req-9b6076b1-b71c-43ad-bab2-2a0da367a394 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-unplugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:23.013 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9bd65368-0685-4bd5-be35-76ea0ba7ddd6]: (4, ('Wed Jan 21 11:48:22 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 (473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c)\n473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c\nWed Jan 21 11:48:22 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 (473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c)\n473fad13f1b4d01c9f6ad61d84d290a91b85ef5e8ff27dc99c4e91a1e11b486c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.014 182939 DEBUG oslo_concurrency.lockutils [req-0bdbd448-06c5-4b4d-9282-04311332da77 req-9b6076b1-b71c-43ad-bab2-2a0da367a394 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.015 182939 DEBUG oslo_concurrency.lockutils [req-0bdbd448-06c5-4b4d-9282-04311332da77 req-9b6076b1-b71c-43ad-bab2-2a0da367a394 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.015 182939 DEBUG oslo_concurrency.lockutils [req-0bdbd448-06c5-4b4d-9282-04311332da77 req-9b6076b1-b71c-43ad-bab2-2a0da367a394 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.015 182939 DEBUG nova.compute.manager [req-0bdbd448-06c5-4b4d-9282-04311332da77 req-9b6076b1-b71c-43ad-bab2-2a0da367a394 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] No waiting events found dispatching network-vif-unplugged-25b6ea25-2c24-4a07-9772-28913505aec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.016 182939 DEBUG nova.compute.manager [req-0bdbd448-06c5-4b4d-9282-04311332da77 req-9b6076b1-b71c-43ad-bab2-2a0da367a394 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-unplugged-25b6ea25-2c24-4a07-9772-28913505aec2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:48:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:23.016 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6232ab32-97c7-4f7c-b8d6-e9904c91b31d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:23.017 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1530a22a-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:23 compute-0 kernel: tap1530a22a-f0: left promiscuous mode
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.019 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.031 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:23.035 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4ce5fa3f-3ee7-4cf7-b654-b324cf411e8c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:23.053 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[34473eda-92a5-4e38-8005-83fc4b630276]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:23.056 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[64a51566-07e9-42af-bf62-40d32d198881]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.061 182939 INFO nova.compute.manager [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.062 182939 DEBUG oslo.service.loopingcall [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.062 182939 DEBUG nova.compute.manager [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:48:23 compute-0 nova_compute[182935]: 2026-01-21 23:48:23.062 182939 DEBUG nova.network.neutron [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:48:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:23.075 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e6de4d7b-64c3-4b4a-b469-6a7b6c636a38]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371475, 'reachable_time': 22840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215212, 'error': None, 'target': 'ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:23.079 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1530a22a-f758-407d-b1aa-fd922904fe07 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:48:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:23.080 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef58101-9255-4910-8e3e-ddc16e14cd46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:23 compute-0 systemd[1]: run-netns-ovnmeta\x2d1530a22a\x2df758\x2d407d\x2db1aa\x2dfd922904fe07.mount: Deactivated successfully.
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.278 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'name': 'tempest-MigrationsAdminTest-server-529809703', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000017', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '95574103d0094883861c58d01690e5a3', 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'hostId': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.280 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.313 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.314 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5658a4c4-d823-47d6-9c07-34127c6dddbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-vda', 'timestamp': '2026-01-21T23:48:23.280429', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac2ead4c-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': '96d406a9b8ab5db46a45d42ff1ccb3e1740e0cb1865bea23f711ad705cf3f786'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': 
'2977f489-9f9d-43f7-a617-7556b7df5171-sda', 'timestamp': '2026-01-21T23:48:23.280429', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac2ebe86-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': '7975a86d4193da4dba6be51c3872bc3ae437e50e7920c953bc9b4997ec6bb538'}]}, 'timestamp': '2026-01-21 23:48:23.314778', '_unique_id': '722b27cd816d46ec8e92087670855964'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.318 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.318 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.318 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-529809703>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-529809703>]
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.319 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.337 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.337 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 2977f489-9f9d-43f7-a617-7556b7df5171: ceilometer.compute.pollsters.NoVolumeException
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.337 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.348 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.349 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc5c9d90-b2ed-4f33-aa0d-aac7a6934cdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-vda', 'timestamp': '2026-01-21T23:48:23.337731', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac340026-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.131988375, 'message_signature': '2a689e8754b4c54d880e9398d95d832b28cea37d335c0f59584d1ea2ac91d155'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': 
'2977f489-9f9d-43f7-a617-7556b7df5171-sda', 'timestamp': '2026-01-21T23:48:23.337731', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac340bca-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.131988375, 'message_signature': '2b630da7b5aac5b4587991e5af3d97aacdd34eaa441aa6966ea8a7386c7d40a1'}]}, 'timestamp': '2026-01-21 23:48:23.349460', '_unique_id': '1fef44d24b98488ca7298f4f9bb06c79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.350 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.353 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.353 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/cpu volume: 9000000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39cbfa77-fdf5-4a67-9606-4834cc62f41f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9000000000, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'timestamp': '2026-01-21T23:48:23.353697', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ac34bd4a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.131297959, 'message_signature': '14df3fc5984a6f106467e098ebc7f4c2a11371ee1693b33f46c46df403e084cc'}]}, 'timestamp': '2026-01-21 23:48:23.354048', '_unique_id': '190ee578679242088b52f5aa41c79f6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.354 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.355 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.355 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.355 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-529809703>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-529809703>]
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.355 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.355 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.356 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.356 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.356 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.read.latency volume: 105841239 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.356 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.read.latency volume: 309807 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7aa28b9c-3f56-4318-b333-59f46fa3db4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 105841239, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-vda', 'timestamp': '2026-01-21T23:48:23.356174', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac351b78-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': 'c32fd3cf90ddc539f7290790abb90eaed070d09a584e5bb1dd5b2d70ebe1ee4b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 309807, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-sda', 'timestamp': '2026-01-21T23:48:23.356174', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac3523de-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': 'a89c4e0512a9db78cc059b96ddc8d8ff182ce2b227d17d5cc519dae0aab19324'}]}, 'timestamp': '2026-01-21 23:48:23.356641', '_unique_id': '83561967aa524548a3991fa1372835d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.357 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4a63d55-45c8-4210-a294-463265d3978e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-vda', 'timestamp': '2026-01-21T23:48:23.357914', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac355eee-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': '472d50850ed6f4ba4678a2204d30c8ee42c462e3a3caae304ae154bf7e29c4f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-sda', 'timestamp': '2026-01-21T23:48:23.357914', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac356736-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': 'bee1135f2cf2761188d8b38999cb994d2ec42d669021be64ab9c5e5a37dd5878'}]}, 'timestamp': '2026-01-21 23:48:23.358348', '_unique_id': 'e2d0f95c10bf4cceb4f5a079bc179821'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.358 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.359 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.359 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.359 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.359 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e8f28ee-3233-4cf5-8f08-64ea1432c6e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-vda', 'timestamp': '2026-01-21T23:48:23.359682', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac35a64c-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': '53738b04f259ab3100e5b44252bfb8d7995ae3176b4aff556227b554b9ab43d9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-sda', 'timestamp': '2026-01-21T23:48:23.359682', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac35af2a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': '503e8de1c8e0be42a8cbd0e8dd11d5815819c7dd6d545299724cb617179b1edb'}]}, 'timestamp': '2026-01-21 23:48:23.360194', '_unique_id': '2e554743bd3d40839be07e726dbc0c78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.360 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.361 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.361 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.361 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.361 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-529809703>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-529809703>]
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.361 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e45585df-2c73-4c1b-806f-d4229884c815', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-vda', 'timestamp': '2026-01-21T23:48:23.362063', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac360132-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': '3bd163d53a7237968f9def40e537b8f8a36e0e63eda440376b3ca9c62138f736'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 
'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-sda', 'timestamp': '2026-01-21T23:48:23.362063', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac36097a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': '311978a32268ecae1e7a9948a019d046b9736c2825818eb754f4a0ba9712ea77'}]}, 'timestamp': '2026-01-21 23:48:23.362489', '_unique_id': 'c83560f7b6174a1eb9c9394f59601fda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.363 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.363 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.363 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38c665af-0f44-4d80-9f2b-8116805f2258', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-vda', 'timestamp': '2026-01-21T23:48:23.363617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac363de6-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.131988375, 'message_signature': 'f0dadca554439ac47f3e5baa51d0a8419fa454ea1e439be0ece035ef413e1290'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': 
'2977f489-9f9d-43f7-a617-7556b7df5171-sda', 'timestamp': '2026-01-21T23:48:23.363617', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac36475a-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.131988375, 'message_signature': 'e004c3f0183a480bdded03923aeb399b7d99c313c0b2f9321dcf66c88eda0965'}]}, 'timestamp': '2026-01-21 23:48:23.364074', '_unique_id': '8d71dbefdd984f709c01561ad2862311'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.364 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.365 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.365 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.365 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.allocation volume: 29954048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.365 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6ea3480-4b5d-4759-a747-b556f70b2bd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29954048, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-vda', 'timestamp': '2026-01-21T23:48:23.365353', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac368198-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.131988375, 'message_signature': '57114072a84dd9fcc2287bd6bad9fc7cd9b0c3cd681cdb88e6a049072bc4b3eb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': 
'2977f489-9f9d-43f7-a617-7556b7df5171-sda', 'timestamp': '2026-01-21T23:48:23.365353', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac368936-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.131988375, 'message_signature': '6b4149f6bb9608343e3b7caad8f5c02f9c57403eabf38f64a24ec1842c5987ec'}]}, 'timestamp': '2026-01-21 23:48:23.365755', '_unique_id': 'ebf5a3e7c41642138d5ffe0ad471d496'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.366 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.367 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.367 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-529809703>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-529809703>]
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.367 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.367 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.367 12 DEBUG ceilometer.compute.pollsters [-] 2977f489-9f9d-43f7-a617-7556b7df5171/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2281f77a-0797-4d56-9836-a5dbe9038a0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-vda', 'timestamp': '2026-01-21T23:48:23.367412', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ac36d3be-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': 'd4ca96decac09fcdf0f393cdcd6f7b528e83bb946fa778f49237d9f559310f5e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'user_name': None, 'project_id': '95574103d0094883861c58d01690e5a3', 'project_name': None, 
'resource_id': '2977f489-9f9d-43f7-a617-7556b7df5171-sda', 'timestamp': '2026-01-21T23:48:23.367412', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-529809703', 'name': 'instance-00000017', 'instance_id': '2977f489-9f9d-43f7-a617-7556b7df5171', 'instance_type': 'm1.nano', 'host': '9646c04fc51043d6281e367b54d4025cd0ed502b152ecf4648ac8012', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ac36dbf2-f723-11f0-9743-fa163e6b0dfb', 'monotonic_time': 3806.074737215, 'message_signature': 'a49db17a6926003b94779cc88f8b301d5102dab3df9f4245f9ef41a0cc3a6c89'}]}, 'timestamp': '2026-01-21 23:48:23.367992', '_unique_id': 'f3d4854f7d9146e68c28a4a1ef0dec73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:48:23.368 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:48:24 compute-0 nova_compute[182935]: 2026-01-21 23:48:24.016 182939 DEBUG nova.network.neutron [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:24 compute-0 nova_compute[182935]: 2026-01-21 23:48:24.045 182939 INFO nova.compute.manager [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Took 0.98 seconds to deallocate network for instance.
Jan 21 23:48:24 compute-0 nova_compute[182935]: 2026-01-21 23:48:24.148 182939 DEBUG oslo_concurrency.lockutils [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:24 compute-0 nova_compute[182935]: 2026-01-21 23:48:24.148 182939 DEBUG oslo_concurrency.lockutils [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:24 compute-0 nova_compute[182935]: 2026-01-21 23:48:24.240 182939 DEBUG nova.compute.provider_tree [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:24 compute-0 nova_compute[182935]: 2026-01-21 23:48:24.254 182939 DEBUG nova.scheduler.client.report [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:24 compute-0 nova_compute[182935]: 2026-01-21 23:48:24.285 182939 DEBUG oslo_concurrency.lockutils [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:24 compute-0 nova_compute[182935]: 2026-01-21 23:48:24.328 182939 INFO nova.scheduler.client.report [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Deleted allocations for instance a6a89006-02c9-49b1-8bfb-8640ba1b495f
Jan 21 23:48:24 compute-0 nova_compute[182935]: 2026-01-21 23:48:24.429 182939 DEBUG oslo_concurrency.lockutils [None req-b4dd1f90-3aac-4028-b54b-77f9e8e7265d 4a6034ff39094b6486bac680b7ed5a57 4d40fc03fb534b5689415f3d8a3de1fc - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.799s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:24 compute-0 nova_compute[182935]: 2026-01-21 23:48:24.785 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:24 compute-0 nova_compute[182935]: 2026-01-21 23:48:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:48:25 compute-0 nova_compute[182935]: 2026-01-21 23:48:25.125 182939 DEBUG nova.compute.manager [req-f2b6ad2f-f58b-4bcd-b9c2-4b498a072ba1 req-d67f8089-4a2c-41fb-89dd-d0f4c6c05832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:25 compute-0 nova_compute[182935]: 2026-01-21 23:48:25.126 182939 DEBUG oslo_concurrency.lockutils [req-f2b6ad2f-f58b-4bcd-b9c2-4b498a072ba1 req-d67f8089-4a2c-41fb-89dd-d0f4c6c05832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:25 compute-0 nova_compute[182935]: 2026-01-21 23:48:25.126 182939 DEBUG oslo_concurrency.lockutils [req-f2b6ad2f-f58b-4bcd-b9c2-4b498a072ba1 req-d67f8089-4a2c-41fb-89dd-d0f4c6c05832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:25 compute-0 nova_compute[182935]: 2026-01-21 23:48:25.127 182939 DEBUG oslo_concurrency.lockutils [req-f2b6ad2f-f58b-4bcd-b9c2-4b498a072ba1 req-d67f8089-4a2c-41fb-89dd-d0f4c6c05832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a6a89006-02c9-49b1-8bfb-8640ba1b495f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:25 compute-0 nova_compute[182935]: 2026-01-21 23:48:25.127 182939 DEBUG nova.compute.manager [req-f2b6ad2f-f58b-4bcd-b9c2-4b498a072ba1 req-d67f8089-4a2c-41fb-89dd-d0f4c6c05832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] No waiting events found dispatching network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:25 compute-0 nova_compute[182935]: 2026-01-21 23:48:25.127 182939 WARNING nova.compute.manager [req-f2b6ad2f-f58b-4bcd-b9c2-4b498a072ba1 req-d67f8089-4a2c-41fb-89dd-d0f4c6c05832 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received unexpected event network-vif-plugged-25b6ea25-2c24-4a07-9772-28913505aec2 for instance with vm_state deleted and task_state None.
Jan 21 23:48:25 compute-0 nova_compute[182935]: 2026-01-21 23:48:25.603 182939 DEBUG nova.compute.manager [req-dc9e9975-f251-4b4a-b988-7d9defce4a46 req-a6c84549-15ee-4def-aff4-d6cb70ae7b1f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Received event network-vif-deleted-25b6ea25-2c24-4a07-9772-28913505aec2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:25 compute-0 nova_compute[182935]: 2026-01-21 23:48:25.873 182939 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Creating tmpfile /var/lib/nova/instances/tmpu3c6dsuj to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Jan 21 23:48:25 compute-0 nova_compute[182935]: 2026-01-21 23:48:25.874 182939 DEBUG nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu3c6dsuj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Jan 21 23:48:26 compute-0 nova_compute[182935]: 2026-01-21 23:48:26.710 182939 DEBUG nova.compute.manager [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 21 23:48:26 compute-0 podman[215228]: 2026-01-21 23:48:26.712424856 +0000 UTC m=+0.077672532 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:48:26 compute-0 nova_compute[182935]: 2026-01-21 23:48:26.945 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:26 compute-0 nova_compute[182935]: 2026-01-21 23:48:26.946 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:26 compute-0 nova_compute[182935]: 2026-01-21 23:48:26.973 182939 DEBUG nova.objects.instance [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'pci_requests' on Instance uuid 63b2e61e-8ad4-44e9-ba44-db37454a4b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:26 compute-0 nova_compute[182935]: 2026-01-21 23:48:26.987 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:48:26 compute-0 nova_compute[182935]: 2026-01-21 23:48:26.987 182939 INFO nova.compute.claims [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:48:26 compute-0 nova_compute[182935]: 2026-01-21 23:48:26.988 182939 DEBUG nova.objects.instance [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'resources' on Instance uuid 63b2e61e-8ad4-44e9-ba44-db37454a4b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.000 182939 DEBUG nova.objects.instance [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 63b2e61e-8ad4-44e9-ba44-db37454a4b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.045 182939 INFO nova.compute.resource_tracker [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Updating resource usage from migration 42bc1ed7-e22c-426c-943d-5b751761144a
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.045 182939 DEBUG nova.compute.resource_tracker [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Starting to track incoming migration 42bc1ed7-e22c-426c-943d-5b751761144a with flavor ff01ccba-ad51-439f-9037-926190d6dc0f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.209 182939 DEBUG nova.compute.provider_tree [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.229 182939 DEBUG nova.scheduler.client.report [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.260 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.261 182939 INFO nova.compute.manager [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Migrating
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.483 182939 DEBUG nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu3c6dsuj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2038fd11-9c07-48d0-8092-d973d69d8eb9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.515 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.515 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquired lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.515 182939 DEBUG nova.network.neutron [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:48:27 compute-0 nova_compute[182935]: 2026-01-21 23:48:27.963 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:28 compute-0 sshd-session[215251]: Accepted publickey for nova from 192.168.122.102 port 52124 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:48:28 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 23:48:28 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 23:48:28 compute-0 systemd-logind[784]: New session 36 of user nova.
Jan 21 23:48:28 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 23:48:28 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 23:48:28 compute-0 systemd[215255]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:28 compute-0 systemd[215255]: Queued start job for default target Main User Target.
Jan 21 23:48:28 compute-0 systemd[215255]: Created slice User Application Slice.
Jan 21 23:48:28 compute-0 systemd[215255]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:48:28 compute-0 systemd[215255]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:48:28 compute-0 systemd[215255]: Reached target Paths.
Jan 21 23:48:28 compute-0 systemd[215255]: Reached target Timers.
Jan 21 23:48:28 compute-0 systemd[215255]: Starting D-Bus User Message Bus Socket...
Jan 21 23:48:28 compute-0 systemd[215255]: Starting Create User's Volatile Files and Directories...
Jan 21 23:48:28 compute-0 systemd[215255]: Finished Create User's Volatile Files and Directories.
Jan 21 23:48:28 compute-0 systemd[215255]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:48:28 compute-0 systemd[215255]: Reached target Sockets.
Jan 21 23:48:28 compute-0 systemd[215255]: Reached target Basic System.
Jan 21 23:48:28 compute-0 systemd[215255]: Reached target Main User Target.
Jan 21 23:48:28 compute-0 systemd[215255]: Startup finished in 138ms.
Jan 21 23:48:28 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 23:48:28 compute-0 systemd[1]: Started Session 36 of User nova.
Jan 21 23:48:28 compute-0 sshd-session[215251]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:28 compute-0 sshd-session[215271]: Received disconnect from 192.168.122.102 port 52124:11: disconnected by user
Jan 21 23:48:28 compute-0 sshd-session[215271]: Disconnected from user nova 192.168.122.102 port 52124
Jan 21 23:48:28 compute-0 sshd-session[215251]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:48:28 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Jan 21 23:48:28 compute-0 systemd-logind[784]: Session 36 logged out. Waiting for processes to exit.
Jan 21 23:48:28 compute-0 systemd-logind[784]: Removed session 36.
Jan 21 23:48:28 compute-0 sshd-session[215273]: Accepted publickey for nova from 192.168.122.102 port 52126 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:48:29 compute-0 systemd-logind[784]: New session 38 of user nova.
Jan 21 23:48:29 compute-0 systemd[1]: Started Session 38 of User nova.
Jan 21 23:48:29 compute-0 sshd-session[215273]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:29 compute-0 sshd-session[215276]: Received disconnect from 192.168.122.102 port 52126:11: disconnected by user
Jan 21 23:48:29 compute-0 sshd-session[215276]: Disconnected from user nova 192.168.122.102 port 52126
Jan 21 23:48:29 compute-0 sshd-session[215273]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:48:29 compute-0 systemd[1]: session-38.scope: Deactivated successfully.
Jan 21 23:48:29 compute-0 systemd-logind[784]: Session 38 logged out. Waiting for processes to exit.
Jan 21 23:48:29 compute-0 systemd-logind[784]: Removed session 38.
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.118 182939 DEBUG nova.network.neutron [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Updating instance_info_cache with network_info: [{"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.146 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Releasing lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.166 182939 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu3c6dsuj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2038fd11-9c07-48d0-8092-d973d69d8eb9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.167 182939 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Creating instance directory: /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.168 182939 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Creating disk.info with the contents: {'/var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk': 'qcow2', '/var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.168 182939 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.169 182939 DEBUG nova.objects.instance [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2038fd11-9c07-48d0-8092-d973d69d8eb9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.198 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.288 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.290 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.291 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.304 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.401 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.402 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.441 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.442 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.443 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.506 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.507 182939 DEBUG nova.virt.disk.api [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Checking if we can resize image /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.508 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.569 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.570 182939 DEBUG nova.virt.disk.api [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Cannot resize image /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.571 182939 DEBUG nova.objects.instance [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lazy-loading 'migration_context' on Instance uuid 2038fd11-9c07-48d0-8092-d973d69d8eb9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.586 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.612 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.config 485376" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.614 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.config to /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.615 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.config /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.787 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:29 compute-0 nova_compute[182935]: 2026-01-21 23:48:29.955 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.250 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039295.2501163, efc683b9-a8d9-4a67-bb19-aeaabfbd5423 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.251 182939 INFO nova.compute.manager [-] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] VM Stopped (Lifecycle Event)
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.257 182939 DEBUG oslo_concurrency.processutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9/disk.config /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9" returned: 0 in 0.643s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.258 182939 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.259 182939 DEBUG nova.virt.libvirt.vif [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:48:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1127248027',display_name='tempest-LiveMigrationTest-server-1127248027',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1127248027',id=25,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:48:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-6nt020yh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:48:21Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=2038fd11-9c07-48d0-8092-d973d69d8eb9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.259 182939 DEBUG nova.network.os_vif_util [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converting VIF {"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.260 182939 DEBUG nova.network.os_vif_util [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.260 182939 DEBUG os_vif [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.260 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.261 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.261 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.266 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.266 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbbc46799-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.266 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbbc46799-07, col_values=(('external_ids', {'iface-id': 'bbc46799-0727-41d9-9ae1-017037df9492', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:14:7f:86', 'vm-uuid': '2038fd11-9c07-48d0-8092-d973d69d8eb9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.268 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.269 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:48:30 compute-0 NetworkManager[55139]: <info>  [1769039310.2689] manager: (tapbbc46799-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.277 182939 DEBUG nova.compute.manager [None req-ce5ce406-6e36-4df9-95fe-d01c3c262b53 - - - - - -] [instance: efc683b9-a8d9-4a67-bb19-aeaabfbd5423] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.277 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.279 182939 INFO os_vif [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07')
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.280 182939 DEBUG nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Jan 21 23:48:30 compute-0 nova_compute[182935]: 2026-01-21 23:48:30.281 182939 DEBUG nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu3c6dsuj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2038fd11-9c07-48d0-8092-d973d69d8eb9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Jan 21 23:48:30 compute-0 podman[215300]: 2026-01-21 23:48:30.72525038 +0000 UTC m=+0.094939088 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 23:48:32 compute-0 nova_compute[182935]: 2026-01-21 23:48:32.387 182939 DEBUG nova.network.neutron [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Port bbc46799-0727-41d9-9ae1-017037df9492 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Jan 21 23:48:32 compute-0 nova_compute[182935]: 2026-01-21 23:48:32.400 182939 DEBUG nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpu3c6dsuj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2038fd11-9c07-48d0-8092-d973d69d8eb9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Jan 21 23:48:32 compute-0 kernel: tapbbc46799-07: entered promiscuous mode
Jan 21 23:48:32 compute-0 NetworkManager[55139]: <info>  [1769039312.7306] manager: (tapbbc46799-07): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Jan 21 23:48:32 compute-0 ovn_controller[95047]: 2026-01-21T23:48:32Z|00116|binding|INFO|Claiming lport bbc46799-0727-41d9-9ae1-017037df9492 for this additional chassis.
Jan 21 23:48:32 compute-0 nova_compute[182935]: 2026-01-21 23:48:32.732 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:32 compute-0 ovn_controller[95047]: 2026-01-21T23:48:32Z|00117|binding|INFO|bbc46799-0727-41d9-9ae1-017037df9492: Claiming fa:16:3e:14:7f:86 10.100.0.13
Jan 21 23:48:32 compute-0 ovn_controller[95047]: 2026-01-21T23:48:32Z|00118|binding|INFO|Claiming lport 56571b22-2d90-46ed-b4c3-681729d375d9 for this additional chassis.
Jan 21 23:48:32 compute-0 ovn_controller[95047]: 2026-01-21T23:48:32Z|00119|binding|INFO|56571b22-2d90-46ed-b4c3-681729d375d9: Claiming fa:16:3e:cf:be:25 19.80.0.150
Jan 21 23:48:32 compute-0 systemd-udevd[215334]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:48:32 compute-0 NetworkManager[55139]: <info>  [1769039312.7943] device (tapbbc46799-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:48:32 compute-0 NetworkManager[55139]: <info>  [1769039312.7948] device (tapbbc46799-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:48:32 compute-0 systemd-machined[154182]: New machine qemu-15-instance-00000019.
Jan 21 23:48:32 compute-0 nova_compute[182935]: 2026-01-21 23:48:32.819 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:32 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000019.
Jan 21 23:48:32 compute-0 ovn_controller[95047]: 2026-01-21T23:48:32Z|00120|binding|INFO|Setting lport bbc46799-0727-41d9-9ae1-017037df9492 ovn-installed in OVS
Jan 21 23:48:32 compute-0 nova_compute[182935]: 2026-01-21 23:48:32.829 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:33 compute-0 nova_compute[182935]: 2026-01-21 23:48:33.468 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039313.468191, 2038fd11-9c07-48d0-8092-d973d69d8eb9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:33 compute-0 nova_compute[182935]: 2026-01-21 23:48:33.469 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] VM Started (Lifecycle Event)
Jan 21 23:48:33 compute-0 nova_compute[182935]: 2026-01-21 23:48:33.508 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:34 compute-0 nova_compute[182935]: 2026-01-21 23:48:34.451 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039314.450922, 2038fd11-9c07-48d0-8092-d973d69d8eb9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:34 compute-0 nova_compute[182935]: 2026-01-21 23:48:34.453 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] VM Resumed (Lifecycle Event)
Jan 21 23:48:34 compute-0 nova_compute[182935]: 2026-01-21 23:48:34.478 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:34 compute-0 nova_compute[182935]: 2026-01-21 23:48:34.483 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:34 compute-0 nova_compute[182935]: 2026-01-21 23:48:34.510 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 21 23:48:34 compute-0 nova_compute[182935]: 2026-01-21 23:48:34.788 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:35 compute-0 nova_compute[182935]: 2026-01-21 23:48:35.268 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:36 compute-0 ovn_controller[95047]: 2026-01-21T23:48:36Z|00121|binding|INFO|Claiming lport bbc46799-0727-41d9-9ae1-017037df9492 for this chassis.
Jan 21 23:48:36 compute-0 ovn_controller[95047]: 2026-01-21T23:48:36Z|00122|binding|INFO|bbc46799-0727-41d9-9ae1-017037df9492: Claiming fa:16:3e:14:7f:86 10.100.0.13
Jan 21 23:48:36 compute-0 ovn_controller[95047]: 2026-01-21T23:48:36Z|00123|binding|INFO|Claiming lport 56571b22-2d90-46ed-b4c3-681729d375d9 for this chassis.
Jan 21 23:48:36 compute-0 ovn_controller[95047]: 2026-01-21T23:48:36Z|00124|binding|INFO|56571b22-2d90-46ed-b4c3-681729d375d9: Claiming fa:16:3e:cf:be:25 19.80.0.150
Jan 21 23:48:36 compute-0 ovn_controller[95047]: 2026-01-21T23:48:36Z|00125|binding|INFO|Setting lport bbc46799-0727-41d9-9ae1-017037df9492 up in Southbound
Jan 21 23:48:36 compute-0 ovn_controller[95047]: 2026-01-21T23:48:36Z|00126|binding|INFO|Setting lport 56571b22-2d90-46ed-b4c3-681729d375d9 up in Southbound
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.357 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:7f:86 10.100.0.13'], port_security=['fa:16:3e:14:7f:86 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1972857521', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1972857521', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bbc46799-0727-41d9-9ae1-017037df9492) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.359 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:be:25 19.80.0.150'], port_security=['fa:16:3e:cf:be:25 19.80.0.150'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['bbc46799-0727-41d9-9ae1-017037df9492'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1175960055', 'neutron:cidrs': '19.80.0.150/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b0b760c-cbd0-4413-9603-713296c75717', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1175960055', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=05a945c7-2b1a-4093-88a2-493079ba8709, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=56571b22-2d90-46ed-b4c3-681729d375d9) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.360 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bbc46799-0727-41d9-9ae1-017037df9492 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee bound to our chassis
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.361 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.380 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2bfacd7c-055a-463e-a871-8c219b2eca9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.381 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2df233d-b1 in ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.385 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2df233d-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.385 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa3e23a-ca5d-4afa-acbc-964dd91e77b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.386 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e9400afa-32ba-4231-bb03-c477740c66a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.407 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[b07ec06c-bc29-4dbf-8b3d-0250fe172abc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.431 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[58644ffd-ef76-4c1b-82ef-a2ba44a640e2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 nova_compute[182935]: 2026-01-21 23:48:36.456 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.456 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.474 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[5bdf8331-6c6b-4065-b4b7-83574c83ec77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.485 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cd296e04-26d2-4f2f-8ff9-70cee11b62da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 NetworkManager[55139]: <info>  [1769039316.4860] manager: (tapb2df233d-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Jan 21 23:48:36 compute-0 systemd-udevd[215365]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.536 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[1cbf3b12-6b07-4296-90f1-e9b5ded42bb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.542 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[97fcf09e-6147-45ed-a44f-01f29ffa820d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 NetworkManager[55139]: <info>  [1769039316.5855] device (tapb2df233d-b0): carrier: link connected
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.591 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[69a5f98d-300f-4c34-bfe9-3998ed7c270f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.621 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e41f371e-85b5-43eb-8a0f-d06ee7372cf7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2df233d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:e6:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381932, 'reachable_time': 42994, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215384, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.642 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[09f11995-d906-4a7d-b067-4c0b675f66e4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:e636'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 381932, 'tstamp': 381932}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215385, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.670 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d8576431-dbd7-4c12-9353-0f6a4db76c50]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2df233d-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:e6:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381932, 'reachable_time': 42994, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215386, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.709 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3177c705-0cfa-40d6-967e-782f8dff052e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 nova_compute[182935]: 2026-01-21 23:48:36.711 182939 INFO nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Post operation of migration started
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.778 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d7921f3e-37ba-473e-af4c-bb44cfbc37fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.780 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2df233d-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.781 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.782 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2df233d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:36 compute-0 nova_compute[182935]: 2026-01-21 23:48:36.803 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:36 compute-0 NetworkManager[55139]: <info>  [1769039316.8044] manager: (tapb2df233d-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 21 23:48:36 compute-0 kernel: tapb2df233d-b0: entered promiscuous mode
Jan 21 23:48:36 compute-0 nova_compute[182935]: 2026-01-21 23:48:36.807 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.808 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2df233d-b0, col_values=(('external_ids', {'iface-id': '75454af0-da31-4238-b248-a6678c575f51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:36 compute-0 nova_compute[182935]: 2026-01-21 23:48:36.810 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:36 compute-0 ovn_controller[95047]: 2026-01-21T23:48:36Z|00127|binding|INFO|Releasing lport 75454af0-da31-4238-b248-a6678c575f51 from this chassis (sb_readonly=0)
Jan 21 23:48:36 compute-0 nova_compute[182935]: 2026-01-21 23:48:36.811 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.813 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.814 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[32356f5c-2757-49ae-98e1-1ece25757887]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.815 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/b2df233d-b255-4dda-925c-3ccab3a032ee.pid.haproxy
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID b2df233d-b255-4dda-925c-3ccab3a032ee
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:48:36 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:36.816 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'env', 'PROCESS_TAG=haproxy-b2df233d-b255-4dda-925c-3ccab3a032ee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2df233d-b255-4dda-925c-3ccab3a032ee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:48:36 compute-0 nova_compute[182935]: 2026-01-21 23:48:36.821 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:37 compute-0 podman[215419]: 2026-01-21 23:48:37.212024547 +0000 UTC m=+0.054736921 container create 599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:48:37 compute-0 systemd[1]: Started libpod-conmon-599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b.scope.
Jan 21 23:48:37 compute-0 podman[215419]: 2026-01-21 23:48:37.18543839 +0000 UTC m=+0.028150774 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:48:37 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:48:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90230ea1d3d90d5ca0c5ef95aba00a99e2e4c574b87ccebd9bc6b17c4a0ec577/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:48:37 compute-0 podman[215419]: 2026-01-21 23:48:37.307371795 +0000 UTC m=+0.150084169 container init 599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 23:48:37 compute-0 podman[215419]: 2026-01-21 23:48:37.312581517 +0000 UTC m=+0.155293871 container start 599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 23:48:37 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[215435]: [NOTICE]   (215439) : New worker (215441) forked
Jan 21 23:48:37 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[215435]: [NOTICE]   (215439) : Loading success.
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.377 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 56571b22-2d90-46ed-b4c3-681729d375d9 in datapath 1b0b760c-cbd0-4413-9603-713296c75717 unbound from our chassis
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.379 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1b0b760c-cbd0-4413-9603-713296c75717
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.392 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8be263d2-d1ce-4a57-b787-f89272d34652]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.393 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1b0b760c-c1 in ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.395 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1b0b760c-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.395 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[def4afa6-37fe-4612-a444-71c4525bb79d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.396 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d2926d3a-c899-4b4a-90b5-0eb8976a7357]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.408 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[7a458d1f-80fc-4183-869f-bfe08ae99fe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 nova_compute[182935]: 2026-01-21 23:48:37.433 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:48:37 compute-0 nova_compute[182935]: 2026-01-21 23:48:37.434 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquired lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:48:37 compute-0 nova_compute[182935]: 2026-01-21 23:48:37.434 182939 DEBUG nova.network.neutron [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.434 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[87e2a653-2811-4499-8641-310048f3991d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.461 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[cd67081a-6acb-4c70-8504-1064aa7eba25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.470 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0b9ad24e-572e-45d3-aa99-cc25c8066caf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 NetworkManager[55139]: <info>  [1769039317.4717] manager: (tap1b0b760c-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Jan 21 23:48:37 compute-0 systemd-udevd[215374]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.506 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[72b192af-c13b-4c89-a716-5f765630ce80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.511 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b4c07bda-de67-4f80-b6ec-940f8fc93ce1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 NetworkManager[55139]: <info>  [1769039317.5369] device (tap1b0b760c-c0): carrier: link connected
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.543 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[2b7d14a4-1dbd-4530-a8cb-9d9e53cb4e34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.564 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[63125bad-08bf-455c-abb3-3c052feb8153]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b0b760c-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:4e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382027, 'reachable_time': 23078, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215460, 'error': None, 'target': 'ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.582 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce8d959-64d2-498b-92e4-ba865a0396e6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6d:4e23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382027, 'tstamp': 382027}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215461, 'error': None, 'target': 'ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.608 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4b11b6c8-403d-40e4-8666-a705511c2717]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b0b760c-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:4e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382027, 'reachable_time': 23078, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215462, 'error': None, 'target': 'ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.646 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6974abbf-1bee-44ee-8438-9e9fef47dc81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.698 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[89fcfa74-8809-4a7b-8454-7735ff0dccb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.700 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b0b760c-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.701 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.702 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b0b760c-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:37 compute-0 kernel: tap1b0b760c-c0: entered promiscuous mode
Jan 21 23:48:37 compute-0 NetworkManager[55139]: <info>  [1769039317.7058] manager: (tap1b0b760c-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 21 23:48:37 compute-0 nova_compute[182935]: 2026-01-21 23:48:37.706 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.709 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1b0b760c-c0, col_values=(('external_ids', {'iface-id': '81fbdf60-46d3-442f-bc69-6a381397338b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:37 compute-0 nova_compute[182935]: 2026-01-21 23:48:37.710 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:37 compute-0 ovn_controller[95047]: 2026-01-21T23:48:37Z|00128|binding|INFO|Releasing lport 81fbdf60-46d3-442f-bc69-6a381397338b from this chassis (sb_readonly=0)
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.714 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1b0b760c-cbd0-4413-9603-713296c75717.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1b0b760c-cbd0-4413-9603-713296c75717.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.715 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4d1e16-2100-4f4e-9825-fabce770fab6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.715 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-1b0b760c-cbd0-4413-9603-713296c75717
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/1b0b760c-cbd0-4413-9603-713296c75717.pid.haproxy
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 1b0b760c-cbd0-4413-9603-713296c75717
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:48:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:37.716 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717', 'env', 'PROCESS_TAG=haproxy-1b0b760c-cbd0-4413-9603-713296c75717', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1b0b760c-cbd0-4413-9603-713296c75717.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:48:37 compute-0 nova_compute[182935]: 2026-01-21 23:48:37.721 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:37 compute-0 nova_compute[182935]: 2026-01-21 23:48:37.942 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039302.940715, a6a89006-02c9-49b1-8bfb-8640ba1b495f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:37 compute-0 nova_compute[182935]: 2026-01-21 23:48:37.943 182939 INFO nova.compute.manager [-] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] VM Stopped (Lifecycle Event)
Jan 21 23:48:37 compute-0 nova_compute[182935]: 2026-01-21 23:48:37.973 182939 DEBUG nova.compute.manager [None req-d7e962f2-421c-480f-ad10-bfa38e8ac401 - - - - - -] [instance: a6a89006-02c9-49b1-8bfb-8640ba1b495f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:38 compute-0 podman[215494]: 2026-01-21 23:48:38.106208164 +0000 UTC m=+0.056665826 container create 842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 23:48:38 compute-0 systemd[1]: Started libpod-conmon-842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c.scope.
Jan 21 23:48:38 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:48:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af7616e777f40cc26005669c220020e9b092c35c02c4d90dc1b2c1917081d4aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:48:38 compute-0 podman[215494]: 2026-01-21 23:48:38.079113076 +0000 UTC m=+0.029570758 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:48:38 compute-0 podman[215494]: 2026-01-21 23:48:38.171829131 +0000 UTC m=+0.122286813 container init 842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 23:48:38 compute-0 podman[215494]: 2026-01-21 23:48:38.177192627 +0000 UTC m=+0.127650289 container start 842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 23:48:38 compute-0 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[215509]: [NOTICE]   (215513) : New worker (215515) forked
Jan 21 23:48:38 compute-0 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[215509]: [NOTICE]   (215513) : Loading success.
Jan 21 23:48:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:38.251 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:48:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:38.255 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:39 compute-0 nova_compute[182935]: 2026-01-21 23:48:39.025 182939 DEBUG nova.network.neutron [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Updating instance_info_cache with network_info: [{"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:39 compute-0 nova_compute[182935]: 2026-01-21 23:48:39.046 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Releasing lock "refresh_cache-2038fd11-9c07-48d0-8092-d973d69d8eb9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:48:39 compute-0 nova_compute[182935]: 2026-01-21 23:48:39.077 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:39 compute-0 nova_compute[182935]: 2026-01-21 23:48:39.077 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:39 compute-0 nova_compute[182935]: 2026-01-21 23:48:39.078 182939 DEBUG oslo_concurrency.lockutils [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:39 compute-0 nova_compute[182935]: 2026-01-21 23:48:39.082 182939 INFO nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Jan 21 23:48:39 compute-0 virtqemud[182477]: Domain id=15 name='instance-00000019' uuid=2038fd11-9c07-48d0-8092-d973d69d8eb9 is tainted: custom-monitor
Jan 21 23:48:39 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 23:48:39 compute-0 systemd[215255]: Activating special unit Exit the Session...
Jan 21 23:48:39 compute-0 systemd[215255]: Stopped target Main User Target.
Jan 21 23:48:39 compute-0 systemd[215255]: Stopped target Basic System.
Jan 21 23:48:39 compute-0 systemd[215255]: Stopped target Paths.
Jan 21 23:48:39 compute-0 systemd[215255]: Stopped target Sockets.
Jan 21 23:48:39 compute-0 systemd[215255]: Stopped target Timers.
Jan 21 23:48:39 compute-0 systemd[215255]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:48:39 compute-0 systemd[215255]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:48:39 compute-0 systemd[215255]: Closed D-Bus User Message Bus Socket.
Jan 21 23:48:39 compute-0 systemd[215255]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:48:39 compute-0 systemd[215255]: Removed slice User Application Slice.
Jan 21 23:48:39 compute-0 systemd[215255]: Reached target Shutdown.
Jan 21 23:48:39 compute-0 systemd[215255]: Finished Exit the Session.
Jan 21 23:48:39 compute-0 systemd[215255]: Reached target Exit the Session.
Jan 21 23:48:39 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 23:48:39 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 23:48:39 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 23:48:39 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 23:48:39 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 23:48:39 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 23:48:39 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 23:48:39 compute-0 podman[215524]: 2026-01-21 23:48:39.205878423 +0000 UTC m=+0.075092120 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal)
Jan 21 23:48:39 compute-0 podman[215525]: 2026-01-21 23:48:39.220056978 +0000 UTC m=+0.078974063 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 23:48:39 compute-0 nova_compute[182935]: 2026-01-21 23:48:39.790 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:40 compute-0 nova_compute[182935]: 2026-01-21 23:48:40.093 182939 INFO nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Jan 21 23:48:40 compute-0 nova_compute[182935]: 2026-01-21 23:48:40.271 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:41 compute-0 nova_compute[182935]: 2026-01-21 23:48:41.102 182939 INFO nova.virt.libvirt.driver [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Jan 21 23:48:41 compute-0 nova_compute[182935]: 2026-01-21 23:48:41.108 182939 DEBUG nova.compute.manager [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:41 compute-0 nova_compute[182935]: 2026-01-21 23:48:41.134 182939 DEBUG nova.objects.instance [None req-33db5e23-5585-4e5a-977e-1f0387bd5035 1adbe9e68da64ab2b37ec5153a34449e f1b5630a8807401cb50490834d2d16d7 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.224 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "b448a112-7efc-4f54-b6db-0aabc1bf767d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.225 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.246 182939 DEBUG nova.compute.manager [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.361 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.362 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.371 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.371 182939 INFO nova.compute.claims [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.496 182939 DEBUG nova.scheduler.client.report [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.517 182939 DEBUG nova.scheduler.client.report [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.519 182939 DEBUG nova.compute.provider_tree [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.535 182939 DEBUG nova.scheduler.client.report [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.564 182939 DEBUG nova.scheduler.client.report [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:48:42 compute-0 sshd-session[215566]: Accepted publickey for nova from 192.168.122.102 port 58850 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:48:42 compute-0 systemd-logind[784]: New session 39 of user nova.
Jan 21 23:48:42 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 23:48:42 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 23:48:42 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 23:48:42 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 23:48:42 compute-0 systemd[215570]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.683 182939 DEBUG nova.compute.provider_tree [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.705 182939 DEBUG nova.scheduler.client.report [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.731 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.733 182939 DEBUG nova.compute.manager [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:48:42 compute-0 systemd[215570]: Queued start job for default target Main User Target.
Jan 21 23:48:42 compute-0 systemd[215570]: Created slice User Application Slice.
Jan 21 23:48:42 compute-0 systemd[215570]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:48:42 compute-0 systemd[215570]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:48:42 compute-0 systemd[215570]: Reached target Paths.
Jan 21 23:48:42 compute-0 systemd[215570]: Reached target Timers.
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.810 182939 DEBUG nova.compute.manager [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:48:42 compute-0 systemd[215570]: Starting D-Bus User Message Bus Socket...
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.811 182939 DEBUG nova.network.neutron [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:48:42 compute-0 systemd[215570]: Starting Create User's Volatile Files and Directories...
Jan 21 23:48:42 compute-0 systemd[215570]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:48:42 compute-0 systemd[215570]: Finished Create User's Volatile Files and Directories.
Jan 21 23:48:42 compute-0 systemd[215570]: Reached target Sockets.
Jan 21 23:48:42 compute-0 systemd[215570]: Reached target Basic System.
Jan 21 23:48:42 compute-0 systemd[215570]: Reached target Main User Target.
Jan 21 23:48:42 compute-0 systemd[215570]: Startup finished in 134ms.
Jan 21 23:48:42 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.831 182939 INFO nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:48:42 compute-0 systemd[1]: Started Session 39 of User nova.
Jan 21 23:48:42 compute-0 nova_compute[182935]: 2026-01-21 23:48:42.851 182939 DEBUG nova.compute.manager [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:48:42 compute-0 sshd-session[215566]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.034 182939 DEBUG nova.compute.manager [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.036 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.037 182939 INFO nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Creating image(s)
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.037 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "/var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.038 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "/var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.039 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "/var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.050 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.108 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.109 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.110 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.121 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.179 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.180 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.281 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk 1073741824" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.283 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.284 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:43 compute-0 sshd-session[215585]: Received disconnect from 192.168.122.102 port 58850:11: disconnected by user
Jan 21 23:48:43 compute-0 sshd-session[215585]: Disconnected from user nova 192.168.122.102 port 58850
Jan 21 23:48:43 compute-0 sshd-session[215566]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:48:43 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Jan 21 23:48:43 compute-0 systemd-logind[784]: Session 39 logged out. Waiting for processes to exit.
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.350 182939 DEBUG nova.policy [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c3f927acf834c718155d5ee5dd81b19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2edcdd2e6c5a46cb95eb89874a9cb5f3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:48:43 compute-0 systemd-logind[784]: Removed session 39.
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.360 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.361 182939 DEBUG nova.virt.disk.api [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Checking if we can resize image /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.361 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.420 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.421 182939 DEBUG nova.virt.disk.api [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Cannot resize image /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.421 182939 DEBUG nova.objects.instance [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lazy-loading 'migration_context' on Instance uuid b448a112-7efc-4f54-b6db-0aabc1bf767d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.439 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.440 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Ensure instance console log exists: /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.440 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.440 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.441 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:43 compute-0 sshd-session[215600]: Accepted publickey for nova from 192.168.122.102 port 58862 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:48:43 compute-0 systemd-logind[784]: New session 41 of user nova.
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.504 182939 DEBUG oslo_concurrency.lockutils [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.505 182939 DEBUG oslo_concurrency.lockutils [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.505 182939 DEBUG oslo_concurrency.lockutils [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.505 182939 DEBUG oslo_concurrency.lockutils [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.505 182939 DEBUG oslo_concurrency.lockutils [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:43 compute-0 systemd[1]: Started Session 41 of User nova.
Jan 21 23:48:43 compute-0 sshd-session[215600]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.520 182939 INFO nova.compute.manager [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Terminating instance
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.534 182939 DEBUG nova.compute.manager [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:48:43 compute-0 kernel: tapbbc46799-07 (unregistering): left promiscuous mode
Jan 21 23:48:43 compute-0 NetworkManager[55139]: <info>  [1769039323.5697] device (tapbbc46799-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.579 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:43 compute-0 ovn_controller[95047]: 2026-01-21T23:48:43Z|00129|binding|INFO|Releasing lport bbc46799-0727-41d9-9ae1-017037df9492 from this chassis (sb_readonly=0)
Jan 21 23:48:43 compute-0 ovn_controller[95047]: 2026-01-21T23:48:43Z|00130|binding|INFO|Setting lport bbc46799-0727-41d9-9ae1-017037df9492 down in Southbound
Jan 21 23:48:43 compute-0 ovn_controller[95047]: 2026-01-21T23:48:43Z|00131|binding|INFO|Releasing lport 56571b22-2d90-46ed-b4c3-681729d375d9 from this chassis (sb_readonly=0)
Jan 21 23:48:43 compute-0 ovn_controller[95047]: 2026-01-21T23:48:43Z|00132|binding|INFO|Setting lport 56571b22-2d90-46ed-b4c3-681729d375d9 down in Southbound
Jan 21 23:48:43 compute-0 ovn_controller[95047]: 2026-01-21T23:48:43Z|00133|binding|INFO|Removing iface tapbbc46799-07 ovn-installed in OVS
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.582 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:43 compute-0 ovn_controller[95047]: 2026-01-21T23:48:43Z|00134|binding|INFO|Releasing lport 75454af0-da31-4238-b248-a6678c575f51 from this chassis (sb_readonly=0)
Jan 21 23:48:43 compute-0 ovn_controller[95047]: 2026-01-21T23:48:43Z|00135|binding|INFO|Releasing lport 81fbdf60-46d3-442f-bc69-6a381397338b from this chassis (sb_readonly=0)
Jan 21 23:48:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:43.592 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:7f:86 10.100.0.13'], port_security=['fa:16:3e:14:7f:86 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1972857521', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '2038fd11-9c07-48d0-8092-d973d69d8eb9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2df233d-b255-4dda-925c-3ccab3a032ee', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1972857521', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ceab9906-340c-4566-81ac-4c6dd292f58f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bbc46799-0727-41d9-9ae1-017037df9492) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:43.595 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:be:25 19.80.0.150'], port_security=['fa:16:3e:cf:be:25 19.80.0.150'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['bbc46799-0727-41d9-9ae1-017037df9492'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1175960055', 'neutron:cidrs': '19.80.0.150/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b0b760c-cbd0-4413-9603-713296c75717', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1175960055', 'neutron:project_id': 'cdcb2f57183e484cace5d5f78dd635a1', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'b0d61dfd-cc58-4a7f-a4c1-6c8c92847694', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=05a945c7-2b1a-4093-88a2-493079ba8709, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=56571b22-2d90-46ed-b4c3-681729d375d9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:43.597 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bbc46799-0727-41d9-9ae1-017037df9492 in datapath b2df233d-b255-4dda-925c-3ccab3a032ee unbound from our chassis
Jan 21 23:48:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:43.600 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2df233d-b255-4dda-925c-3ccab3a032ee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:48:43 compute-0 sshd-session[215605]: Received disconnect from 192.168.122.102 port 58862:11: disconnected by user
Jan 21 23:48:43 compute-0 sshd-session[215605]: Disconnected from user nova 192.168.122.102 port 58862
Jan 21 23:48:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:43.601 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e6e840-7de7-4571-9f7f-3239552fe47c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:43.603 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee namespace which is not needed anymore
Jan 21 23:48:43 compute-0 sshd-session[215600]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:48:43 compute-0 systemd[1]: session-41.scope: Deactivated successfully.
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.610 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:43 compute-0 systemd-logind[784]: Session 41 logged out. Waiting for processes to exit.
Jan 21 23:48:43 compute-0 systemd-logind[784]: Removed session 41.
Jan 21 23:48:43 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 21 23:48:43 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000019.scope: Consumed 1.926s CPU time.
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.672 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:43 compute-0 systemd-machined[154182]: Machine qemu-15-instance-00000019 terminated.
Jan 21 23:48:43 compute-0 sshd-session[215618]: Accepted publickey for nova from 192.168.122.102 port 58866 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:48:43 compute-0 systemd-logind[784]: New session 42 of user nova.
Jan 21 23:48:43 compute-0 systemd[1]: Started Session 42 of User nova.
Jan 21 23:48:43 compute-0 sshd-session[215618]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:48:43 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[215435]: [NOTICE]   (215439) : haproxy version is 2.8.14-c23fe91
Jan 21 23:48:43 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[215435]: [NOTICE]   (215439) : path to executable is /usr/sbin/haproxy
Jan 21 23:48:43 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[215435]: [WARNING]  (215439) : Exiting Master process...
Jan 21 23:48:43 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[215435]: [ALERT]    (215439) : Current worker (215441) exited with code 143 (Terminated)
Jan 21 23:48:43 compute-0 neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee[215435]: [WARNING]  (215439) : All workers exited. Exiting... (0)
Jan 21 23:48:43 compute-0 systemd[1]: libpod-599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b.scope: Deactivated successfully.
Jan 21 23:48:43 compute-0 podman[215633]: 2026-01-21 23:48:43.811238756 +0000 UTC m=+0.074280852 container died 599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.815 182939 INFO nova.virt.libvirt.driver [-] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Instance destroyed successfully.
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.816 182939 DEBUG nova.objects.instance [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lazy-loading 'resources' on Instance uuid 2038fd11-9c07-48d0-8092-d973d69d8eb9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b-userdata-shm.mount: Deactivated successfully.
Jan 21 23:48:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-90230ea1d3d90d5ca0c5ef95aba00a99e2e4c574b87ccebd9bc6b17c4a0ec577-merged.mount: Deactivated successfully.
Jan 21 23:48:43 compute-0 podman[215633]: 2026-01-21 23:48:43.853053132 +0000 UTC m=+0.116095228 container cleanup 599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.854 182939 DEBUG nova.virt.libvirt.vif [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:48:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1127248027',display_name='tempest-LiveMigrationTest-server-1127248027',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1127248027',id=25,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:48:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cdcb2f57183e484cace5d5f78dd635a1',ramdisk_id='',reservation_id='r-6nt020yh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-430976321',owner_user_name='tempest-LiveMigrationTest-430976321-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:48:41Z,user_data=None,user_id='d4ff24d8abf8416db9d64c645436c5f1',uuid=2038fd11-9c07-48d0-8092-d973d69d8eb9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.855 182939 DEBUG nova.network.os_vif_util [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converting VIF {"id": "bbc46799-0727-41d9-9ae1-017037df9492", "address": "fa:16:3e:14:7f:86", "network": {"id": "b2df233d-b255-4dda-925c-3ccab3a032ee", "bridge": "br-int", "label": "tempest-LiveMigrationTest-283223724-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cdcb2f57183e484cace5d5f78dd635a1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbbc46799-07", "ovs_interfaceid": "bbc46799-0727-41d9-9ae1-017037df9492", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.856 182939 DEBUG nova.network.os_vif_util [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.856 182939 DEBUG os_vif [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.859 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.859 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbbc46799-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.861 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.862 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:43 compute-0 sshd-session[215664]: Received disconnect from 192.168.122.102 port 58866:11: disconnected by user
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.866 182939 INFO os_vif [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:7f:86,bridge_name='br-int',has_traffic_filtering=True,id=bbc46799-0727-41d9-9ae1-017037df9492,network=Network(b2df233d-b255-4dda-925c-3ccab3a032ee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbbc46799-07')
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.867 182939 INFO nova.virt.libvirt.driver [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Deleting instance files /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9_del
Jan 21 23:48:43 compute-0 sshd-session[215664]: Disconnected from user nova 192.168.122.102 port 58866
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.868 182939 INFO nova.virt.libvirt.driver [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Deletion of /var/lib/nova/instances/2038fd11-9c07-48d0-8092-d973d69d8eb9_del complete
Jan 21 23:48:43 compute-0 sshd-session[215618]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:48:43 compute-0 systemd[1]: session-42.scope: Deactivated successfully.
Jan 21 23:48:43 compute-0 systemd-logind[784]: Session 42 logged out. Waiting for processes to exit.
Jan 21 23:48:43 compute-0 systemd[1]: libpod-conmon-599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b.scope: Deactivated successfully.
Jan 21 23:48:43 compute-0 systemd-logind[784]: Removed session 42.
Jan 21 23:48:43 compute-0 podman[215680]: 2026-01-21 23:48:43.953941619 +0000 UTC m=+0.058058249 container remove 599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 23:48:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:43.961 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b4984f69-42ad-4063-aa1d-8a2f3b35c3bd]: (4, ('Wed Jan 21 11:48:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee (599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b)\n599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b\nWed Jan 21 11:48:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee (599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b)\n599a90d72db50d3180a3cecf22bbc8b61806f529dc518d65e2dd50bec69b777b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:43.963 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ffaf08-f226-42c6-95df-c5dc52657e7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:43.964 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2df233d-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.966 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:43 compute-0 kernel: tapb2df233d-b0: left promiscuous mode
Jan 21 23:48:43 compute-0 nova_compute[182935]: 2026-01-21 23:48:43.979 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:43.983 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[19cf0299-af58-42ee-b170-d794d39f26e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.011 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[280e123f-cff3-4719-ac37-55f2a6566524]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.012 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0713cb-a4c5-46d0-a9ca-0513b8ebda98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.030 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[770f5270-170b-4f2c-81a1-1e291cccefb4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381920, 'reachable_time': 17016, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215698, 'error': None, 'target': 'ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 systemd[1]: run-netns-ovnmeta\x2db2df233d\x2db255\x2d4dda\x2d925c\x2d3ccab3a032ee.mount: Deactivated successfully.
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.034 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2df233d-b255-4dda-925c-3ccab3a032ee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.034 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[e39bc9d9-1734-43b1-9b9f-b4748236482d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.035 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 56571b22-2d90-46ed-b4c3-681729d375d9 in datapath 1b0b760c-cbd0-4413-9603-713296c75717 unbound from our chassis
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.034 182939 INFO nova.compute.manager [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Took 0.50 seconds to destroy the instance on the hypervisor.
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.035 182939 DEBUG oslo.service.loopingcall [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.036 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1b0b760c-cbd0-4413-9603-713296c75717, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.036 182939 DEBUG nova.compute.manager [-] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.037 182939 DEBUG nova.network.neutron [-] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.037 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ad201f6d-b695-4636-baeb-6efd0cc30b57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.038 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717 namespace which is not needed anymore
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.048 182939 DEBUG nova.compute.manager [req-1aa8d8ca-35d3-4458-8cc9-163f164a5fcd req-a79faa9b-ec1f-4408-843e-4a1b95036e69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.049 182939 DEBUG oslo_concurrency.lockutils [req-1aa8d8ca-35d3-4458-8cc9-163f164a5fcd req-a79faa9b-ec1f-4408-843e-4a1b95036e69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.049 182939 DEBUG oslo_concurrency.lockutils [req-1aa8d8ca-35d3-4458-8cc9-163f164a5fcd req-a79faa9b-ec1f-4408-843e-4a1b95036e69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.049 182939 DEBUG oslo_concurrency.lockutils [req-1aa8d8ca-35d3-4458-8cc9-163f164a5fcd req-a79faa9b-ec1f-4408-843e-4a1b95036e69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.049 182939 DEBUG nova.compute.manager [req-1aa8d8ca-35d3-4458-8cc9-163f164a5fcd req-a79faa9b-ec1f-4408-843e-4a1b95036e69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] No waiting events found dispatching network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.049 182939 DEBUG nova.compute.manager [req-1aa8d8ca-35d3-4458-8cc9-163f164a5fcd req-a79faa9b-ec1f-4408-843e-4a1b95036e69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-unplugged-bbc46799-0727-41d9-9ae1-017037df9492 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:48:44 compute-0 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[215509]: [NOTICE]   (215513) : haproxy version is 2.8.14-c23fe91
Jan 21 23:48:44 compute-0 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[215509]: [NOTICE]   (215513) : path to executable is /usr/sbin/haproxy
Jan 21 23:48:44 compute-0 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[215509]: [WARNING]  (215513) : Exiting Master process...
Jan 21 23:48:44 compute-0 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[215509]: [ALERT]    (215513) : Current worker (215515) exited with code 143 (Terminated)
Jan 21 23:48:44 compute-0 neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717[215509]: [WARNING]  (215513) : All workers exited. Exiting... (0)
Jan 21 23:48:44 compute-0 systemd[1]: libpod-842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c.scope: Deactivated successfully.
Jan 21 23:48:44 compute-0 podman[215714]: 2026-01-21 23:48:44.196373934 +0000 UTC m=+0.050520692 container died 842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:48:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c-userdata-shm.mount: Deactivated successfully.
Jan 21 23:48:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-af7616e777f40cc26005669c220020e9b092c35c02c4d90dc1b2c1917081d4aa-merged.mount: Deactivated successfully.
Jan 21 23:48:44 compute-0 podman[215714]: 2026-01-21 23:48:44.238563048 +0000 UTC m=+0.092709806 container cleanup 842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 23:48:44 compute-0 systemd[1]: libpod-conmon-842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c.scope: Deactivated successfully.
Jan 21 23:48:44 compute-0 podman[215746]: 2026-01-21 23:48:44.314470477 +0000 UTC m=+0.042380000 container remove 842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.321 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[582ba4f1-6ff9-4270-ac68-8b589273fada]: (4, ('Wed Jan 21 11:48:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717 (842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c)\n842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c\nWed Jan 21 11:48:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717 (842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c)\n842b548d0479c3a10121e1a07c0cbf21f69bea630c2f9a95ab4c0c9a15bf0f9c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.323 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fe6faed7-e236-4ae9-bc27-30842a258e2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.324 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b0b760c-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.325 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:44 compute-0 kernel: tap1b0b760c-c0: left promiscuous mode
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.341 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.343 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.347 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[29e51a1f-245a-449b-a0b5-92c6b22b428c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.366 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[74a983d3-d46a-463f-9846-0538be3ffe46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.368 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1e71e07a-4e93-4628-823b-e580e34984a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.385 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[68e396ac-b0d0-4fc9-85c5-1542fb0f4abd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382019, 'reachable_time': 15175, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215761, 'error': None, 'target': 'ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.388 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1b0b760c-cbd0-4413-9603-713296c75717 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:48:44 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:44.388 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[7d34d851-f49a-46cd-90b8-81e09f4bb1a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.679 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.679 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.680 182939 DEBUG nova.network.neutron [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.717 182939 DEBUG nova.network.neutron [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Successfully created port: c4a47993-b901-4550-97cf-5b9a89730459 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.792 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d1b0b760c\x2dcbd0\x2d4413\x2d9603\x2d713296c75717.mount: Deactivated successfully.
Jan 21 23:48:44 compute-0 nova_compute[182935]: 2026-01-21 23:48:44.855 182939 DEBUG nova.network.neutron [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.186 182939 DEBUG nova.network.neutron [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.213 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.346 182939 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.347 182939 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.347 182939 INFO nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Creating image(s)
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.348 182939 DEBUG nova.objects.instance [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 63b2e61e-8ad4-44e9-ba44-db37454a4b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.362 182939 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.420 182939 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.421 182939 DEBUG nova.virt.disk.api [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Checking if we can resize image /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.422 182939 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.483 182939 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.485 182939 DEBUG nova.virt.disk.api [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Cannot resize image /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.511 182939 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.511 182939 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Ensure instance console log exists: /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.512 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.512 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.512 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.514 182939 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.520 182939 WARNING nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.526 182939 DEBUG nova.virt.libvirt.host [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.527 182939 DEBUG nova.virt.libvirt.host [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.531 182939 DEBUG nova.virt.libvirt.host [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.531 182939 DEBUG nova.virt.libvirt.host [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.533 182939 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.534 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ff01ccba-ad51-439f-9037-926190d6dc0f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.534 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.534 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.534 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.535 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.535 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.535 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.535 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.535 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.536 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.536 182939 DEBUG nova.virt.hardware [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.536 182939 DEBUG nova.objects.instance [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 63b2e61e-8ad4-44e9-ba44-db37454a4b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.556 182939 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.590 182939 DEBUG nova.network.neutron [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Successfully updated port: c4a47993-b901-4550-97cf-5b9a89730459 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.619 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "refresh_cache-b448a112-7efc-4f54-b6db-0aabc1bf767d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.620 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquired lock "refresh_cache-b448a112-7efc-4f54-b6db-0aabc1bf767d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.621 182939 DEBUG nova.network.neutron [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.657 182939 DEBUG oslo_concurrency.processutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.config --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.658 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.658 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.659 182939 DEBUG oslo_concurrency.lockutils [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.662 182939 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:48:45 compute-0 nova_compute[182935]:   <uuid>63b2e61e-8ad4-44e9-ba44-db37454a4b34</uuid>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   <name>instance-0000001a</name>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   <memory>196608</memory>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <nova:name>tempest-MigrationsAdminTest-server-1192752510</nova:name>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:48:45</nova:creationTime>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <nova:flavor name="m1.micro">
Jan 21 23:48:45 compute-0 nova_compute[182935]:         <nova:memory>192</nova:memory>
Jan 21 23:48:45 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:48:45 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:48:45 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:48:45 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:48:45 compute-0 nova_compute[182935]:         <nova:user uuid="36d71830ce70436e97fbc17b6da8d3c6">tempest-MigrationsAdminTest-1559502816-project-member</nova:user>
Jan 21 23:48:45 compute-0 nova_compute[182935]:         <nova:project uuid="95574103d0094883861c58d01690e5a3">tempest-MigrationsAdminTest-1559502816</nova:project>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <system>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <entry name="serial">63b2e61e-8ad4-44e9-ba44-db37454a4b34</entry>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <entry name="uuid">63b2e61e-8ad4-44e9-ba44-db37454a4b34</entry>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     </system>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   <os>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   </os>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   <features>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   </features>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk.config"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/console.log" append="off"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <video>
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     </video>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:48:45 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:48:45 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:48:45 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:48:45 compute-0 nova_compute[182935]: </domain>
Jan 21 23:48:45 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.686 182939 DEBUG nova.compute.manager [req-b5501ff3-ba0e-4d0b-85fe-6fdc310408d7 req-43e46e01-0ee4-4c97-b8b7-762accb56685 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Received event network-changed-c4a47993-b901-4550-97cf-5b9a89730459 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.687 182939 DEBUG nova.compute.manager [req-b5501ff3-ba0e-4d0b-85fe-6fdc310408d7 req-43e46e01-0ee4-4c97-b8b7-762accb56685 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Refreshing instance network info cache due to event network-changed-c4a47993-b901-4550-97cf-5b9a89730459. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.687 182939 DEBUG oslo_concurrency.lockutils [req-b5501ff3-ba0e-4d0b-85fe-6fdc310408d7 req-43e46e01-0ee4-4c97-b8b7-762accb56685 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-b448a112-7efc-4f54-b6db-0aabc1bf767d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.749 182939 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.750 182939 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.751 182939 INFO nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Using config drive
Jan 21 23:48:45 compute-0 systemd-machined[154182]: New machine qemu-16-instance-0000001a.
Jan 21 23:48:45 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-0000001a.
Jan 21 23:48:45 compute-0 nova_compute[182935]: 2026-01-21 23:48:45.883 182939 DEBUG nova.network.neutron [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.097 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039326.096432, 63b2e61e-8ad4-44e9-ba44-db37454a4b34 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.097 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] VM Resumed (Lifecycle Event)
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.099 182939 DEBUG nova.compute.manager [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.104 182939 INFO nova.virt.libvirt.driver [-] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance running successfully.
Jan 21 23:48:46 compute-0 virtqemud[182477]: argument unsupported: QEMU guest agent is not configured
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.106 182939 DEBUG nova.virt.libvirt.guest [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.106 182939 DEBUG nova.virt.libvirt.driver [None req-15a89eb9-8e01-4c87-aafa-5521252a6100 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.133 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.139 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.179 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.180 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039326.096741, 63b2e61e-8ad4-44e9-ba44-db37454a4b34 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.180 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] VM Started (Lifecycle Event)
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.207 182939 DEBUG nova.compute.manager [req-5eec0fe5-6942-4700-9951-e95c80d2750e req-7301a092-d918-4e24-839d-5ac10f810c6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.207 182939 DEBUG oslo_concurrency.lockutils [req-5eec0fe5-6942-4700-9951-e95c80d2750e req-7301a092-d918-4e24-839d-5ac10f810c6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.208 182939 DEBUG oslo_concurrency.lockutils [req-5eec0fe5-6942-4700-9951-e95c80d2750e req-7301a092-d918-4e24-839d-5ac10f810c6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.208 182939 DEBUG oslo_concurrency.lockutils [req-5eec0fe5-6942-4700-9951-e95c80d2750e req-7301a092-d918-4e24-839d-5ac10f810c6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.208 182939 DEBUG nova.compute.manager [req-5eec0fe5-6942-4700-9951-e95c80d2750e req-7301a092-d918-4e24-839d-5ac10f810c6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] No waiting events found dispatching network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.209 182939 WARNING nova.compute.manager [req-5eec0fe5-6942-4700-9951-e95c80d2750e req-7301a092-d918-4e24-839d-5ac10f810c6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Received unexpected event network-vif-plugged-bbc46799-0727-41d9-9ae1-017037df9492 for instance with vm_state active and task_state deleting.
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.220 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.226 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.353 182939 DEBUG nova.network.neutron [-] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.403 182939 INFO nova.compute.manager [-] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Took 2.37 seconds to deallocate network for instance.
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.603 182939 DEBUG oslo_concurrency.lockutils [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.604 182939 DEBUG oslo_concurrency.lockutils [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.613 182939 DEBUG oslo_concurrency.lockutils [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.650 182939 INFO nova.scheduler.client.report [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Deleted allocations for instance 2038fd11-9c07-48d0-8092-d973d69d8eb9
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.751 182939 DEBUG oslo_concurrency.lockutils [None req-3dea40b4-406b-4545-9f02-86ae494b443d d4ff24d8abf8416db9d64c645436c5f1 cdcb2f57183e484cace5d5f78dd635a1 - - default default] Lock "2038fd11-9c07-48d0-8092-d973d69d8eb9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.878 182939 DEBUG nova.network.neutron [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Updating instance_info_cache with network_info: [{"id": "c4a47993-b901-4550-97cf-5b9a89730459", "address": "fa:16:3e:8c:2f:a3", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a47993-b9", "ovs_interfaceid": "c4a47993-b901-4550-97cf-5b9a89730459", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.907 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Releasing lock "refresh_cache-b448a112-7efc-4f54-b6db-0aabc1bf767d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.907 182939 DEBUG nova.compute.manager [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Instance network_info: |[{"id": "c4a47993-b901-4550-97cf-5b9a89730459", "address": "fa:16:3e:8c:2f:a3", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a47993-b9", "ovs_interfaceid": "c4a47993-b901-4550-97cf-5b9a89730459", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.909 182939 DEBUG oslo_concurrency.lockutils [req-b5501ff3-ba0e-4d0b-85fe-6fdc310408d7 req-43e46e01-0ee4-4c97-b8b7-762accb56685 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-b448a112-7efc-4f54-b6db-0aabc1bf767d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.909 182939 DEBUG nova.network.neutron [req-b5501ff3-ba0e-4d0b-85fe-6fdc310408d7 req-43e46e01-0ee4-4c97-b8b7-762accb56685 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Refreshing network info cache for port c4a47993-b901-4550-97cf-5b9a89730459 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.915 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Start _get_guest_xml network_info=[{"id": "c4a47993-b901-4550-97cf-5b9a89730459", "address": "fa:16:3e:8c:2f:a3", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a47993-b9", "ovs_interfaceid": "c4a47993-b901-4550-97cf-5b9a89730459", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.923 182939 WARNING nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.928 182939 DEBUG nova.virt.libvirt.host [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.929 182939 DEBUG nova.virt.libvirt.host [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.933 182939 DEBUG nova.virt.libvirt.host [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.934 182939 DEBUG nova.virt.libvirt.host [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.935 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.935 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.936 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.936 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.937 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.937 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.937 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.938 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.938 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.938 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.939 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.939 182939 DEBUG nova.virt.hardware [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.944 182939 DEBUG nova.virt.libvirt.vif [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:48:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1244489397',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1244489397',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1244489397',id=28,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2edcdd2e6c5a46cb95eb89874a9cb5f3',ramdisk_id='',reservation_id='r-k46jxfrr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-222133061',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-222133061-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:48:42Z,user_data=None,user_id='0c3f927acf834c718155d5ee5dd81b19',uuid=b448a112-7efc-4f54-b6db-0aabc1bf767d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c4a47993-b901-4550-97cf-5b9a89730459", "address": "fa:16:3e:8c:2f:a3", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a47993-b9", "ovs_interfaceid": "c4a47993-b901-4550-97cf-5b9a89730459", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.944 182939 DEBUG nova.network.os_vif_util [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converting VIF {"id": "c4a47993-b901-4550-97cf-5b9a89730459", "address": "fa:16:3e:8c:2f:a3", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a47993-b9", "ovs_interfaceid": "c4a47993-b901-4550-97cf-5b9a89730459", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.945 182939 DEBUG nova.network.os_vif_util [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:2f:a3,bridge_name='br-int',has_traffic_filtering=True,id=c4a47993-b901-4550-97cf-5b9a89730459,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a47993-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.949 182939 DEBUG nova.objects.instance [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lazy-loading 'pci_devices' on Instance uuid b448a112-7efc-4f54-b6db-0aabc1bf767d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.966 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:48:46 compute-0 nova_compute[182935]:   <uuid>b448a112-7efc-4f54-b6db-0aabc1bf767d</uuid>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   <name>instance-0000001c</name>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-1244489397</nova:name>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:48:46</nova:creationTime>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:48:46 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:48:46 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:48:46 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:48:46 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:48:46 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:48:46 compute-0 nova_compute[182935]:         <nova:user uuid="0c3f927acf834c718155d5ee5dd81b19">tempest-ImagesOneServerNegativeTestJSON-222133061-project-member</nova:user>
Jan 21 23:48:46 compute-0 nova_compute[182935]:         <nova:project uuid="2edcdd2e6c5a46cb95eb89874a9cb5f3">tempest-ImagesOneServerNegativeTestJSON-222133061</nova:project>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:48:46 compute-0 nova_compute[182935]:         <nova:port uuid="c4a47993-b901-4550-97cf-5b9a89730459">
Jan 21 23:48:46 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <system>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <entry name="serial">b448a112-7efc-4f54-b6db-0aabc1bf767d</entry>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <entry name="uuid">b448a112-7efc-4f54-b6db-0aabc1bf767d</entry>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     </system>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   <os>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   </os>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   <features>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   </features>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk.config"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:8c:2f:a3"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <target dev="tapc4a47993-b9"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/console.log" append="off"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <video>
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     </video>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:48:46 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:48:46 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:48:46 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:48:46 compute-0 nova_compute[182935]: </domain>
Jan 21 23:48:46 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.967 182939 DEBUG nova.compute.manager [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Preparing to wait for external event network-vif-plugged-c4a47993-b901-4550-97cf-5b9a89730459 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.967 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.967 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.967 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.968 182939 DEBUG nova.virt.libvirt.vif [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:48:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1244489397',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1244489397',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1244489397',id=28,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2edcdd2e6c5a46cb95eb89874a9cb5f3',ramdisk_id='',reservation_id='r-k46jxfrr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-222133061',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-222133061-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:48:42Z,user_data=None,user_id='0c3f927acf834c718155d5ee5dd81b19',uuid=b448a112-7efc-4f54-b6db-0aabc1bf767d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c4a47993-b901-4550-97cf-5b9a89730459", "address": "fa:16:3e:8c:2f:a3", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a47993-b9", "ovs_interfaceid": "c4a47993-b901-4550-97cf-5b9a89730459", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.969 182939 DEBUG nova.network.os_vif_util [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converting VIF {"id": "c4a47993-b901-4550-97cf-5b9a89730459", "address": "fa:16:3e:8c:2f:a3", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a47993-b9", "ovs_interfaceid": "c4a47993-b901-4550-97cf-5b9a89730459", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.969 182939 DEBUG nova.network.os_vif_util [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:2f:a3,bridge_name='br-int',has_traffic_filtering=True,id=c4a47993-b901-4550-97cf-5b9a89730459,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a47993-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.970 182939 DEBUG os_vif [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:2f:a3,bridge_name='br-int',has_traffic_filtering=True,id=c4a47993-b901-4550-97cf-5b9a89730459,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a47993-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.970 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.971 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.971 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.975 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.975 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc4a47993-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.976 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc4a47993-b9, col_values=(('external_ids', {'iface-id': 'c4a47993-b901-4550-97cf-5b9a89730459', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:2f:a3', 'vm-uuid': 'b448a112-7efc-4f54-b6db-0aabc1bf767d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.978 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.980 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:48:46 compute-0 NetworkManager[55139]: <info>  [1769039326.9804] manager: (tapc4a47993-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.989 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:46 compute-0 nova_compute[182935]: 2026-01-21 23:48:46.990 182939 INFO os_vif [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:2f:a3,bridge_name='br-int',has_traffic_filtering=True,id=c4a47993-b901-4550-97cf-5b9a89730459,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a47993-b9')
Jan 21 23:48:47 compute-0 nova_compute[182935]: 2026-01-21 23:48:47.070 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:47 compute-0 nova_compute[182935]: 2026-01-21 23:48:47.070 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:47 compute-0 nova_compute[182935]: 2026-01-21 23:48:47.070 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] No VIF found with MAC fa:16:3e:8c:2f:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:48:47 compute-0 nova_compute[182935]: 2026-01-21 23:48:47.071 182939 INFO nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Using config drive
Jan 21 23:48:47 compute-0 sshd-session[215796]: Invalid user tomcat from 188.166.69.60 port 55812
Jan 21 23:48:47 compute-0 sshd-session[215796]: Connection closed by invalid user tomcat 188.166.69.60 port 55812 [preauth]
Jan 21 23:48:47 compute-0 nova_compute[182935]: 2026-01-21 23:48:47.946 182939 INFO nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Creating config drive at /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk.config
Jan 21 23:48:47 compute-0 nova_compute[182935]: 2026-01-21 23:48:47.953 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeu1jpulv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.084 182939 DEBUG oslo_concurrency.processutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeu1jpulv" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:48 compute-0 NetworkManager[55139]: <info>  [1769039328.1536] manager: (tapc4a47993-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Jan 21 23:48:48 compute-0 kernel: tapc4a47993-b9: entered promiscuous mode
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.159 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:48 compute-0 ovn_controller[95047]: 2026-01-21T23:48:48Z|00136|binding|INFO|Claiming lport c4a47993-b901-4550-97cf-5b9a89730459 for this chassis.
Jan 21 23:48:48 compute-0 ovn_controller[95047]: 2026-01-21T23:48:48Z|00137|binding|INFO|c4a47993-b901-4550-97cf-5b9a89730459: Claiming fa:16:3e:8c:2f:a3 10.100.0.14
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.181 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:2f:a3 10.100.0.14'], port_security=['fa:16:3e:8c:2f:a3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b448a112-7efc-4f54-b6db-0aabc1bf767d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-135f4ca0-b287-4f82-8393-a426855e9926', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2edcdd2e6c5a46cb95eb89874a9cb5f3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b452e9c4-b5fd-46cd-9749-caa7edf73c8c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=357b9b46-d446-48ea-adde-5992e2bcd56d, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c4a47993-b901-4550-97cf-5b9a89730459) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.183 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c4a47993-b901-4550-97cf-5b9a89730459 in datapath 135f4ca0-b287-4f82-8393-a426855e9926 bound to our chassis
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.186 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 135f4ca0-b287-4f82-8393-a426855e9926
Jan 21 23:48:48 compute-0 systemd-machined[154182]: New machine qemu-17-instance-0000001c.
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.202 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ec2ec4ef-933c-4564-8ffe-94732742eba0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.211 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap135f4ca0-b1 in ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.213 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap135f4ca0-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.213 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5a4dfa93-e96c-4007-b1e7-61146a4f0d0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.219 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d604045e-5306-4c21-870d-4cd27736c366]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-0000001c.
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.237 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.236 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[1afb762d-2055-4d4d-9802-fd7a29f281b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_controller[95047]: 2026-01-21T23:48:48Z|00138|binding|INFO|Setting lport c4a47993-b901-4550-97cf-5b9a89730459 ovn-installed in OVS
Jan 21 23:48:48 compute-0 ovn_controller[95047]: 2026-01-21T23:48:48Z|00139|binding|INFO|Setting lport c4a47993-b901-4550-97cf-5b9a89730459 up in Southbound
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.246 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.252 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ad18210d-2961-4d46-bc76-b4983f652c96]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 systemd-udevd[215825]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:48:48 compute-0 NetworkManager[55139]: <info>  [1769039328.2882] device (tapc4a47993-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:48:48 compute-0 NetworkManager[55139]: <info>  [1769039328.2888] device (tapc4a47993-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.297 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[99854f92-b05c-4499-90c9-8344baf982ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 NetworkManager[55139]: <info>  [1769039328.3094] manager: (tap135f4ca0-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Jan 21 23:48:48 compute-0 systemd-udevd[215831]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.308 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[83c572fb-9d13-40a9-8892-1a6d167a1ece]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.349 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[555918ed-3e64-42ac-9b04-b1bf80503553]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.353 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[bdf65eb2-ec7a-470f-ac92-0042e000a797]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 NetworkManager[55139]: <info>  [1769039328.3877] device (tap135f4ca0-b0): carrier: link connected
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.395 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d44aef1b-78cb-4ab6-a7fb-029795c8c8eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.414 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[967870be-f027-47b4-aeee-9342d5f44174]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap135f4ca0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:bd:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383112, 'reachable_time': 28224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215852, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.437 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c95f91e9-05aa-4f87-b878-629311bba61f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7b:bddf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383112, 'tstamp': 383112}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215853, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.462 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b84f86c8-6f4b-4e01-9d23-89d5a2fdf5ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap135f4ca0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:bd:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383112, 'reachable_time': 28224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215854, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.502 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[908ed9f2-3ff8-44f7-9ed7-40b04a6c1e14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.575 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cc8c72b3-a0d8-4bbe-9ca0-809cd6f4976a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.577 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap135f4ca0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.578 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.578 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap135f4ca0-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.580 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:48 compute-0 NetworkManager[55139]: <info>  [1769039328.5812] manager: (tap135f4ca0-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 21 23:48:48 compute-0 kernel: tap135f4ca0-b0: entered promiscuous mode
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.585 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap135f4ca0-b0, col_values=(('external_ids', {'iface-id': 'f24d5ed7-f246-4123-afeb-d49e73610afb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.586 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:48 compute-0 ovn_controller[95047]: 2026-01-21T23:48:48Z|00140|binding|INFO|Releasing lport f24d5ed7-f246-4123-afeb-d49e73610afb from this chassis (sb_readonly=0)
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.587 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.589 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/135f4ca0-b287-4f82-8393-a426855e9926.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/135f4ca0-b287-4f82-8393-a426855e9926.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.590 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5d31c8fe-ad8a-453d-b52f-96359a86fda3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.591 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-135f4ca0-b287-4f82-8393-a426855e9926
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/135f4ca0-b287-4f82-8393-a426855e9926.pid.haproxy
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 135f4ca0-b287-4f82-8393-a426855e9926
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:48:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:48.593 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'env', 'PROCESS_TAG=haproxy-135f4ca0-b287-4f82-8393-a426855e9926', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/135f4ca0-b287-4f82-8393-a426855e9926.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.598 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.601 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039328.601575, b448a112-7efc-4f54-b6db-0aabc1bf767d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.602 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] VM Started (Lifecycle Event)
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.623 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.628 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039328.601632, b448a112-7efc-4f54-b6db-0aabc1bf767d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.628 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] VM Paused (Lifecycle Event)
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.646 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.649 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:48 compute-0 nova_compute[182935]: 2026-01-21 23:48:48.667 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.004 182939 DEBUG nova.compute.manager [req-f0628105-286c-40ef-9f2c-e0ab510b898d req-82b5f6ce-9b55-4c50-9156-30bd4fe5b970 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Received event network-vif-plugged-c4a47993-b901-4550-97cf-5b9a89730459 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.005 182939 DEBUG oslo_concurrency.lockutils [req-f0628105-286c-40ef-9f2c-e0ab510b898d req-82b5f6ce-9b55-4c50-9156-30bd4fe5b970 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.005 182939 DEBUG oslo_concurrency.lockutils [req-f0628105-286c-40ef-9f2c-e0ab510b898d req-82b5f6ce-9b55-4c50-9156-30bd4fe5b970 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.006 182939 DEBUG oslo_concurrency.lockutils [req-f0628105-286c-40ef-9f2c-e0ab510b898d req-82b5f6ce-9b55-4c50-9156-30bd4fe5b970 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.006 182939 DEBUG nova.compute.manager [req-f0628105-286c-40ef-9f2c-e0ab510b898d req-82b5f6ce-9b55-4c50-9156-30bd4fe5b970 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Processing event network-vif-plugged-c4a47993-b901-4550-97cf-5b9a89730459 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.007 182939 DEBUG nova.compute.manager [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.025 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.025 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039329.0237608, b448a112-7efc-4f54-b6db-0aabc1bf767d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.026 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] VM Resumed (Lifecycle Event)
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.032 182939 INFO nova.virt.libvirt.driver [-] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Instance spawned successfully.
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.033 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:48:49 compute-0 podman[215893]: 2026-01-21 23:48:49.038609998 +0000 UTC m=+0.073585155 container create 685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.047 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.061 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.068 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.069 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.070 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.070 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.071 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.072 182939 DEBUG nova.virt.libvirt.driver [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:49 compute-0 systemd[1]: Started libpod-conmon-685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de.scope.
Jan 21 23:48:49 compute-0 podman[215893]: 2026-01-21 23:48:48.995319878 +0000 UTC m=+0.030295015 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.101 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:48:49 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:48:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6bda2b2809c7db1eb4ad5a8f020400e39ab73ac3b9b67f56736a3c68e1e5af8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:48:49 compute-0 podman[215893]: 2026-01-21 23:48:49.137788826 +0000 UTC m=+0.172763963 container init 685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 21 23:48:49 compute-0 podman[215893]: 2026-01-21 23:48:49.14690759 +0000 UTC m=+0.181882707 container start 685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.171 182939 INFO nova.compute.manager [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Took 6.14 seconds to spawn the instance on the hypervisor.
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.172 182939 DEBUG nova.compute.manager [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:49 compute-0 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[215909]: [NOTICE]   (215913) : New worker (215915) forked
Jan 21 23:48:49 compute-0 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[215909]: [NOTICE]   (215913) : Loading success.
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.290 182939 INFO nova.compute.manager [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Took 6.97 seconds to build instance.
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.354 182939 DEBUG oslo_concurrency.lockutils [None req-0e48b54b-8b00-4361-9628-3cea937f777f 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.795 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.935 182939 DEBUG nova.network.neutron [req-b5501ff3-ba0e-4d0b-85fe-6fdc310408d7 req-43e46e01-0ee4-4c97-b8b7-762accb56685 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Updated VIF entry in instance network info cache for port c4a47993-b901-4550-97cf-5b9a89730459. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.936 182939 DEBUG nova.network.neutron [req-b5501ff3-ba0e-4d0b-85fe-6fdc310408d7 req-43e46e01-0ee4-4c97-b8b7-762accb56685 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Updating instance_info_cache with network_info: [{"id": "c4a47993-b901-4550-97cf-5b9a89730459", "address": "fa:16:3e:8c:2f:a3", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a47993-b9", "ovs_interfaceid": "c4a47993-b901-4550-97cf-5b9a89730459", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:49 compute-0 nova_compute[182935]: 2026-01-21 23:48:49.961 182939 DEBUG oslo_concurrency.lockutils [req-b5501ff3-ba0e-4d0b-85fe-6fdc310408d7 req-43e46e01-0ee4-4c97-b8b7-762accb56685 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-b448a112-7efc-4f54-b6db-0aabc1bf767d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:48:50 compute-0 podman[215926]: 2026-01-21 23:48:50.741724562 +0000 UTC m=+0.077770435 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:48:50 compute-0 podman[215925]: 2026-01-21 23:48:50.766234619 +0000 UTC m=+0.104569536 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.157 182939 DEBUG nova.compute.manager [req-a2ae453e-5e53-466e-bc97-ff11f92c235e req-d51be94b-0bf9-46c8-9dc3-9ca8b8478fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Received event network-vif-plugged-c4a47993-b901-4550-97cf-5b9a89730459 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.158 182939 DEBUG oslo_concurrency.lockutils [req-a2ae453e-5e53-466e-bc97-ff11f92c235e req-d51be94b-0bf9-46c8-9dc3-9ca8b8478fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.158 182939 DEBUG oslo_concurrency.lockutils [req-a2ae453e-5e53-466e-bc97-ff11f92c235e req-d51be94b-0bf9-46c8-9dc3-9ca8b8478fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.158 182939 DEBUG oslo_concurrency.lockutils [req-a2ae453e-5e53-466e-bc97-ff11f92c235e req-d51be94b-0bf9-46c8-9dc3-9ca8b8478fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.158 182939 DEBUG nova.compute.manager [req-a2ae453e-5e53-466e-bc97-ff11f92c235e req-d51be94b-0bf9-46c8-9dc3-9ca8b8478fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] No waiting events found dispatching network-vif-plugged-c4a47993-b901-4550-97cf-5b9a89730459 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.159 182939 WARNING nova.compute.manager [req-a2ae453e-5e53-466e-bc97-ff11f92c235e req-d51be94b-0bf9-46c8-9dc3-9ca8b8478fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Received unexpected event network-vif-plugged-c4a47993-b901-4550-97cf-5b9a89730459 for instance with vm_state active and task_state deleting.
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.208 182939 DEBUG oslo_concurrency.lockutils [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "b448a112-7efc-4f54-b6db-0aabc1bf767d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.209 182939 DEBUG oslo_concurrency.lockutils [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.209 182939 DEBUG oslo_concurrency.lockutils [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.209 182939 DEBUG oslo_concurrency.lockutils [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.210 182939 DEBUG oslo_concurrency.lockutils [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.220 182939 INFO nova.compute.manager [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Terminating instance
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.231 182939 DEBUG nova.compute.manager [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:48:51 compute-0 kernel: tapc4a47993-b9 (unregistering): left promiscuous mode
Jan 21 23:48:51 compute-0 NetworkManager[55139]: <info>  [1769039331.2652] device (tapc4a47993-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.270 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:51 compute-0 ovn_controller[95047]: 2026-01-21T23:48:51Z|00141|binding|INFO|Releasing lport c4a47993-b901-4550-97cf-5b9a89730459 from this chassis (sb_readonly=0)
Jan 21 23:48:51 compute-0 ovn_controller[95047]: 2026-01-21T23:48:51Z|00142|binding|INFO|Setting lport c4a47993-b901-4550-97cf-5b9a89730459 down in Southbound
Jan 21 23:48:51 compute-0 ovn_controller[95047]: 2026-01-21T23:48:51Z|00143|binding|INFO|Removing iface tapc4a47993-b9 ovn-installed in OVS
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.284 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:2f:a3 10.100.0.14'], port_security=['fa:16:3e:8c:2f:a3 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b448a112-7efc-4f54-b6db-0aabc1bf767d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-135f4ca0-b287-4f82-8393-a426855e9926', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2edcdd2e6c5a46cb95eb89874a9cb5f3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b452e9c4-b5fd-46cd-9749-caa7edf73c8c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=357b9b46-d446-48ea-adde-5992e2bcd56d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c4a47993-b901-4550-97cf-5b9a89730459) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.288 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c4a47993-b901-4550-97cf-5b9a89730459 in datapath 135f4ca0-b287-4f82-8393-a426855e9926 unbound from our chassis
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.290 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 135f4ca0-b287-4f82-8393-a426855e9926, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.291 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.293 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[568a11cd-0e6d-4b78-83ef-e7b8cf89bff8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.294 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 namespace which is not needed anymore
Jan 21 23:48:51 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Jan 21 23:48:51 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001c.scope: Consumed 2.575s CPU time.
Jan 21 23:48:51 compute-0 systemd-machined[154182]: Machine qemu-17-instance-0000001c terminated.
Jan 21 23:48:51 compute-0 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[215909]: [NOTICE]   (215913) : haproxy version is 2.8.14-c23fe91
Jan 21 23:48:51 compute-0 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[215909]: [NOTICE]   (215913) : path to executable is /usr/sbin/haproxy
Jan 21 23:48:51 compute-0 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[215909]: [ALERT]    (215913) : Current worker (215915) exited with code 143 (Terminated)
Jan 21 23:48:51 compute-0 neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926[215909]: [WARNING]  (215913) : All workers exited. Exiting... (0)
Jan 21 23:48:51 compute-0 systemd[1]: libpod-685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de.scope: Deactivated successfully.
Jan 21 23:48:51 compute-0 podman[215998]: 2026-01-21 23:48:51.463928114 +0000 UTC m=+0.067615354 container died 685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.513 182939 INFO nova.virt.libvirt.driver [-] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Instance destroyed successfully.
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.514 182939 DEBUG nova.objects.instance [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lazy-loading 'resources' on Instance uuid b448a112-7efc-4f54-b6db-0aabc1bf767d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de-userdata-shm.mount: Deactivated successfully.
Jan 21 23:48:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-d6bda2b2809c7db1eb4ad5a8f020400e39ab73ac3b9b67f56736a3c68e1e5af8-merged.mount: Deactivated successfully.
Jan 21 23:48:51 compute-0 podman[215998]: 2026-01-21 23:48:51.526005358 +0000 UTC m=+0.129692598 container cleanup 685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.530 182939 DEBUG nova.virt.libvirt.vif [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:48:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1244489397',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1244489397',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1244489397',id=28,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:48:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2edcdd2e6c5a46cb95eb89874a9cb5f3',ramdisk_id='',reservation_id='r-k46jxfrr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-222133061',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-222133061-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:48:49Z,user_data=None,user_id='0c3f927acf834c718155d5ee5dd81b19',uuid=b448a112-7efc-4f54-b6db-0aabc1bf767d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c4a47993-b901-4550-97cf-5b9a89730459", "address": "fa:16:3e:8c:2f:a3", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a47993-b9", "ovs_interfaceid": "c4a47993-b901-4550-97cf-5b9a89730459", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.530 182939 DEBUG nova.network.os_vif_util [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converting VIF {"id": "c4a47993-b901-4550-97cf-5b9a89730459", "address": "fa:16:3e:8c:2f:a3", "network": {"id": "135f4ca0-b287-4f82-8393-a426855e9926", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1018143163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2edcdd2e6c5a46cb95eb89874a9cb5f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a47993-b9", "ovs_interfaceid": "c4a47993-b901-4550-97cf-5b9a89730459", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.532 182939 DEBUG nova.network.os_vif_util [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:2f:a3,bridge_name='br-int',has_traffic_filtering=True,id=c4a47993-b901-4550-97cf-5b9a89730459,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a47993-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.532 182939 DEBUG os_vif [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:2f:a3,bridge_name='br-int',has_traffic_filtering=True,id=c4a47993-b901-4550-97cf-5b9a89730459,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a47993-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.534 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.535 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4a47993-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.537 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.538 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:51 compute-0 systemd[1]: libpod-conmon-685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de.scope: Deactivated successfully.
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.542 182939 INFO os_vif [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:2f:a3,bridge_name='br-int',has_traffic_filtering=True,id=c4a47993-b901-4550-97cf-5b9a89730459,network=Network(135f4ca0-b287-4f82-8393-a426855e9926),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a47993-b9')
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.542 182939 INFO nova.virt.libvirt.driver [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Deleting instance files /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d_del
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.543 182939 INFO nova.virt.libvirt.driver [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Deletion of /var/lib/nova/instances/b448a112-7efc-4f54-b6db-0aabc1bf767d_del complete
Jan 21 23:48:51 compute-0 podman[216042]: 2026-01-21 23:48:51.595495395 +0000 UTC m=+0.043608428 container remove 685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.601 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f3e70b68-2f71-42ee-8f84-cd7f79470011]: (4, ('Wed Jan 21 11:48:51 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 (685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de)\n685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de\nWed Jan 21 11:48:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 (685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de)\n685954df04cc522dbcf5e658d391a3642ff2e7340bf390f9d28969db7cc2c2de\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.602 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f9063d5a-2e86-423a-bd00-98aee37298b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.604 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap135f4ca0-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.605 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:51 compute-0 kernel: tap135f4ca0-b0: left promiscuous mode
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.615 182939 INFO nova.compute.manager [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.616 182939 DEBUG oslo.service.loopingcall [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.616 182939 DEBUG nova.compute.manager [-] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.616 182939 DEBUG nova.network.neutron [-] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:48:51 compute-0 nova_compute[182935]: 2026-01-21 23:48:51.621 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.621 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ceb1b771-9f5b-4467-8447-17287363289d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.640 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ab466e70-ac0f-4ff2-b1ac-e05dccc57805]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.641 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a55d5d53-a75e-475b-b512-c43103e35f28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.655 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bec3ea23-61e0-4d32-8026-e7d90881a635]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383102, 'reachable_time': 26923, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216057, 'error': None, 'target': 'ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:51 compute-0 systemd[1]: run-netns-ovnmeta\x2d135f4ca0\x2db287\x2d4f82\x2d8393\x2da426855e9926.mount: Deactivated successfully.
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.657 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-135f4ca0-b287-4f82-8393-a426855e9926 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:48:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:48:51.658 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[08fb7ce4-f418-4eff-a2d6-c989033857a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:48:52 compute-0 nova_compute[182935]: 2026-01-21 23:48:52.513 182939 DEBUG nova.network.neutron [-] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:48:52 compute-0 nova_compute[182935]: 2026-01-21 23:48:52.551 182939 INFO nova.compute.manager [-] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Took 0.93 seconds to deallocate network for instance.
Jan 21 23:48:52 compute-0 nova_compute[182935]: 2026-01-21 23:48:52.641 182939 DEBUG oslo_concurrency.lockutils [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:52 compute-0 nova_compute[182935]: 2026-01-21 23:48:52.642 182939 DEBUG oslo_concurrency.lockutils [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:52 compute-0 nova_compute[182935]: 2026-01-21 23:48:52.644 182939 DEBUG nova.compute.manager [req-8760d725-9fab-4480-9d32-e5193ae3286a req-7c56ebc7-7976-4012-99f1-af0d6c7fd150 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Received event network-vif-deleted-c4a47993-b901-4550-97cf-5b9a89730459 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:52 compute-0 nova_compute[182935]: 2026-01-21 23:48:52.746 182939 DEBUG nova.compute.provider_tree [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:52 compute-0 nova_compute[182935]: 2026-01-21 23:48:52.765 182939 DEBUG nova.scheduler.client.report [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:52 compute-0 nova_compute[182935]: 2026-01-21 23:48:52.801 182939 DEBUG oslo_concurrency.lockutils [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:52 compute-0 nova_compute[182935]: 2026-01-21 23:48:52.833 182939 INFO nova.scheduler.client.report [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Deleted allocations for instance b448a112-7efc-4f54-b6db-0aabc1bf767d
Jan 21 23:48:52 compute-0 nova_compute[182935]: 2026-01-21 23:48:52.938 182939 DEBUG oslo_concurrency.lockutils [None req-c9eb3faf-f802-4375-94e5-f82868655416 0c3f927acf834c718155d5ee5dd81b19 2edcdd2e6c5a46cb95eb89874a9cb5f3 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.332 182939 DEBUG nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Received event network-vif-unplugged-c4a47993-b901-4550-97cf-5b9a89730459 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.333 182939 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.333 182939 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.333 182939 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.333 182939 DEBUG nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] No waiting events found dispatching network-vif-unplugged-c4a47993-b901-4550-97cf-5b9a89730459 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.334 182939 WARNING nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Received unexpected event network-vif-unplugged-c4a47993-b901-4550-97cf-5b9a89730459 for instance with vm_state deleted and task_state None.
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.334 182939 DEBUG nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Received event network-vif-plugged-c4a47993-b901-4550-97cf-5b9a89730459 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.334 182939 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.334 182939 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.334 182939 DEBUG oslo_concurrency.lockutils [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b448a112-7efc-4f54-b6db-0aabc1bf767d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.334 182939 DEBUG nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] No waiting events found dispatching network-vif-plugged-c4a47993-b901-4550-97cf-5b9a89730459 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:48:53 compute-0 nova_compute[182935]: 2026-01-21 23:48:53.334 182939 WARNING nova.compute.manager [req-6479c26c-60e2-430d-a170-c08f04d67656 req-8d06d496-8059-4fe7-8979-f17d137c3024 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Received unexpected event network-vif-plugged-c4a47993-b901-4550-97cf-5b9a89730459 for instance with vm_state deleted and task_state None.
Jan 21 23:48:54 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 23:48:54 compute-0 systemd[215570]: Activating special unit Exit the Session...
Jan 21 23:48:54 compute-0 systemd[215570]: Stopped target Main User Target.
Jan 21 23:48:54 compute-0 systemd[215570]: Stopped target Basic System.
Jan 21 23:48:54 compute-0 systemd[215570]: Stopped target Paths.
Jan 21 23:48:54 compute-0 systemd[215570]: Stopped target Sockets.
Jan 21 23:48:54 compute-0 systemd[215570]: Stopped target Timers.
Jan 21 23:48:54 compute-0 systemd[215570]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:48:54 compute-0 systemd[215570]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:48:54 compute-0 systemd[215570]: Closed D-Bus User Message Bus Socket.
Jan 21 23:48:54 compute-0 systemd[215570]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:48:54 compute-0 systemd[215570]: Removed slice User Application Slice.
Jan 21 23:48:54 compute-0 systemd[215570]: Reached target Shutdown.
Jan 21 23:48:54 compute-0 systemd[215570]: Finished Exit the Session.
Jan 21 23:48:54 compute-0 systemd[215570]: Reached target Exit the Session.
Jan 21 23:48:54 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 23:48:54 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 23:48:54 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 23:48:54 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 23:48:54 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 23:48:54 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 23:48:54 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 23:48:54 compute-0 nova_compute[182935]: 2026-01-21 23:48:54.796 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.426 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "4d84ec02-4252-4dab-8580-d9961b6e6afd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.427 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "4d84ec02-4252-4dab-8580-d9961b6e6afd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.473 182939 DEBUG nova.compute.manager [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.538 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.597 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.598 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.607 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.607 182939 INFO nova.compute.claims [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.789 182939 DEBUG nova.compute.provider_tree [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.809 182939 DEBUG nova.scheduler.client.report [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.831 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.832 182939 DEBUG nova.compute.manager [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.903 182939 DEBUG nova.compute.manager [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.904 182939 DEBUG nova.network.neutron [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.924 182939 INFO nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:48:56 compute-0 nova_compute[182935]: 2026-01-21 23:48:56.950 182939 DEBUG nova.compute.manager [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.107 182939 DEBUG nova.compute.manager [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.109 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.110 182939 INFO nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Creating image(s)
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.111 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.111 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.112 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.131 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.218 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.219 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.220 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.231 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.250 182939 DEBUG nova.network.neutron [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.251 182939 DEBUG nova.compute.manager [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.289 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.290 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.324 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.325 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.325 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.382 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.383 182939 DEBUG nova.virt.disk.api [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Checking if we can resize image /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.384 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.448 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.449 182939 DEBUG nova.virt.disk.api [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Cannot resize image /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.450 182939 DEBUG nova.objects.instance [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 4d84ec02-4252-4dab-8580-d9961b6e6afd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.493 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.493 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Ensure instance console log exists: /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.494 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.494 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.495 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.496 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.505 182939 WARNING nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.510 182939 DEBUG nova.virt.libvirt.host [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.510 182939 DEBUG nova.virt.libvirt.host [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.513 182939 DEBUG nova.virt.libvirt.host [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.514 182939 DEBUG nova.virt.libvirt.host [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.516 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.516 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:48:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ae7edcb5-f2bd-4344-9c7a-58d27b44fa9a',id=30,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-2099183814',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.516 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.517 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.517 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.517 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.517 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.517 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.518 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.518 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.518 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.518 182939 DEBUG nova.virt.hardware [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.523 182939 DEBUG nova.objects.instance [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4d84ec02-4252-4dab-8580-d9961b6e6afd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.539 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:48:57 compute-0 nova_compute[182935]:   <uuid>4d84ec02-4252-4dab-8580-d9961b6e6afd</uuid>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   <name>instance-0000001e</name>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <nova:name>tempest-MigrationsAdminTest-server-867500350</nova:name>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:48:57</nova:creationTime>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <nova:flavor name="tempest-test_resize_flavor_-2099183814">
Jan 21 23:48:57 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:48:57 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:48:57 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:48:57 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:48:57 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:48:57 compute-0 nova_compute[182935]:         <nova:user uuid="36d71830ce70436e97fbc17b6da8d3c6">tempest-MigrationsAdminTest-1559502816-project-member</nova:user>
Jan 21 23:48:57 compute-0 nova_compute[182935]:         <nova:project uuid="95574103d0094883861c58d01690e5a3">tempest-MigrationsAdminTest-1559502816</nova:project>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <system>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <entry name="serial">4d84ec02-4252-4dab-8580-d9961b6e6afd</entry>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <entry name="uuid">4d84ec02-4252-4dab-8580-d9961b6e6afd</entry>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     </system>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   <os>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   </os>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   <features>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   </features>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/console.log" append="off"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <video>
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     </video>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:48:57 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:48:57 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:48:57 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:48:57 compute-0 nova_compute[182935]: </domain>
Jan 21 23:48:57 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.608 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.608 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.609 182939 INFO nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Using config drive
Jan 21 23:48:57 compute-0 podman[216076]: 2026-01-21 23:48:57.675758521 +0000 UTC m=+0.089117151 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:48:57 compute-0 nova_compute[182935]: 2026-01-21 23:48:57.996 182939 INFO nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Creating config drive at /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config
Jan 21 23:48:58 compute-0 nova_compute[182935]: 2026-01-21 23:48:58.002 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoxziu83u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:48:58 compute-0 nova_compute[182935]: 2026-01-21 23:48:58.025 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:58 compute-0 nova_compute[182935]: 2026-01-21 23:48:58.128 182939 DEBUG oslo_concurrency.processutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoxziu83u" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:48:58 compute-0 systemd-machined[154182]: New machine qemu-18-instance-0000001e.
Jan 21 23:48:58 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-0000001e.
Jan 21 23:48:58 compute-0 nova_compute[182935]: 2026-01-21 23:48:58.814 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039323.8118663, 2038fd11-9c07-48d0-8092-d973d69d8eb9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:58 compute-0 nova_compute[182935]: 2026-01-21 23:48:58.814 182939 INFO nova.compute.manager [-] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] VM Stopped (Lifecycle Event)
Jan 21 23:48:58 compute-0 nova_compute[182935]: 2026-01-21 23:48:58.839 182939 DEBUG nova.compute.manager [None req-003e794c-87f9-454c-9846-945faae3c23d - - - - - -] [instance: 2038fd11-9c07-48d0-8092-d973d69d8eb9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.551 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039339.5506327, 4d84ec02-4252-4dab-8580-d9961b6e6afd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.553 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] VM Resumed (Lifecycle Event)
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.556 182939 DEBUG nova.compute.manager [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.557 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.562 182939 INFO nova.virt.libvirt.driver [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance spawned successfully.
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.562 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.607 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.616 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.622 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.623 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.623 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.624 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.625 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.626 182939 DEBUG nova.virt.libvirt.driver [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.639 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.640 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039339.5521154, 4d84ec02-4252-4dab-8580-d9961b6e6afd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.641 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] VM Started (Lifecycle Event)
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.666 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.671 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.699 182939 INFO nova.compute.manager [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Took 2.59 seconds to spawn the instance on the hypervisor.
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.700 182939 DEBUG nova.compute.manager [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.707 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.785 182939 INFO nova.compute.manager [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Took 3.24 seconds to build instance.
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.799 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:48:59 compute-0 nova_compute[182935]: 2026-01-21 23:48:59.810 182939 DEBUG oslo_concurrency.lockutils [None req-12d8e106-1c3a-4548-91ca-30cd4312b8a5 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "4d84ec02-4252-4dab-8580-d9961b6e6afd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:01 compute-0 nova_compute[182935]: 2026-01-21 23:49:01.541 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:01 compute-0 podman[216139]: 2026-01-21 23:49:01.717791725 +0000 UTC m=+0.075759377 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 21 23:49:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:49:03.183 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:49:03.184 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:49:03.184 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:04 compute-0 nova_compute[182935]: 2026-01-21 23:49:04.237 182939 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:04 compute-0 nova_compute[182935]: 2026-01-21 23:49:04.237 182939 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:04 compute-0 nova_compute[182935]: 2026-01-21 23:49:04.238 182939 DEBUG nova.network.neutron [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:49:04 compute-0 nova_compute[182935]: 2026-01-21 23:49:04.682 182939 DEBUG nova.network.neutron [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:04 compute-0 nova_compute[182935]: 2026-01-21 23:49:04.801 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:04 compute-0 nova_compute[182935]: 2026-01-21 23:49:04.946 182939 DEBUG nova.network.neutron [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:04 compute-0 nova_compute[182935]: 2026-01-21 23:49:04.972 182939 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:05 compute-0 nova_compute[182935]: 2026-01-21 23:49:05.123 182939 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 21 23:49:05 compute-0 nova_compute[182935]: 2026-01-21 23:49:05.124 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Creating file /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/445d8463c7f4457bbab3c61aecfe3e22.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 21 23:49:05 compute-0 nova_compute[182935]: 2026-01-21 23:49:05.124 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/445d8463c7f4457bbab3c61aecfe3e22.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:05 compute-0 nova_compute[182935]: 2026-01-21 23:49:05.641 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/445d8463c7f4457bbab3c61aecfe3e22.tmp" returned: 1 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:05 compute-0 nova_compute[182935]: 2026-01-21 23:49:05.643 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/445d8463c7f4457bbab3c61aecfe3e22.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 21 23:49:05 compute-0 nova_compute[182935]: 2026-01-21 23:49:05.643 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Creating directory /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 21 23:49:05 compute-0 nova_compute[182935]: 2026-01-21 23:49:05.643 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:05 compute-0 nova_compute[182935]: 2026-01-21 23:49:05.877 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd" returned: 0 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:05 compute-0 nova_compute[182935]: 2026-01-21 23:49:05.881 182939 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:49:06 compute-0 nova_compute[182935]: 2026-01-21 23:49:06.512 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039331.5100322, b448a112-7efc-4f54-b6db-0aabc1bf767d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:06 compute-0 nova_compute[182935]: 2026-01-21 23:49:06.513 182939 INFO nova.compute.manager [-] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] VM Stopped (Lifecycle Event)
Jan 21 23:49:06 compute-0 nova_compute[182935]: 2026-01-21 23:49:06.541 182939 DEBUG nova.compute.manager [None req-8c2e4859-89c8-4e1f-9d8e-e4b94df261a3 - - - - - -] [instance: b448a112-7efc-4f54-b6db-0aabc1bf767d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:06 compute-0 nova_compute[182935]: 2026-01-21 23:49:06.543 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:09 compute-0 podman[216162]: 2026-01-21 23:49:09.722507111 +0000 UTC m=+0.084390930 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:49:09 compute-0 podman[216161]: 2026-01-21 23:49:09.761122571 +0000 UTC m=+0.097302504 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal)
Jan 21 23:49:09 compute-0 nova_compute[182935]: 2026-01-21 23:49:09.804 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:11 compute-0 nova_compute[182935]: 2026-01-21 23:49:11.547 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:14 compute-0 nova_compute[182935]: 2026-01-21 23:49:14.805 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:15 compute-0 nova_compute[182935]: 2026-01-21 23:49:15.932 182939 DEBUG nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 23:49:16 compute-0 nova_compute[182935]: 2026-01-21 23:49:16.549 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:16 compute-0 nova_compute[182935]: 2026-01-21 23:49:16.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:18 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 21 23:49:18 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000001e.scope: Consumed 14.854s CPU time.
Jan 21 23:49:18 compute-0 systemd-machined[154182]: Machine qemu-18-instance-0000001e terminated.
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.748 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "6c95a28a-2c16-4735-ab27-676395cc034b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.748 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "6c95a28a-2c16-4735-ab27-676395cc034b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.777 182939 DEBUG nova.compute.manager [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.792 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.898 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.899 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.909 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.910 182939 INFO nova.compute.claims [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.946 182939 INFO nova.virt.libvirt.driver [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance shutdown successfully after 13 seconds.
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.952 182939 INFO nova.virt.libvirt.driver [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance destroyed successfully.
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.957 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.976 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.977 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.977 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:49:18 compute-0 nova_compute[182935]: 2026-01-21 23:49:18.978 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.020 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.021 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.092 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.094 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Copying file /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_resize/disk to 192.168.122.102:/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.094 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_resize/disk 192.168.122.102:/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.135 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.143 182939 DEBUG nova.compute.provider_tree [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.157 182939 DEBUG nova.scheduler.client.report [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.180 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.181 182939 DEBUG nova.compute.manager [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.247 182939 DEBUG nova.compute.manager [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.248 182939 DEBUG nova.network.neutron [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.315 182939 INFO nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.342 182939 DEBUG nova.compute.manager [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.456 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.477 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.478 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.498 182939 DEBUG nova.compute.manager [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.499 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.499 182939 INFO nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Creating image(s)
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.500 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "/var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.500 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "/var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.500 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "/var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.512 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.579 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.581 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.581 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.593 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.654 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.655 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.689 182939 DEBUG nova.network.neutron [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.690 182939 DEBUG nova.compute.manager [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.692 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.693 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.694 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.757 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.758 182939 DEBUG nova.virt.disk.api [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Checking if we can resize image /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.759 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.807 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.819 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.820 182939 DEBUG nova.virt.disk.api [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Cannot resize image /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.820 182939 DEBUG nova.objects.instance [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lazy-loading 'migration_context' on Instance uuid 6c95a28a-2c16-4735-ab27-676395cc034b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.841 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.842 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Ensure instance console log exists: /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.843 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.843 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.843 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.844 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.850 182939 WARNING nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.854 182939 DEBUG nova.virt.libvirt.host [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.854 182939 DEBUG nova.virt.libvirt.host [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.857 182939 DEBUG nova.virt.libvirt.host [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.857 182939 DEBUG nova.virt.libvirt.host [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.859 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.859 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.859 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.859 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.860 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.860 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.860 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.860 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.861 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.861 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.861 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.861 182939 DEBUG nova.virt.hardware [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.865 182939 DEBUG nova.objects.instance [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6c95a28a-2c16-4735-ab27-676395cc034b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.889 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:49:19 compute-0 nova_compute[182935]:   <uuid>6c95a28a-2c16-4735-ab27-676395cc034b</uuid>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   <name>instance-00000022</name>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <nova:name>tempest-LiveMigrationNegativeTest-server-1672035777</nova:name>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:49:19</nova:creationTime>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:49:19 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:49:19 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:49:19 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:49:19 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:49:19 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:49:19 compute-0 nova_compute[182935]:         <nova:user uuid="f63fa215646b41c79f42ebb0bdcfcea0">tempest-LiveMigrationNegativeTest-896104195-project-member</nova:user>
Jan 21 23:49:19 compute-0 nova_compute[182935]:         <nova:project uuid="d261e3eff0854b5c86b1fdf0c14f9027">tempest-LiveMigrationNegativeTest-896104195</nova:project>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <system>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <entry name="serial">6c95a28a-2c16-4735-ab27-676395cc034b</entry>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <entry name="uuid">6c95a28a-2c16-4735-ab27-676395cc034b</entry>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     </system>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   <os>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   </os>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   <features>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   </features>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk.config"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/console.log" append="off"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <video>
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     </video>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:49:19 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:49:19 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:49:19 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:49:19 compute-0 nova_compute[182935]: </domain>
Jan 21 23:49:19 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.910 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "scp -r /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_resize/disk 192.168.122.102:/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk" returned: 0 in 0.816s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.911 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Copying file /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.911 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_resize/disk.config 192.168.122.102:/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.958 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.958 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:19 compute-0 nova_compute[182935]: 2026-01-21 23:49:19.959 182939 INFO nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Using config drive
Jan 21 23:49:20 compute-0 nova_compute[182935]: 2026-01-21 23:49:20.142 182939 INFO nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Creating config drive at /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk.config
Jan 21 23:49:20 compute-0 nova_compute[182935]: 2026-01-21 23:49:20.147 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsmk662c6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:20 compute-0 nova_compute[182935]: 2026-01-21 23:49:20.164 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "scp -C -r /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_resize/disk.config 192.168.122.102:/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config" returned: 0 in 0.252s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:20 compute-0 nova_compute[182935]: 2026-01-21 23:49:20.165 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Copying file /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 21 23:49:20 compute-0 nova_compute[182935]: 2026-01-21 23:49:20.165 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_resize/disk.info 192.168.122.102:/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:20 compute-0 nova_compute[182935]: 2026-01-21 23:49:20.273 182939 DEBUG oslo_concurrency.processutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsmk662c6" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:20 compute-0 systemd-machined[154182]: New machine qemu-19-instance-00000022.
Jan 21 23:49:20 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000022.
Jan 21 23:49:20 compute-0 nova_compute[182935]: 2026-01-21 23:49:20.426 182939 DEBUG oslo_concurrency.processutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "scp -C -r /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_resize/disk.info 192.168.122.102:/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:20 compute-0 nova_compute[182935]: 2026-01-21 23:49:20.580 182939 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "4d84ec02-4252-4dab-8580-d9961b6e6afd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:20 compute-0 nova_compute[182935]: 2026-01-21 23:49:20.581 182939 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "4d84ec02-4252-4dab-8580-d9961b6e6afd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:20 compute-0 nova_compute[182935]: 2026-01-21 23:49:20.581 182939 DEBUG oslo_concurrency.lockutils [None req-f79a4c93-1692-4be0-9ec2-dc22359e4f87 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "4d84ec02-4252-4dab-8580-d9961b6e6afd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:20 compute-0 nova_compute[182935]: 2026-01-21 23:49:20.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:21 compute-0 podman[216278]: 2026-01-21 23:49:21.444133307 +0000 UTC m=+0.085738533 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.449 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039361.448829, 6c95a28a-2c16-4735-ab27-676395cc034b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.451 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] VM Resumed (Lifecycle Event)
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.457 182939 DEBUG nova.compute.manager [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.457 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.463 182939 INFO nova.virt.libvirt.driver [-] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Instance spawned successfully.
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.463 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:49:21 compute-0 podman[216276]: 2026-01-21 23:49:21.474004351 +0000 UTC m=+0.119343354 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.482 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.491 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.498 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.498 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.499 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.499 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.500 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.500 182939 DEBUG nova.virt.libvirt.driver [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.514 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.514 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039361.4491417, 6c95a28a-2c16-4735-ab27-676395cc034b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.514 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] VM Started (Lifecycle Event)
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.548 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.551 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.553 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.585 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.600 182939 INFO nova.compute.manager [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Took 2.10 seconds to spawn the instance on the hypervisor.
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.601 182939 DEBUG nova.compute.manager [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.687 182939 INFO nova.compute.manager [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Took 2.83 seconds to build instance.
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.712 182939 DEBUG oslo_concurrency.lockutils [None req-d1dcffd7-f191-48f8-8d29-39cde6a29287 f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "6c95a28a-2c16-4735-ab27-676395cc034b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.814 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.815 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.815 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.815 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:49:21 compute-0 nova_compute[182935]: 2026-01-21 23:49:21.917 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.001 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.002 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.064 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.072 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.150 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.152 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.215 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.222 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000001e, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.225 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.281 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.282 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.342 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.512 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.514 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5236MB free_disk=73.25918197631836GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.514 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.514 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.574 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Migration for instance 4d84ec02-4252-4dab-8580-d9961b6e6afd refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.597 182939 INFO nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Updating resource usage from migration 63e5814c-3982-49e0-b565-1e0faa0531cf
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.598 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Starting to track outgoing migration 63e5814c-3982-49e0-b565-1e0faa0531cf with flavor ae7edcb5-f2bd-4344-9c7a-58d27b44fa9a _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.632 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 2977f489-9f9d-43f7-a617-7556b7df5171 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.633 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.633 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Migration 63e5814c-3982-49e0-b565-1e0faa0531cf is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.633 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 6c95a28a-2c16-4735-ab27-676395cc034b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.633 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.634 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1088MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.767 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.791 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.834 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:49:22 compute-0 nova_compute[182935]: 2026-01-21 23:49:22.835 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:23 compute-0 nova_compute[182935]: 2026-01-21 23:49:23.483 182939 DEBUG nova.objects.instance [None req-9185b238-df6a-4b4c-9f41-6d80aab8cd8c a5a240057825449782c349fceda5b5d1 d3ef4a441fc8498193e574ed04bbe7c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6c95a28a-2c16-4735-ab27-676395cc034b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:23 compute-0 nova_compute[182935]: 2026-01-21 23:49:23.504 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039363.5045757, 6c95a28a-2c16-4735-ab27-676395cc034b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:23 compute-0 nova_compute[182935]: 2026-01-21 23:49:23.505 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] VM Paused (Lifecycle Event)
Jan 21 23:49:23 compute-0 nova_compute[182935]: 2026-01-21 23:49:23.524 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:23 compute-0 nova_compute[182935]: 2026-01-21 23:49:23.531 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:23 compute-0 nova_compute[182935]: 2026-01-21 23:49:23.555 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 21 23:49:23 compute-0 nova_compute[182935]: 2026-01-21 23:49:23.836 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:23 compute-0 nova_compute[182935]: 2026-01-21 23:49:23.836 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:24 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000022.scope: Deactivated successfully.
Jan 21 23:49:24 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000022.scope: Consumed 3.236s CPU time.
Jan 21 23:49:24 compute-0 systemd-machined[154182]: Machine qemu-19-instance-00000022 terminated.
Jan 21 23:49:24 compute-0 nova_compute[182935]: 2026-01-21 23:49:24.359 182939 DEBUG nova.compute.manager [None req-9185b238-df6a-4b4c-9f41-6d80aab8cd8c a5a240057825449782c349fceda5b5d1 d3ef4a441fc8498193e574ed04bbe7c8 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:24 compute-0 nova_compute[182935]: 2026-01-21 23:49:24.808 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:25 compute-0 nova_compute[182935]: 2026-01-21 23:49:25.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.219 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "6c95a28a-2c16-4735-ab27-676395cc034b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.220 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "6c95a28a-2c16-4735-ab27-676395cc034b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.220 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "6c95a28a-2c16-4735-ab27-676395cc034b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.220 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "6c95a28a-2c16-4735-ab27-676395cc034b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.220 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "6c95a28a-2c16-4735-ab27-676395cc034b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.235 182939 INFO nova.compute.manager [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Terminating instance
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.246 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "refresh_cache-6c95a28a-2c16-4735-ab27-676395cc034b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.246 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquired lock "refresh_cache-6c95a28a-2c16-4735-ab27-676395cc034b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.246 182939 DEBUG nova.network.neutron [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.554 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.612 182939 DEBUG nova.network.neutron [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.869 182939 DEBUG nova.network.neutron [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.901 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Releasing lock "refresh_cache-6c95a28a-2c16-4735-ab27-676395cc034b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.902 182939 DEBUG nova.compute.manager [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.909 182939 INFO nova.virt.libvirt.driver [-] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Instance destroyed successfully.
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.909 182939 DEBUG nova.objects.instance [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lazy-loading 'resources' on Instance uuid 6c95a28a-2c16-4735-ab27-676395cc034b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.924 182939 INFO nova.virt.libvirt.driver [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Deleting instance files /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b_del
Jan 21 23:49:26 compute-0 nova_compute[182935]: 2026-01-21 23:49:26.926 182939 INFO nova.virt.libvirt.driver [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Deletion of /var/lib/nova/instances/6c95a28a-2c16-4735-ab27-676395cc034b_del complete
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.029 182939 INFO nova.compute.manager [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Took 0.13 seconds to destroy the instance on the hypervisor.
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.030 182939 DEBUG oslo.service.loopingcall [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.031 182939 DEBUG nova.compute.manager [-] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.031 182939 DEBUG nova.network.neutron [-] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.189 182939 DEBUG nova.network.neutron [-] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.204 182939 DEBUG nova.network.neutron [-] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.220 182939 INFO nova.compute.manager [-] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Took 0.19 seconds to deallocate network for instance.
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.320 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.321 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.443 182939 DEBUG nova.compute.provider_tree [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.463 182939 DEBUG nova.scheduler.client.report [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.487 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.524 182939 INFO nova.scheduler.client.report [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Deleted allocations for instance 6c95a28a-2c16-4735-ab27-676395cc034b
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.606 182939 DEBUG oslo_concurrency.lockutils [None req-8c06e092-5cad-40c8-9fb9-5b13712fd8de f63fa215646b41c79f42ebb0bdcfcea0 d261e3eff0854b5c86b1fdf0c14f9027 - - default default] Lock "6c95a28a-2c16-4735-ab27-676395cc034b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.853 182939 INFO nova.compute.manager [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Swapping old allocation on dict_keys(['5f09a77c-505f-4bd3-ac26-41f43ebdf535']) held by migration 63e5814c-3982-49e0-b565-1e0faa0531cf for instance
Jan 21 23:49:27 compute-0 nova_compute[182935]: 2026-01-21 23:49:27.900 182939 DEBUG nova.scheduler.client.report [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Overwriting current allocation {'allocations': {'e96a8776-a298-4c19-937a-402cb8191067': {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}, 'generation': 18}}, 'project_id': '95574103d0094883861c58d01690e5a3', 'user_id': '36d71830ce70436e97fbc17b6da8d3c6', 'consumer_generation': 1} on consumer 4d84ec02-4252-4dab-8580-d9961b6e6afd move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.092 182939 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.093 182939 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.093 182939 DEBUG nova.network.neutron [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.262 182939 DEBUG nova.network.neutron [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.573 182939 DEBUG nova.network.neutron [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.593 182939 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.593 182939 DEBUG nova.virt.libvirt.driver [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.601 182939 DEBUG nova.virt.libvirt.driver [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.604 182939 WARNING nova.virt.libvirt.driver [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.609 182939 DEBUG nova.virt.libvirt.host [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.609 182939 DEBUG nova.virt.libvirt.host [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.613 182939 DEBUG nova.virt.libvirt.host [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.613 182939 DEBUG nova.virt.libvirt.host [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.614 182939 DEBUG nova.virt.libvirt.driver [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.614 182939 DEBUG nova.virt.hardware [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:48:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ae7edcb5-f2bd-4344-9c7a-58d27b44fa9a',id=30,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-2099183814',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.615 182939 DEBUG nova.virt.hardware [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.615 182939 DEBUG nova.virt.hardware [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.615 182939 DEBUG nova.virt.hardware [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.615 182939 DEBUG nova.virt.hardware [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.616 182939 DEBUG nova.virt.hardware [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.616 182939 DEBUG nova.virt.hardware [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.616 182939 DEBUG nova.virt.hardware [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.616 182939 DEBUG nova.virt.hardware [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.616 182939 DEBUG nova.virt.hardware [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.617 182939 DEBUG nova.virt.hardware [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.617 182939 DEBUG nova.objects.instance [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4d84ec02-4252-4dab-8580-d9961b6e6afd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.632 182939 DEBUG oslo_concurrency.processutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:28 compute-0 podman[216371]: 2026-01-21 23:49:28.681624527 +0000 UTC m=+0.054510126 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.695 182939 DEBUG oslo_concurrency.processutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.696 182939 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.697 182939 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.698 182939 DEBUG oslo_concurrency.lockutils [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:28 compute-0 nova_compute[182935]: 2026-01-21 23:49:28.700 182939 DEBUG nova.virt.libvirt.driver [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:49:28 compute-0 nova_compute[182935]:   <uuid>4d84ec02-4252-4dab-8580-d9961b6e6afd</uuid>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   <name>instance-0000001e</name>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <nova:name>tempest-MigrationsAdminTest-server-867500350</nova:name>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:49:28</nova:creationTime>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <nova:flavor name="tempest-test_resize_flavor_-2099183814">
Jan 21 23:49:28 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:49:28 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:49:28 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:49:28 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:49:28 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:49:28 compute-0 nova_compute[182935]:         <nova:user uuid="36d71830ce70436e97fbc17b6da8d3c6">tempest-MigrationsAdminTest-1559502816-project-member</nova:user>
Jan 21 23:49:28 compute-0 nova_compute[182935]:         <nova:project uuid="95574103d0094883861c58d01690e5a3">tempest-MigrationsAdminTest-1559502816</nova:project>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <system>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <entry name="serial">4d84ec02-4252-4dab-8580-d9961b6e6afd</entry>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <entry name="uuid">4d84ec02-4252-4dab-8580-d9961b6e6afd</entry>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     </system>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   <os>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   </os>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   <features>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   </features>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/disk.config"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd/console.log" append="off"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <video>
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     </video>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <input type="keyboard" bus="usb"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:49:28 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:49:28 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:49:28 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:49:28 compute-0 nova_compute[182935]: </domain>
Jan 21 23:49:28 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:49:28 compute-0 systemd-machined[154182]: New machine qemu-20-instance-0000001e.
Jan 21 23:49:28 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001e.
Jan 21 23:49:29 compute-0 sshd-session[216399]: Invalid user tomcat from 188.166.69.60 port 44634
Jan 21 23:49:29 compute-0 sshd-session[216399]: Connection closed by invalid user tomcat 188.166.69.60 port 44634 [preauth]
Jan 21 23:49:29 compute-0 nova_compute[182935]: 2026-01-21 23:49:29.688 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for 4d84ec02-4252-4dab-8580-d9961b6e6afd due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:49:29 compute-0 nova_compute[182935]: 2026-01-21 23:49:29.690 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039369.6882806, 4d84ec02-4252-4dab-8580-d9961b6e6afd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:29 compute-0 nova_compute[182935]: 2026-01-21 23:49:29.690 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] VM Resumed (Lifecycle Event)
Jan 21 23:49:29 compute-0 nova_compute[182935]: 2026-01-21 23:49:29.692 182939 DEBUG nova.compute.manager [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:49:29 compute-0 nova_compute[182935]: 2026-01-21 23:49:29.696 182939 INFO nova.virt.libvirt.driver [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance running successfully.
Jan 21 23:49:29 compute-0 nova_compute[182935]: 2026-01-21 23:49:29.697 182939 DEBUG nova.virt.libvirt.driver [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Jan 21 23:49:29 compute-0 nova_compute[182935]: 2026-01-21 23:49:29.810 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:29 compute-0 nova_compute[182935]: 2026-01-21 23:49:29.937 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:29 compute-0 nova_compute[182935]: 2026-01-21 23:49:29.942 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:30 compute-0 nova_compute[182935]: 2026-01-21 23:49:29.999 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 21 23:49:30 compute-0 nova_compute[182935]: 2026-01-21 23:49:30.000 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039369.6902025, 4d84ec02-4252-4dab-8580-d9961b6e6afd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:30 compute-0 nova_compute[182935]: 2026-01-21 23:49:30.000 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] VM Started (Lifecycle Event)
Jan 21 23:49:30 compute-0 nova_compute[182935]: 2026-01-21 23:49:30.041 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:30 compute-0 nova_compute[182935]: 2026-01-21 23:49:30.045 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:30 compute-0 nova_compute[182935]: 2026-01-21 23:49:30.079 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 21 23:49:30 compute-0 nova_compute[182935]: 2026-01-21 23:49:30.081 182939 INFO nova.compute.manager [None req-e8917893-674d-4211-b112-f1f8370b16ea 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Updating instance to original state: 'active'
Jan 21 23:49:31 compute-0 nova_compute[182935]: 2026-01-21 23:49:31.557 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:32 compute-0 podman[216426]: 2026-01-21 23:49:32.709961197 +0000 UTC m=+0.068094616 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 21 23:49:32 compute-0 ovn_controller[95047]: 2026-01-21T23:49:32Z|00144|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 21 23:49:33 compute-0 nova_compute[182935]: 2026-01-21 23:49:33.370 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:49:33.372 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:49:33 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:49:33.373 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:49:34 compute-0 nova_compute[182935]: 2026-01-21 23:49:34.812 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:36 compute-0 nova_compute[182935]: 2026-01-21 23:49:36.560 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:39 compute-0 nova_compute[182935]: 2026-01-21 23:49:39.361 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039364.3587828, 6c95a28a-2c16-4735-ab27-676395cc034b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:39 compute-0 nova_compute[182935]: 2026-01-21 23:49:39.363 182939 INFO nova.compute.manager [-] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] VM Stopped (Lifecycle Event)
Jan 21 23:49:39 compute-0 nova_compute[182935]: 2026-01-21 23:49:39.387 182939 DEBUG nova.compute.manager [None req-92de0248-5699-46a5-b15e-f97584eb59ec - - - - - -] [instance: 6c95a28a-2c16-4735-ab27-676395cc034b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:39 compute-0 nova_compute[182935]: 2026-01-21 23:49:39.814 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:40 compute-0 podman[216447]: 2026-01-21 23:49:40.807233905 +0000 UTC m=+0.077672541 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 21 23:49:40 compute-0 podman[216446]: 2026-01-21 23:49:40.832995742 +0000 UTC m=+0.109820519 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container)
Jan 21 23:49:41 compute-0 nova_compute[182935]: 2026-01-21 23:49:41.564 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:42 compute-0 nova_compute[182935]: 2026-01-21 23:49:42.597 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:42 compute-0 nova_compute[182935]: 2026-01-21 23:49:42.599 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:42 compute-0 nova_compute[182935]: 2026-01-21 23:49:42.599 182939 INFO nova.compute.manager [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Unshelving
Jan 21 23:49:42 compute-0 nova_compute[182935]: 2026-01-21 23:49:42.732 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:42 compute-0 nova_compute[182935]: 2026-01-21 23:49:42.734 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:42 compute-0 nova_compute[182935]: 2026-01-21 23:49:42.740 182939 DEBUG nova.objects.instance [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'pci_requests' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:42 compute-0 nova_compute[182935]: 2026-01-21 23:49:42.762 182939 DEBUG nova.objects.instance [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'numa_topology' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:42 compute-0 nova_compute[182935]: 2026-01-21 23:49:42.779 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:49:42 compute-0 nova_compute[182935]: 2026-01-21 23:49:42.780 182939 INFO nova.compute.claims [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:49:42 compute-0 nova_compute[182935]: 2026-01-21 23:49:42.991 182939 DEBUG nova.compute.provider_tree [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:43 compute-0 nova_compute[182935]: 2026-01-21 23:49:43.006 182939 DEBUG nova.scheduler.client.report [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:43 compute-0 nova_compute[182935]: 2026-01-21 23:49:43.069 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:43 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:49:43.376 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:49:43 compute-0 nova_compute[182935]: 2026-01-21 23:49:43.458 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:43 compute-0 nova_compute[182935]: 2026-01-21 23:49:43.459 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquired lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:43 compute-0 nova_compute[182935]: 2026-01-21 23:49:43.460 182939 DEBUG nova.network.neutron [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.057 182939 DEBUG nova.network.neutron [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.465 182939 DEBUG nova.network.neutron [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.487 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Releasing lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.490 182939 DEBUG nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.491 182939 INFO nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Creating image(s)
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.492 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.493 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.495 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.496 182939 DEBUG nova.objects.instance [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'trusted_certs' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.519 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "5a5493d740bb49aad4d429bc7765118b91ad220a" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.521 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "5a5493d740bb49aad4d429bc7765118b91ad220a" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:44 compute-0 nova_compute[182935]: 2026-01-21 23:49:44.816 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:46 compute-0 nova_compute[182935]: 2026-01-21 23:49:46.555 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:46 compute-0 nova_compute[182935]: 2026-01-21 23:49:46.572 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:46 compute-0 nova_compute[182935]: 2026-01-21 23:49:46.614 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a.part --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:46 compute-0 nova_compute[182935]: 2026-01-21 23:49:46.615 182939 DEBUG nova.virt.images [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] 0ef107b0-6f96-43b3-a80c-432c1178a5c4 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 21 23:49:46 compute-0 nova_compute[182935]: 2026-01-21 23:49:46.616 182939 DEBUG nova.privsep.utils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:49:46 compute-0 nova_compute[182935]: 2026-01-21 23:49:46.617 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a.part /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:46 compute-0 nova_compute[182935]: 2026-01-21 23:49:46.945 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a.part /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a.converted" returned: 0 in 0.328s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:46 compute-0 nova_compute[182935]: 2026-01-21 23:49:46.955 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.044 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a.converted --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.046 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "5a5493d740bb49aad4d429bc7765118b91ad220a" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.526s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.074 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.136 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.138 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "5a5493d740bb49aad4d429bc7765118b91ad220a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.139 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "5a5493d740bb49aad4d429bc7765118b91ad220a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.157 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.227 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.228 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a,backing_fmt=raw /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.261 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a,backing_fmt=raw /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.262 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "5a5493d740bb49aad4d429bc7765118b91ad220a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.263 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.322 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.323 182939 DEBUG nova.objects.instance [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'migration_context' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.354 182939 INFO nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Rebasing disk image.
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.355 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.416 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:47 compute-0 nova_compute[182935]: 2026-01-21 23:49:47.417 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 -F raw /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.591 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 -F raw /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk" returned: 0 in 2.174s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.592 182939 DEBUG nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.592 182939 DEBUG nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Ensure instance console log exists: /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.593 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.593 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.593 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.595 182939 DEBUG nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='686b82e541c2a90fc53d5b1a12b4e031',container_format='bare',created_at=2026-01-21T23:49:20Z,direct_url=<?>,disk_format='qcow2',id=0ef107b0-6f96-43b3-a80c-432c1178a5c4,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1721280846-shelved',owner='af45596abab74cc9aca5cbb551899c80',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2026-01-21T23:49:37Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.598 182939 WARNING nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.603 182939 DEBUG nova.virt.libvirt.host [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.603 182939 DEBUG nova.virt.libvirt.host [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.606 182939 DEBUG nova.virt.libvirt.host [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.607 182939 DEBUG nova.virt.libvirt.host [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.608 182939 DEBUG nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.608 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='686b82e541c2a90fc53d5b1a12b4e031',container_format='bare',created_at=2026-01-21T23:49:20Z,direct_url=<?>,disk_format='qcow2',id=0ef107b0-6f96-43b3-a80c-432c1178a5c4,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1721280846-shelved',owner='af45596abab74cc9aca5cbb551899c80',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2026-01-21T23:49:37Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.609 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.609 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.609 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.610 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.610 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.610 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.610 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.611 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.611 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.611 182939 DEBUG nova.virt.hardware [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.611 182939 DEBUG nova.objects.instance [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'vcpu_model' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.629 182939 DEBUG nova.objects.instance [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'pci_devices' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.647 182939 DEBUG nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:49:49 compute-0 nova_compute[182935]:   <uuid>fb3b64cb-7a89-4d0b-b821-db928d77b940</uuid>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   <name>instance-0000001d</name>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1721280846</nova:name>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:49:49</nova:creationTime>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:49:49 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:49:49 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:49:49 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:49:49 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:49:49 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:49:49 compute-0 nova_compute[182935]:         <nova:user uuid="98cbe317d3494846bdfe48215cfbc5c0">tempest-UnshelveToHostMultiNodesTest-32641639-project-member</nova:user>
Jan 21 23:49:49 compute-0 nova_compute[182935]:         <nova:project uuid="af45596abab74cc9aca5cbb551899c80">tempest-UnshelveToHostMultiNodesTest-32641639</nova:project>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="0ef107b0-6f96-43b3-a80c-432c1178a5c4"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <system>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <entry name="serial">fb3b64cb-7a89-4d0b-b821-db928d77b940</entry>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <entry name="uuid">fb3b64cb-7a89-4d0b-b821-db928d77b940</entry>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     </system>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   <os>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   </os>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   <features>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   </features>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/console.log" append="off"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <video>
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     </video>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <input type="keyboard" bus="usb"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:49:49 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:49:49 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:49:49 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:49:49 compute-0 nova_compute[182935]: </domain>
Jan 21 23:49:49 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.710 182939 DEBUG nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.711 182939 DEBUG nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.712 182939 INFO nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Using config drive
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.734 182939 DEBUG nova.objects.instance [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'ec2_ids' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.807 182939 DEBUG nova.objects.instance [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lazy-loading 'keypairs' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:49 compute-0 nova_compute[182935]: 2026-01-21 23:49:49.818 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.080 182939 INFO nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Creating config drive at /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.085 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1rej5b6h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.221 182939 DEBUG oslo_concurrency.processutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1rej5b6h" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:49:50 compute-0 systemd-machined[154182]: New machine qemu-21-instance-0000001d.
Jan 21 23:49:50 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001d.
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.588 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039390.587582, fb3b64cb-7a89-4d0b-b821-db928d77b940 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.589 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] VM Resumed (Lifecycle Event)
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.592 182939 DEBUG nova.compute.manager [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.592 182939 DEBUG nova.virt.libvirt.driver [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.596 182939 INFO nova.virt.libvirt.driver [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance spawned successfully.
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.620 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.623 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.647 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.648 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039390.588737, fb3b64cb-7a89-4d0b-b821-db928d77b940 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.648 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] VM Started (Lifecycle Event)
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.677 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.680 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:49:50 compute-0 nova_compute[182935]: 2026-01-21 23:49:50.707 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:49:51 compute-0 nova_compute[182935]: 2026-01-21 23:49:51.575 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:51 compute-0 podman[216561]: 2026-01-21 23:49:51.728883415 +0000 UTC m=+0.082909385 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:49:51 compute-0 nova_compute[182935]: 2026-01-21 23:49:51.765 182939 DEBUG nova.compute.manager [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:49:51 compute-0 podman[216560]: 2026-01-21 23:49:51.777141893 +0000 UTC m=+0.124200079 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 23:49:51 compute-0 nova_compute[182935]: 2026-01-21 23:49:51.851 182939 DEBUG oslo_concurrency.lockutils [None req-9aa44c58-370c-4c8a-b131-202480a45d6c 6ecb82537aea43e49dbf7e72fc5cb2fd 8857c3f1f4694ee0b967080721188d60 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 9.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:53 compute-0 nova_compute[182935]: 2026-01-21 23:49:53.732 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:53 compute-0 nova_compute[182935]: 2026-01-21 23:49:53.733 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:53 compute-0 nova_compute[182935]: 2026-01-21 23:49:53.733 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "fb3b64cb-7a89-4d0b-b821-db928d77b940-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:53 compute-0 nova_compute[182935]: 2026-01-21 23:49:53.733 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:53 compute-0 nova_compute[182935]: 2026-01-21 23:49:53.734 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:53 compute-0 nova_compute[182935]: 2026-01-21 23:49:53.744 182939 INFO nova.compute.manager [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Terminating instance
Jan 21 23:49:53 compute-0 nova_compute[182935]: 2026-01-21 23:49:53.756 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:49:53 compute-0 nova_compute[182935]: 2026-01-21 23:49:53.756 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquired lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:49:53 compute-0 nova_compute[182935]: 2026-01-21 23:49:53.756 182939 DEBUG nova.network.neutron [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:49:54 compute-0 nova_compute[182935]: 2026-01-21 23:49:54.723 182939 DEBUG nova.network.neutron [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:54 compute-0 nova_compute[182935]: 2026-01-21 23:49:54.820 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:55 compute-0 nova_compute[182935]: 2026-01-21 23:49:55.132 182939 DEBUG nova.network.neutron [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:55 compute-0 nova_compute[182935]: 2026-01-21 23:49:55.152 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Releasing lock "refresh_cache-fb3b64cb-7a89-4d0b-b821-db928d77b940" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:49:55 compute-0 nova_compute[182935]: 2026-01-21 23:49:55.153 182939 DEBUG nova.compute.manager [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:49:55 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 21 23:49:55 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001d.scope: Consumed 4.962s CPU time.
Jan 21 23:49:55 compute-0 systemd-machined[154182]: Machine qemu-21-instance-0000001d terminated.
Jan 21 23:49:55 compute-0 nova_compute[182935]: 2026-01-21 23:49:55.409 182939 INFO nova.virt.libvirt.driver [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance destroyed successfully.
Jan 21 23:49:55 compute-0 nova_compute[182935]: 2026-01-21 23:49:55.410 182939 DEBUG nova.objects.instance [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lazy-loading 'resources' on Instance uuid fb3b64cb-7a89-4d0b-b821-db928d77b940 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:49:55 compute-0 nova_compute[182935]: 2026-01-21 23:49:55.426 182939 INFO nova.virt.libvirt.driver [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Deleting instance files /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940_del
Jan 21 23:49:55 compute-0 nova_compute[182935]: 2026-01-21 23:49:55.432 182939 INFO nova.virt.libvirt.driver [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Deletion of /var/lib/nova/instances/fb3b64cb-7a89-4d0b-b821-db928d77b940_del complete
Jan 21 23:49:55 compute-0 nova_compute[182935]: 2026-01-21 23:49:55.503 182939 INFO nova.compute.manager [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 21 23:49:55 compute-0 nova_compute[182935]: 2026-01-21 23:49:55.504 182939 DEBUG oslo.service.loopingcall [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:49:55 compute-0 nova_compute[182935]: 2026-01-21 23:49:55.505 182939 DEBUG nova.compute.manager [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:49:55 compute-0 nova_compute[182935]: 2026-01-21 23:49:55.505 182939 DEBUG nova.network.neutron [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:49:56 compute-0 nova_compute[182935]: 2026-01-21 23:49:56.160 182939 DEBUG nova.network.neutron [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:49:56 compute-0 nova_compute[182935]: 2026-01-21 23:49:56.176 182939 DEBUG nova.network.neutron [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:49:56 compute-0 nova_compute[182935]: 2026-01-21 23:49:56.198 182939 INFO nova.compute.manager [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Took 0.69 seconds to deallocate network for instance.
Jan 21 23:49:56 compute-0 nova_compute[182935]: 2026-01-21 23:49:56.294 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:49:56 compute-0 nova_compute[182935]: 2026-01-21 23:49:56.294 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:49:56 compute-0 nova_compute[182935]: 2026-01-21 23:49:56.417 182939 DEBUG nova.compute.provider_tree [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:49:56 compute-0 nova_compute[182935]: 2026-01-21 23:49:56.436 182939 DEBUG nova.scheduler.client.report [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:49:56 compute-0 nova_compute[182935]: 2026-01-21 23:49:56.480 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:56 compute-0 nova_compute[182935]: 2026-01-21 23:49:56.506 182939 INFO nova.scheduler.client.report [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Deleted allocations for instance fb3b64cb-7a89-4d0b-b821-db928d77b940
Jan 21 23:49:56 compute-0 nova_compute[182935]: 2026-01-21 23:49:56.578 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:49:56 compute-0 nova_compute[182935]: 2026-01-21 23:49:56.620 182939 DEBUG oslo_concurrency.lockutils [None req-f5477a4e-c55c-4f0a-84b6-f090cfc0898a 98cbe317d3494846bdfe48215cfbc5c0 af45596abab74cc9aca5cbb551899c80 - - default default] Lock "fb3b64cb-7a89-4d0b-b821-db928d77b940" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.887s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:49:59 compute-0 podman[216629]: 2026-01-21 23:49:59.718638679 +0000 UTC m=+0.085791973 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:49:59 compute-0 nova_compute[182935]: 2026-01-21 23:49:59.822 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:01 compute-0 nova_compute[182935]: 2026-01-21 23:50:01.581 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:03.185 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:03.185 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:03.185 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:03 compute-0 podman[216653]: 2026-01-21 23:50:03.702128381 +0000 UTC m=+0.062459683 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 21 23:50:04 compute-0 nova_compute[182935]: 2026-01-21 23:50:04.824 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:06 compute-0 nova_compute[182935]: 2026-01-21 23:50:06.584 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:09 compute-0 nova_compute[182935]: 2026-01-21 23:50:09.827 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:10 compute-0 nova_compute[182935]: 2026-01-21 23:50:10.407 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039395.4056509, fb3b64cb-7a89-4d0b-b821-db928d77b940 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:10 compute-0 nova_compute[182935]: 2026-01-21 23:50:10.408 182939 INFO nova.compute.manager [-] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] VM Stopped (Lifecycle Event)
Jan 21 23:50:10 compute-0 nova_compute[182935]: 2026-01-21 23:50:10.429 182939 DEBUG nova.compute.manager [None req-5d1f86c4-9ba2-447d-bc65-53746421fe97 - - - - - -] [instance: fb3b64cb-7a89-4d0b-b821-db928d77b940] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:10 compute-0 sshd-session[216672]: Invalid user tomcat from 188.166.69.60 port 48498
Jan 21 23:50:11 compute-0 podman[216675]: 2026-01-21 23:50:11.040636244 +0000 UTC m=+0.061393460 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 23:50:11 compute-0 podman[216674]: 2026-01-21 23:50:11.044321931 +0000 UTC m=+0.069190763 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Jan 21 23:50:11 compute-0 sshd-session[216672]: Connection closed by invalid user tomcat 188.166.69.60 port 48498 [preauth]
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.113 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "4d84ec02-4252-4dab-8580-d9961b6e6afd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.114 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "4d84ec02-4252-4dab-8580-d9961b6e6afd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.114 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "4d84ec02-4252-4dab-8580-d9961b6e6afd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.114 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "4d84ec02-4252-4dab-8580-d9961b6e6afd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.115 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "4d84ec02-4252-4dab-8580-d9961b6e6afd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.128 182939 INFO nova.compute.manager [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Terminating instance
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.140 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.141 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.141 182939 DEBUG nova.network.neutron [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.586 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.655 182939 DEBUG nova.network.neutron [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:50:11 compute-0 nova_compute[182935]: 2026-01-21 23:50:11.997 182939 DEBUG nova.network.neutron [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.022 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-4d84ec02-4252-4dab-8580-d9961b6e6afd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.023 182939 DEBUG nova.compute.manager [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:50:12 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Jan 21 23:50:12 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001e.scope: Consumed 15.886s CPU time.
Jan 21 23:50:12 compute-0 systemd-machined[154182]: Machine qemu-20-instance-0000001e terminated.
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.271 182939 INFO nova.virt.libvirt.driver [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance destroyed successfully.
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.271 182939 DEBUG nova.objects.instance [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'resources' on Instance uuid 4d84ec02-4252-4dab-8580-d9961b6e6afd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.297 182939 INFO nova.virt.libvirt.driver [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Deleting instance files /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_del
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.308 182939 INFO nova.virt.libvirt.driver [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Deletion of /var/lib/nova/instances/4d84ec02-4252-4dab-8580-d9961b6e6afd_del complete
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.399 182939 INFO nova.compute.manager [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.400 182939 DEBUG oslo.service.loopingcall [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.400 182939 DEBUG nova.compute.manager [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.400 182939 DEBUG nova.network.neutron [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.766 182939 DEBUG nova.network.neutron [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.790 182939 DEBUG nova.network.neutron [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.820 182939 INFO nova.compute.manager [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Took 0.42 seconds to deallocate network for instance.
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.918 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.919 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.925 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:12 compute-0 nova_compute[182935]: 2026-01-21 23:50:12.960 182939 INFO nova.scheduler.client.report [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Deleted allocations for instance 4d84ec02-4252-4dab-8580-d9961b6e6afd
Jan 21 23:50:13 compute-0 nova_compute[182935]: 2026-01-21 23:50:13.040 182939 DEBUG oslo_concurrency.lockutils [None req-66a94e75-5c2d-462e-aabc-90636e569aa9 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "4d84ec02-4252-4dab-8580-d9961b6e6afd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.926s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:14 compute-0 nova_compute[182935]: 2026-01-21 23:50:14.829 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:15 compute-0 nova_compute[182935]: 2026-01-21 23:50:15.549 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:15 compute-0 nova_compute[182935]: 2026-01-21 23:50:15.550 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:15 compute-0 nova_compute[182935]: 2026-01-21 23:50:15.550 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:15 compute-0 nova_compute[182935]: 2026-01-21 23:50:15.550 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:15 compute-0 nova_compute[182935]: 2026-01-21 23:50:15.551 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:15 compute-0 nova_compute[182935]: 2026-01-21 23:50:15.574 182939 INFO nova.compute.manager [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Terminating instance
Jan 21 23:50:15 compute-0 nova_compute[182935]: 2026-01-21 23:50:15.584 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:15 compute-0 nova_compute[182935]: 2026-01-21 23:50:15.584 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:50:15 compute-0 nova_compute[182935]: 2026-01-21 23:50:15.584 182939 DEBUG nova.network.neutron [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:50:15 compute-0 nova_compute[182935]: 2026-01-21 23:50:15.849 182939 DEBUG nova.network.neutron [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.407 182939 DEBUG nova.network.neutron [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.445 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-63b2e61e-8ad4-44e9-ba44-db37454a4b34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.446 182939 DEBUG nova.compute.manager [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:50:16 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Jan 21 23:50:16 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001a.scope: Consumed 15.818s CPU time.
Jan 21 23:50:16 compute-0 systemd-machined[154182]: Machine qemu-16-instance-0000001a terminated.
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.593 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.712 182939 INFO nova.virt.libvirt.driver [-] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance destroyed successfully.
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.713 182939 DEBUG nova.objects.instance [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'resources' on Instance uuid 63b2e61e-8ad4-44e9-ba44-db37454a4b34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.737 182939 INFO nova.virt.libvirt.driver [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Deleting instance files /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34_del
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.740 182939 INFO nova.virt.libvirt.driver [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Deletion of /var/lib/nova/instances/63b2e61e-8ad4-44e9-ba44-db37454a4b34_del complete
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.840 182939 INFO nova.compute.manager [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.841 182939 DEBUG oslo.service.loopingcall [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.841 182939 DEBUG nova.compute.manager [-] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:50:16 compute-0 nova_compute[182935]: 2026-01-21 23:50:16.841 182939 DEBUG nova.network.neutron [-] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:50:17 compute-0 nova_compute[182935]: 2026-01-21 23:50:17.024 182939 DEBUG nova.network.neutron [-] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:50:17 compute-0 nova_compute[182935]: 2026-01-21 23:50:17.065 182939 DEBUG nova.network.neutron [-] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:17 compute-0 nova_compute[182935]: 2026-01-21 23:50:17.109 182939 INFO nova.compute.manager [-] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Took 0.27 seconds to deallocate network for instance.
Jan 21 23:50:17 compute-0 nova_compute[182935]: 2026-01-21 23:50:17.209 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:17 compute-0 nova_compute[182935]: 2026-01-21 23:50:17.210 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:17 compute-0 nova_compute[182935]: 2026-01-21 23:50:17.336 182939 DEBUG nova.compute.provider_tree [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:50:17 compute-0 nova_compute[182935]: 2026-01-21 23:50:17.355 182939 DEBUG nova.scheduler.client.report [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:50:17 compute-0 nova_compute[182935]: 2026-01-21 23:50:17.384 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:17 compute-0 nova_compute[182935]: 2026-01-21 23:50:17.411 182939 INFO nova.scheduler.client.report [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Deleted allocations for instance 63b2e61e-8ad4-44e9-ba44-db37454a4b34
Jan 21 23:50:17 compute-0 nova_compute[182935]: 2026-01-21 23:50:17.519 182939 DEBUG oslo_concurrency.lockutils [None req-9071151b-4f51-4a15-a514-5a11cdf4d65c 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "63b2e61e-8ad4-44e9-ba44-db37454a4b34" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.969s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:18 compute-0 nova_compute[182935]: 2026-01-21 23:50:18.713 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "2977f489-9f9d-43f7-a617-7556b7df5171" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:18 compute-0 nova_compute[182935]: 2026-01-21 23:50:18.714 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "2977f489-9f9d-43f7-a617-7556b7df5171" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:18 compute-0 nova_compute[182935]: 2026-01-21 23:50:18.715 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "2977f489-9f9d-43f7-a617-7556b7df5171-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:18 compute-0 nova_compute[182935]: 2026-01-21 23:50:18.715 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "2977f489-9f9d-43f7-a617-7556b7df5171-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:18 compute-0 nova_compute[182935]: 2026-01-21 23:50:18.716 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "2977f489-9f9d-43f7-a617-7556b7df5171-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:18 compute-0 nova_compute[182935]: 2026-01-21 23:50:18.732 182939 INFO nova.compute.manager [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Terminating instance
Jan 21 23:50:18 compute-0 nova_compute[182935]: 2026-01-21 23:50:18.745 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:18 compute-0 nova_compute[182935]: 2026-01-21 23:50:18.745 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquired lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:50:18 compute-0 nova_compute[182935]: 2026-01-21 23:50:18.746 182939 DEBUG nova.network.neutron [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:50:18 compute-0 nova_compute[182935]: 2026-01-21 23:50:18.977 182939 DEBUG nova.network.neutron [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.236 182939 DEBUG nova.network.neutron [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.260 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Releasing lock "refresh_cache-2977f489-9f9d-43f7-a617-7556b7df5171" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.261 182939 DEBUG nova.compute.manager [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:50:19 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000017.scope: Deactivated successfully.
Jan 21 23:50:19 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000017.scope: Consumed 17.381s CPU time.
Jan 21 23:50:19 compute-0 systemd-machined[154182]: Machine qemu-14-instance-00000017 terminated.
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.517 182939 INFO nova.virt.libvirt.driver [-] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance destroyed successfully.
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.518 182939 DEBUG nova.objects.instance [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lazy-loading 'resources' on Instance uuid 2977f489-9f9d-43f7-a617-7556b7df5171 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.533 182939 INFO nova.virt.libvirt.driver [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Deleting instance files /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171_del
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.536 182939 INFO nova.virt.libvirt.driver [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Deletion of /var/lib/nova/instances/2977f489-9f9d-43f7-a617-7556b7df5171_del complete
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.636 182939 INFO nova.compute.manager [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.637 182939 DEBUG oslo.service.loopingcall [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.638 182939 DEBUG nova.compute.manager [-] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.638 182939 DEBUG nova.network.neutron [-] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.831 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.839 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.873 182939 DEBUG nova.network.neutron [-] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.887 182939 DEBUG nova.network.neutron [-] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:19 compute-0 nova_compute[182935]: 2026-01-21 23:50:19.913 182939 INFO nova.compute.manager [-] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Took 0.27 seconds to deallocate network for instance.
Jan 21 23:50:20 compute-0 nova_compute[182935]: 2026-01-21 23:50:20.044 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:20 compute-0 nova_compute[182935]: 2026-01-21 23:50:20.045 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:20 compute-0 nova_compute[182935]: 2026-01-21 23:50:20.106 182939 DEBUG nova.compute.provider_tree [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:50:20 compute-0 nova_compute[182935]: 2026-01-21 23:50:20.126 182939 DEBUG nova.scheduler.client.report [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:50:20 compute-0 nova_compute[182935]: 2026-01-21 23:50:20.168 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:20 compute-0 nova_compute[182935]: 2026-01-21 23:50:20.202 182939 INFO nova.scheduler.client.report [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Deleted allocations for instance 2977f489-9f9d-43f7-a617-7556b7df5171
Jan 21 23:50:20 compute-0 nova_compute[182935]: 2026-01-21 23:50:20.282 182939 DEBUG oslo_concurrency.lockutils [None req-927b4413-32cd-47e7-8182-592310484150 36d71830ce70436e97fbc17b6da8d3c6 95574103d0094883861c58d01690e5a3 - - default default] Lock "2977f489-9f9d-43f7-a617-7556b7df5171" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:20 compute-0 nova_compute[182935]: 2026-01-21 23:50:20.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:21 compute-0 nova_compute[182935]: 2026-01-21 23:50:21.595 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:21 compute-0 nova_compute[182935]: 2026-01-21 23:50:21.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:21 compute-0 nova_compute[182935]: 2026-01-21 23:50:21.837 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:21 compute-0 nova_compute[182935]: 2026-01-21 23:50:21.837 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:21 compute-0 nova_compute[182935]: 2026-01-21 23:50:21.837 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:21 compute-0 nova_compute[182935]: 2026-01-21 23:50:21.838 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:50:21 compute-0 podman[216745]: 2026-01-21 23:50:21.960905019 +0000 UTC m=+0.066219164 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:50:21 compute-0 podman[216744]: 2026-01-21 23:50:21.988213094 +0000 UTC m=+0.096615891 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 23:50:22 compute-0 nova_compute[182935]: 2026-01-21 23:50:22.018 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:50:22 compute-0 nova_compute[182935]: 2026-01-21 23:50:22.020 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5711MB free_disk=73.27547454833984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:50:22 compute-0 nova_compute[182935]: 2026-01-21 23:50:22.020 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:22 compute-0 nova_compute[182935]: 2026-01-21 23:50:22.020 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:22 compute-0 nova_compute[182935]: 2026-01-21 23:50:22.078 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:50:22 compute-0 nova_compute[182935]: 2026-01-21 23:50:22.078 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:50:22 compute-0 nova_compute[182935]: 2026-01-21 23:50:22.112 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:50:22 compute-0 nova_compute[182935]: 2026-01-21 23:50:22.127 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:50:22 compute-0 nova_compute[182935]: 2026-01-21 23:50:22.161 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:50:22 compute-0 nova_compute[182935]: 2026-01-21 23:50:22.162 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:23 compute-0 nova_compute[182935]: 2026-01-21 23:50:23.156 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:23 compute-0 nova_compute[182935]: 2026-01-21 23:50:23.157 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.279 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.279 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.279 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:50:23.279 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:50:23 compute-0 nova_compute[182935]: 2026-01-21 23:50:23.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:23 compute-0 nova_compute[182935]: 2026-01-21 23:50:23.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:23 compute-0 nova_compute[182935]: 2026-01-21 23:50:23.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.193 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Acquiring lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.193 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.225 182939 DEBUG nova.compute.manager [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.383 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.384 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.394 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.394 182939 INFO nova.compute.claims [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.590 182939 DEBUG nova.compute.provider_tree [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.611 182939 DEBUG nova.scheduler.client.report [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.634 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.635 182939 DEBUG nova.compute.manager [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.708 182939 DEBUG nova.compute.manager [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.709 182939 DEBUG nova.network.neutron [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.732 182939 INFO nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.757 182939 DEBUG nova.compute.manager [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.833 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.889 182939 DEBUG nova.compute.manager [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.891 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.891 182939 INFO nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Creating image(s)
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.892 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Acquiring lock "/var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.892 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "/var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.893 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "/var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.905 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.981 182939 DEBUG nova.policy [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '782706408079485b86dfcb2709fe4bb9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6f05553fbb5e485ca9ac59d61797526f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.985 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.986 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:24 compute-0 nova_compute[182935]: 2026-01-21 23:50:24.987 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.002 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.079 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.081 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.115 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.116 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.117 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.185 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.186 182939 DEBUG nova.virt.disk.api [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Checking if we can resize image /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.187 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.250 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.251 182939 DEBUG nova.virt.disk.api [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Cannot resize image /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.251 182939 DEBUG nova.objects.instance [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lazy-loading 'migration_context' on Instance uuid 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.309 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.310 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Ensure instance console log exists: /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.311 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.311 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.311 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:25 compute-0 nova_compute[182935]: 2026-01-21 23:50:25.936 182939 DEBUG nova.network.neutron [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Successfully created port: 10614c52-453a-42bc-a61e-19263126e133 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:50:26 compute-0 nova_compute[182935]: 2026-01-21 23:50:26.598 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:27 compute-0 nova_compute[182935]: 2026-01-21 23:50:27.270 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039412.2694812, 4d84ec02-4252-4dab-8580-d9961b6e6afd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:27 compute-0 nova_compute[182935]: 2026-01-21 23:50:27.271 182939 INFO nova.compute.manager [-] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] VM Stopped (Lifecycle Event)
Jan 21 23:50:27 compute-0 nova_compute[182935]: 2026-01-21 23:50:27.284 182939 DEBUG nova.network.neutron [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Successfully updated port: 10614c52-453a-42bc-a61e-19263126e133 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:50:27 compute-0 nova_compute[182935]: 2026-01-21 23:50:27.306 182939 DEBUG nova.compute.manager [None req-2f672eb0-668e-40b6-a8fb-0a905092fafc - - - - - -] [instance: 4d84ec02-4252-4dab-8580-d9961b6e6afd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:27 compute-0 nova_compute[182935]: 2026-01-21 23:50:27.311 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Acquiring lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:27 compute-0 nova_compute[182935]: 2026-01-21 23:50:27.312 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Acquired lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:50:27 compute-0 nova_compute[182935]: 2026-01-21 23:50:27.313 182939 DEBUG nova.network.neutron [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:50:27 compute-0 nova_compute[182935]: 2026-01-21 23:50:27.543 182939 DEBUG nova.network.neutron [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:50:27 compute-0 nova_compute[182935]: 2026-01-21 23:50:27.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.152 182939 DEBUG nova.compute.manager [req-742e0425-26a3-4272-b049-2195a272ba61 req-a3ee4fab-504f-496a-858f-32a1eb34998b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Received event network-changed-10614c52-453a-42bc-a61e-19263126e133 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.153 182939 DEBUG nova.compute.manager [req-742e0425-26a3-4272-b049-2195a272ba61 req-a3ee4fab-504f-496a-858f-32a1eb34998b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Refreshing instance network info cache due to event network-changed-10614c52-453a-42bc-a61e-19263126e133. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.153 182939 DEBUG oslo_concurrency.lockutils [req-742e0425-26a3-4272-b049-2195a272ba61 req-a3ee4fab-504f-496a-858f-32a1eb34998b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.457 182939 DEBUG nova.network.neutron [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Updating instance_info_cache with network_info: [{"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.486 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Releasing lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.486 182939 DEBUG nova.compute.manager [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Instance network_info: |[{"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.487 182939 DEBUG oslo_concurrency.lockutils [req-742e0425-26a3-4272-b049-2195a272ba61 req-a3ee4fab-504f-496a-858f-32a1eb34998b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.487 182939 DEBUG nova.network.neutron [req-742e0425-26a3-4272-b049-2195a272ba61 req-a3ee4fab-504f-496a-858f-32a1eb34998b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Refreshing network info cache for port 10614c52-453a-42bc-a61e-19263126e133 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.490 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Start _get_guest_xml network_info=[{"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.496 182939 WARNING nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.508 182939 DEBUG nova.virt.libvirt.host [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.509 182939 DEBUG nova.virt.libvirt.host [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.513 182939 DEBUG nova.virt.libvirt.host [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.514 182939 DEBUG nova.virt.libvirt.host [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.515 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.516 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.516 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.516 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.516 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.517 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.517 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.517 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.517 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.518 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.518 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.518 182939 DEBUG nova.virt.hardware [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.522 182939 DEBUG nova.virt.libvirt.vif [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:50:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-2118866858',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-2118866858',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-211886685',id=37,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6f05553fbb5e485ca9ac59d61797526f',ramdisk_id='',reservation_id='r-grzq9ecw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegativeT
estJSON-100954466',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-100954466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:50:24Z,user_data=None,user_id='782706408079485b86dfcb2709fe4bb9',uuid=3f09ea6e-f851-4c27-9ebd-0d4eec5bd236,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.523 182939 DEBUG nova.network.os_vif_util [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Converting VIF {"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.524 182939 DEBUG nova.network.os_vif_util [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:f1:ba,bridge_name='br-int',has_traffic_filtering=True,id=10614c52-453a-42bc-a61e-19263126e133,network=Network(9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10614c52-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.525 182939 DEBUG nova.objects.instance [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.538 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:50:28 compute-0 nova_compute[182935]:   <uuid>3f09ea6e-f851-4c27-9ebd-0d4eec5bd236</uuid>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   <name>instance-00000025</name>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <nova:name>tempest-FloatingIPsAssociationNegativeTestJSON-server-2118866858</nova:name>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:50:28</nova:creationTime>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:50:28 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:50:28 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:50:28 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:50:28 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:50:28 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:50:28 compute-0 nova_compute[182935]:         <nova:user uuid="782706408079485b86dfcb2709fe4bb9">tempest-FloatingIPsAssociationNegativeTestJSON-100954466-project-member</nova:user>
Jan 21 23:50:28 compute-0 nova_compute[182935]:         <nova:project uuid="6f05553fbb5e485ca9ac59d61797526f">tempest-FloatingIPsAssociationNegativeTestJSON-100954466</nova:project>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:50:28 compute-0 nova_compute[182935]:         <nova:port uuid="10614c52-453a-42bc-a61e-19263126e133">
Jan 21 23:50:28 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <system>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <entry name="serial">3f09ea6e-f851-4c27-9ebd-0d4eec5bd236</entry>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <entry name="uuid">3f09ea6e-f851-4c27-9ebd-0d4eec5bd236</entry>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     </system>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   <os>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   </os>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   <features>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   </features>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk.config"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:66:f1:ba"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <target dev="tap10614c52-45"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/console.log" append="off"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <video>
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     </video>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:50:28 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:50:28 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:50:28 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:50:28 compute-0 nova_compute[182935]: </domain>
Jan 21 23:50:28 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.539 182939 DEBUG nova.compute.manager [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Preparing to wait for external event network-vif-plugged-10614c52-453a-42bc-a61e-19263126e133 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.540 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Acquiring lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.540 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.540 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.541 182939 DEBUG nova.virt.libvirt.vif [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:50:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-2118866858',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-2118866858',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-211886685',id=37,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6f05553fbb5e485ca9ac59d61797526f',ramdisk_id='',reservation_id='r-grzq9ecw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociatio
nNegativeTestJSON-100954466',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-100954466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:50:24Z,user_data=None,user_id='782706408079485b86dfcb2709fe4bb9',uuid=3f09ea6e-f851-4c27-9ebd-0d4eec5bd236,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.541 182939 DEBUG nova.network.os_vif_util [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Converting VIF {"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.542 182939 DEBUG nova.network.os_vif_util [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:f1:ba,bridge_name='br-int',has_traffic_filtering=True,id=10614c52-453a-42bc-a61e-19263126e133,network=Network(9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10614c52-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.543 182939 DEBUG os_vif [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:f1:ba,bridge_name='br-int',has_traffic_filtering=True,id=10614c52-453a-42bc-a61e-19263126e133,network=Network(9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10614c52-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.543 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.544 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.544 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.549 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.549 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap10614c52-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.550 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap10614c52-45, col_values=(('external_ids', {'iface-id': '10614c52-453a-42bc-a61e-19263126e133', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:f1:ba', 'vm-uuid': '3f09ea6e-f851-4c27-9ebd-0d4eec5bd236'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.551 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:28 compute-0 NetworkManager[55139]: <info>  [1769039428.5527] manager: (tap10614c52-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.553 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.560 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.560 182939 INFO os_vif [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:f1:ba,bridge_name='br-int',has_traffic_filtering=True,id=10614c52-453a-42bc-a61e-19263126e133,network=Network(9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10614c52-45')
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.621 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.621 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.622 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] No VIF found with MAC fa:16:3e:66:f1:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:50:28 compute-0 nova_compute[182935]: 2026-01-21 23:50:28.622 182939 INFO nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Using config drive
Jan 21 23:50:29 compute-0 nova_compute[182935]: 2026-01-21 23:50:29.438 182939 INFO nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Creating config drive at /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk.config
Jan 21 23:50:29 compute-0 nova_compute[182935]: 2026-01-21 23:50:29.444 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqksj_7vb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:29 compute-0 nova_compute[182935]: 2026-01-21 23:50:29.570 182939 DEBUG oslo_concurrency.processutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqksj_7vb" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:29 compute-0 kernel: tap10614c52-45: entered promiscuous mode
Jan 21 23:50:29 compute-0 NetworkManager[55139]: <info>  [1769039429.6388] manager: (tap10614c52-45): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Jan 21 23:50:29 compute-0 ovn_controller[95047]: 2026-01-21T23:50:29Z|00145|binding|INFO|Claiming lport 10614c52-453a-42bc-a61e-19263126e133 for this chassis.
Jan 21 23:50:29 compute-0 ovn_controller[95047]: 2026-01-21T23:50:29Z|00146|binding|INFO|10614c52-453a-42bc-a61e-19263126e133: Claiming fa:16:3e:66:f1:ba 10.100.0.8
Jan 21 23:50:29 compute-0 nova_compute[182935]: 2026-01-21 23:50:29.639 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:29 compute-0 nova_compute[182935]: 2026-01-21 23:50:29.642 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:29 compute-0 nova_compute[182935]: 2026-01-21 23:50:29.645 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.652 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:f1:ba 10.100.0.8'], port_security=['fa:16:3e:66:f1:ba 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3f09ea6e-f851-4c27-9ebd-0d4eec5bd236', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6f05553fbb5e485ca9ac59d61797526f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '270dc604-cd9d-4305-95b2-895f7ed99803', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b5d73aa-d6c3-48e2-89f6-48f3b8f3a8a9, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=10614c52-453a-42bc-a61e-19263126e133) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.653 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 10614c52-453a-42bc-a61e-19263126e133 in datapath 9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398 bound to our chassis
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.654 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.669 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5e0f2943-3f77-4c61-91ae-fc3c5c384d98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.670 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9dd7a7ea-e1 in ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:50:29 compute-0 systemd-udevd[216830]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:50:29 compute-0 systemd-machined[154182]: New machine qemu-22-instance-00000025.
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.673 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9dd7a7ea-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.674 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2d68b93d-ea99-462a-88c7-a650e4ae4aa2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.675 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e6f99de4-d158-4210-8523-8b472701c311]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 NetworkManager[55139]: <info>  [1769039429.6845] device (tap10614c52-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:50:29 compute-0 NetworkManager[55139]: <info>  [1769039429.6851] device (tap10614c52-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.688 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[e0ca4df1-e6ca-44f0-983d-3a40b43b16ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-00000025.
Jan 21 23:50:29 compute-0 nova_compute[182935]: 2026-01-21 23:50:29.702 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:29 compute-0 ovn_controller[95047]: 2026-01-21T23:50:29Z|00147|binding|INFO|Setting lport 10614c52-453a-42bc-a61e-19263126e133 ovn-installed in OVS
Jan 21 23:50:29 compute-0 ovn_controller[95047]: 2026-01-21T23:50:29Z|00148|binding|INFO|Setting lport 10614c52-453a-42bc-a61e-19263126e133 up in Southbound
Jan 21 23:50:29 compute-0 nova_compute[182935]: 2026-01-21 23:50:29.706 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.715 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e92c69cb-1428-476e-a770-17def0bb8351]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.744 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[a84f1718-5ba4-410d-9780-9a0fea9c194b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.749 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed5a17c-7c02-405c-9e3e-2c8956edddac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 NetworkManager[55139]: <info>  [1769039429.7509] manager: (tap9dd7a7ea-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.783 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3767a4-3ef5-485c-8231-9d965ca83d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.789 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[6a4e9eda-d64d-4378-8d39-8968fadbf5d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 NetworkManager[55139]: <info>  [1769039429.8120] device (tap9dd7a7ea-e0): carrier: link connected
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.817 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[2b0faeee-5dde-4477-948a-428fcf1bd434]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 nova_compute[182935]: 2026-01-21 23:50:29.835 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.838 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9ee38e35-e4d6-473d-841a-aef9d2241ff9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9dd7a7ea-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0b:26:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393255, 'reachable_time': 35669, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216874, 'error': None, 'target': 'ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 podman[216851]: 2026-01-21 23:50:29.864738833 +0000 UTC m=+0.071201359 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.870 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[76065050-3390-4fda-903b-29b78f910b54]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0b:2650'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 393255, 'tstamp': 393255}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216886, 'error': None, 'target': 'ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.890 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d008a9aa-353a-4de9-8eda-7f6c05348724]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9dd7a7ea-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0b:26:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393255, 'reachable_time': 35669, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216894, 'error': None, 'target': 'ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:29.932 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8df3497f-1ac2-44b6-964e-f5611721a180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:29 compute-0 nova_compute[182935]: 2026-01-21 23:50:29.981 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039429.9805021, 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:29 compute-0 nova_compute[182935]: 2026-01-21 23:50:29.982 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] VM Started (Lifecycle Event)
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.007 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:30.011 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8f7bbf67-79c0-4914-a1e6-eb33cefbd5cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.013 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039429.980695, 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.013 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] VM Paused (Lifecycle Event)
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:30.014 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9dd7a7ea-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:30.014 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:30.015 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9dd7a7ea-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.017 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:30 compute-0 NetworkManager[55139]: <info>  [1769039430.0178] manager: (tap9dd7a7ea-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Jan 21 23:50:30 compute-0 kernel: tap9dd7a7ea-e0: entered promiscuous mode
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.019 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:30.020 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9dd7a7ea-e0, col_values=(('external_ids', {'iface-id': '3f5bd4f8-1457-4927-b0d0-09b8c38e9f1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.021 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:30 compute-0 ovn_controller[95047]: 2026-01-21T23:50:30Z|00149|binding|INFO|Releasing lport 3f5bd4f8-1457-4927-b0d0-09b8c38e9f1e from this chassis (sb_readonly=0)
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.022 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:30.023 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:30.024 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0d4e2d-76a4-4722-a6a5-382875c8fe74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:30.025 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398.pid.haproxy
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:50:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:30.026 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398', 'env', 'PROCESS_TAG=haproxy-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.036 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.047 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.051 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.077 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:50:30 compute-0 podman[216928]: 2026-01-21 23:50:30.42039811 +0000 UTC m=+0.056933306 container create df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:50:30 compute-0 systemd[1]: Started libpod-conmon-df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa.scope.
Jan 21 23:50:30 compute-0 podman[216928]: 2026-01-21 23:50:30.38988741 +0000 UTC m=+0.026422686 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.484 182939 DEBUG nova.network.neutron [req-742e0425-26a3-4272-b049-2195a272ba61 req-a3ee4fab-504f-496a-858f-32a1eb34998b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Updated VIF entry in instance network info cache for port 10614c52-453a-42bc-a61e-19263126e133. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.485 182939 DEBUG nova.network.neutron [req-742e0425-26a3-4272-b049-2195a272ba61 req-a3ee4fab-504f-496a-858f-32a1eb34998b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Updating instance_info_cache with network_info: [{"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:30 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:50:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81f9ac58177fd34756e3dc4c6984a3d90d095baab5f4b47d3a69e330f5d40aa0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:50:30 compute-0 nova_compute[182935]: 2026-01-21 23:50:30.504 182939 DEBUG oslo_concurrency.lockutils [req-742e0425-26a3-4272-b049-2195a272ba61 req-a3ee4fab-504f-496a-858f-32a1eb34998b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:50:30 compute-0 podman[216928]: 2026-01-21 23:50:30.511280416 +0000 UTC m=+0.147815612 container init df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:50:30 compute-0 podman[216928]: 2026-01-21 23:50:30.519767473 +0000 UTC m=+0.156302669 container start df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:50:30 compute-0 neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398[216944]: [NOTICE]   (216948) : New worker (216950) forked
Jan 21 23:50:30 compute-0 neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398[216944]: [NOTICE]   (216948) : Loading success.
Jan 21 23:50:31 compute-0 nova_compute[182935]: 2026-01-21 23:50:31.709 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039416.7084332, 63b2e61e-8ad4-44e9-ba44-db37454a4b34 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:31 compute-0 nova_compute[182935]: 2026-01-21 23:50:31.710 182939 INFO nova.compute.manager [-] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] VM Stopped (Lifecycle Event)
Jan 21 23:50:31 compute-0 nova_compute[182935]: 2026-01-21 23:50:31.741 182939 DEBUG nova.compute.manager [None req-dcc84fb4-dcc0-43b0-a0cf-069d1c8d86ab - - - - - -] [instance: 63b2e61e-8ad4-44e9-ba44-db37454a4b34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:33 compute-0 nova_compute[182935]: 2026-01-21 23:50:33.552 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:34 compute-0 nova_compute[182935]: 2026-01-21 23:50:34.517 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039419.5149868, 2977f489-9f9d-43f7-a617-7556b7df5171 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:34 compute-0 nova_compute[182935]: 2026-01-21 23:50:34.517 182939 INFO nova.compute.manager [-] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] VM Stopped (Lifecycle Event)
Jan 21 23:50:34 compute-0 nova_compute[182935]: 2026-01-21 23:50:34.555 182939 DEBUG nova.compute.manager [None req-aba0a4bb-5be8-4511-9deb-97a834978ebe - - - - - -] [instance: 2977f489-9f9d-43f7-a617-7556b7df5171] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:34 compute-0 podman[216959]: 2026-01-21 23:50:34.726463237 +0000 UTC m=+0.099649791 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 21 23:50:34 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 21 23:50:34 compute-0 nova_compute[182935]: 2026-01-21 23:50:34.837 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.146 182939 DEBUG nova.compute.manager [req-ca17bd5e-03fd-46d0-9128-b4795604dcf6 req-d1e24d57-d2d7-4b3e-b78e-6f43f3c1241c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Received event network-vif-plugged-10614c52-453a-42bc-a61e-19263126e133 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.146 182939 DEBUG oslo_concurrency.lockutils [req-ca17bd5e-03fd-46d0-9128-b4795604dcf6 req-d1e24d57-d2d7-4b3e-b78e-6f43f3c1241c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.147 182939 DEBUG oslo_concurrency.lockutils [req-ca17bd5e-03fd-46d0-9128-b4795604dcf6 req-d1e24d57-d2d7-4b3e-b78e-6f43f3c1241c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.147 182939 DEBUG oslo_concurrency.lockutils [req-ca17bd5e-03fd-46d0-9128-b4795604dcf6 req-d1e24d57-d2d7-4b3e-b78e-6f43f3c1241c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.147 182939 DEBUG nova.compute.manager [req-ca17bd5e-03fd-46d0-9128-b4795604dcf6 req-d1e24d57-d2d7-4b3e-b78e-6f43f3c1241c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Processing event network-vif-plugged-10614c52-453a-42bc-a61e-19263126e133 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.148 182939 DEBUG nova.compute.manager [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.153 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039435.1529799, 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.153 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] VM Resumed (Lifecycle Event)
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.156 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.161 182939 INFO nova.virt.libvirt.driver [-] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Instance spawned successfully.
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.162 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.192 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.200 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.204 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.204 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.205 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.205 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.206 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.206 182939 DEBUG nova.virt.libvirt.driver [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.290 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.386 182939 INFO nova.compute.manager [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Took 10.50 seconds to spawn the instance on the hypervisor.
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.387 182939 DEBUG nova.compute.manager [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.472 182939 INFO nova.compute.manager [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Took 11.14 seconds to build instance.
Jan 21 23:50:35 compute-0 nova_compute[182935]: 2026-01-21 23:50:35.495 182939 DEBUG oslo_concurrency.lockutils [None req-f8234e9b-07b7-445a-95e3-78fdd2788009 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:37 compute-0 nova_compute[182935]: 2026-01-21 23:50:37.257 182939 DEBUG nova.compute.manager [req-1158ec7b-d96b-428e-98bd-7b2045570f40 req-1fb07539-ad39-4a46-8333-e6e0f0740ff8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Received event network-vif-plugged-10614c52-453a-42bc-a61e-19263126e133 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:50:37 compute-0 nova_compute[182935]: 2026-01-21 23:50:37.258 182939 DEBUG oslo_concurrency.lockutils [req-1158ec7b-d96b-428e-98bd-7b2045570f40 req-1fb07539-ad39-4a46-8333-e6e0f0740ff8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:37 compute-0 nova_compute[182935]: 2026-01-21 23:50:37.259 182939 DEBUG oslo_concurrency.lockutils [req-1158ec7b-d96b-428e-98bd-7b2045570f40 req-1fb07539-ad39-4a46-8333-e6e0f0740ff8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:37 compute-0 nova_compute[182935]: 2026-01-21 23:50:37.259 182939 DEBUG oslo_concurrency.lockutils [req-1158ec7b-d96b-428e-98bd-7b2045570f40 req-1fb07539-ad39-4a46-8333-e6e0f0740ff8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:37 compute-0 nova_compute[182935]: 2026-01-21 23:50:37.259 182939 DEBUG nova.compute.manager [req-1158ec7b-d96b-428e-98bd-7b2045570f40 req-1fb07539-ad39-4a46-8333-e6e0f0740ff8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] No waiting events found dispatching network-vif-plugged-10614c52-453a-42bc-a61e-19263126e133 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:50:37 compute-0 nova_compute[182935]: 2026-01-21 23:50:37.259 182939 WARNING nova.compute.manager [req-1158ec7b-d96b-428e-98bd-7b2045570f40 req-1fb07539-ad39-4a46-8333-e6e0f0740ff8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Received unexpected event network-vif-plugged-10614c52-453a-42bc-a61e-19263126e133 for instance with vm_state active and task_state None.
Jan 21 23:50:38 compute-0 nova_compute[182935]: 2026-01-21 23:50:38.553 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:39 compute-0 nova_compute[182935]: 2026-01-21 23:50:39.839 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:40 compute-0 nova_compute[182935]: 2026-01-21 23:50:40.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:40 compute-0 NetworkManager[55139]: <info>  [1769039440.7872] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/65)
Jan 21 23:50:40 compute-0 NetworkManager[55139]: <info>  [1769039440.7878] device (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:50:40 compute-0 NetworkManager[55139]: <warn>  [1769039440.7880] device (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:50:40 compute-0 NetworkManager[55139]: <info>  [1769039440.7890] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/66)
Jan 21 23:50:40 compute-0 NetworkManager[55139]: <info>  [1769039440.7894] device (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 21 23:50:40 compute-0 NetworkManager[55139]: <warn>  [1769039440.7895] device (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 21 23:50:40 compute-0 NetworkManager[55139]: <info>  [1769039440.7902] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 21 23:50:40 compute-0 NetworkManager[55139]: <info>  [1769039440.7908] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Jan 21 23:50:40 compute-0 NetworkManager[55139]: <info>  [1769039440.7912] device (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 21 23:50:40 compute-0 NetworkManager[55139]: <info>  [1769039440.7915] device (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 21 23:50:40 compute-0 nova_compute[182935]: 2026-01-21 23:50:40.883 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:40 compute-0 ovn_controller[95047]: 2026-01-21T23:50:40Z|00150|binding|INFO|Releasing lport 3f5bd4f8-1457-4927-b0d0-09b8c38e9f1e from this chassis (sb_readonly=0)
Jan 21 23:50:40 compute-0 nova_compute[182935]: 2026-01-21 23:50:40.898 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:41 compute-0 nova_compute[182935]: 2026-01-21 23:50:41.183 182939 DEBUG nova.compute.manager [req-da89f1ef-b34c-4560-b3b7-66dfd303923a req-b7323bae-2444-4fb1-93f0-ac6f065eb78a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Received event network-changed-10614c52-453a-42bc-a61e-19263126e133 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:50:41 compute-0 nova_compute[182935]: 2026-01-21 23:50:41.184 182939 DEBUG nova.compute.manager [req-da89f1ef-b34c-4560-b3b7-66dfd303923a req-b7323bae-2444-4fb1-93f0-ac6f065eb78a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Refreshing instance network info cache due to event network-changed-10614c52-453a-42bc-a61e-19263126e133. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:50:41 compute-0 nova_compute[182935]: 2026-01-21 23:50:41.184 182939 DEBUG oslo_concurrency.lockutils [req-da89f1ef-b34c-4560-b3b7-66dfd303923a req-b7323bae-2444-4fb1-93f0-ac6f065eb78a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:41 compute-0 nova_compute[182935]: 2026-01-21 23:50:41.185 182939 DEBUG oslo_concurrency.lockutils [req-da89f1ef-b34c-4560-b3b7-66dfd303923a req-b7323bae-2444-4fb1-93f0-ac6f065eb78a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:50:41 compute-0 nova_compute[182935]: 2026-01-21 23:50:41.185 182939 DEBUG nova.network.neutron [req-da89f1ef-b34c-4560-b3b7-66dfd303923a req-b7323bae-2444-4fb1-93f0-ac6f065eb78a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Refreshing network info cache for port 10614c52-453a-42bc-a61e-19263126e133 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:50:41 compute-0 podman[216981]: 2026-01-21 23:50:41.721980622 +0000 UTC m=+0.088027220 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 23:50:41 compute-0 podman[216980]: 2026-01-21 23:50:41.723225822 +0000 UTC m=+0.094234545 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Jan 21 23:50:43 compute-0 nova_compute[182935]: 2026-01-21 23:50:43.514 182939 DEBUG nova.network.neutron [req-da89f1ef-b34c-4560-b3b7-66dfd303923a req-b7323bae-2444-4fb1-93f0-ac6f065eb78a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Updated VIF entry in instance network info cache for port 10614c52-453a-42bc-a61e-19263126e133. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:50:43 compute-0 nova_compute[182935]: 2026-01-21 23:50:43.514 182939 DEBUG nova.network.neutron [req-da89f1ef-b34c-4560-b3b7-66dfd303923a req-b7323bae-2444-4fb1-93f0-ac6f065eb78a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Updating instance_info_cache with network_info: [{"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:50:43 compute-0 nova_compute[182935]: 2026-01-21 23:50:43.553 182939 DEBUG oslo_concurrency.lockutils [req-da89f1ef-b34c-4560-b3b7-66dfd303923a req-b7323bae-2444-4fb1-93f0-ac6f065eb78a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:50:43 compute-0 nova_compute[182935]: 2026-01-21 23:50:43.556 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:44 compute-0 nova_compute[182935]: 2026-01-21 23:50:44.846 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:46.842 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:50:46 compute-0 nova_compute[182935]: 2026-01-21 23:50:46.843 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:46.846 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:50:47 compute-0 ovn_controller[95047]: 2026-01-21T23:50:47Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:66:f1:ba 10.100.0.8
Jan 21 23:50:47 compute-0 ovn_controller[95047]: 2026-01-21T23:50:47Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:66:f1:ba 10.100.0.8
Jan 21 23:50:48 compute-0 nova_compute[182935]: 2026-01-21 23:50:48.558 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:49 compute-0 nova_compute[182935]: 2026-01-21 23:50:49.891 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:51 compute-0 sshd-session[217034]: Invalid user tomcat from 188.166.69.60 port 59842
Jan 21 23:50:51 compute-0 sshd-session[217034]: Connection closed by invalid user tomcat 188.166.69.60 port 59842 [preauth]
Jan 21 23:50:52 compute-0 podman[217037]: 2026-01-21 23:50:52.705472413 +0000 UTC m=+0.069301275 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:50:52 compute-0 podman[217036]: 2026-01-21 23:50:52.767378264 +0000 UTC m=+0.125945243 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 23:50:53 compute-0 nova_compute[182935]: 2026-01-21 23:50:53.560 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:54 compute-0 nova_compute[182935]: 2026-01-21 23:50:54.894 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:55 compute-0 nova_compute[182935]: 2026-01-21 23:50:55.836 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "f36ee268-83c4-4567-bae3-ee40afbb7882" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:55 compute-0 nova_compute[182935]: 2026-01-21 23:50:55.836 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:50:55.851 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:50:55 compute-0 nova_compute[182935]: 2026-01-21 23:50:55.871 182939 DEBUG nova.compute.manager [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.116 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.117 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.125 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.125 182939 INFO nova.compute.claims [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.379 182939 DEBUG nova.compute.provider_tree [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.407 182939 DEBUG nova.scheduler.client.report [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.443 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.444 182939 DEBUG nova.compute.manager [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.557 182939 DEBUG nova.compute.manager [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.558 182939 DEBUG nova.network.neutron [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.609 182939 INFO nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.639 182939 DEBUG nova.compute.manager [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.715 182939 DEBUG nova.compute.manager [req-82847e51-5bed-48bb-9cce-5f0bb96f5dd2 req-59bca607-6ff0-46e5-81c5-600b6f1ef499 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Received event network-changed-10614c52-453a-42bc-a61e-19263126e133 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.716 182939 DEBUG nova.compute.manager [req-82847e51-5bed-48bb-9cce-5f0bb96f5dd2 req-59bca607-6ff0-46e5-81c5-600b6f1ef499 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Refreshing instance network info cache due to event network-changed-10614c52-453a-42bc-a61e-19263126e133. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.716 182939 DEBUG oslo_concurrency.lockutils [req-82847e51-5bed-48bb-9cce-5f0bb96f5dd2 req-59bca607-6ff0-46e5-81c5-600b6f1ef499 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.716 182939 DEBUG oslo_concurrency.lockutils [req-82847e51-5bed-48bb-9cce-5f0bb96f5dd2 req-59bca607-6ff0-46e5-81c5-600b6f1ef499 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.717 182939 DEBUG nova.network.neutron [req-82847e51-5bed-48bb-9cce-5f0bb96f5dd2 req-59bca607-6ff0-46e5-81c5-600b6f1ef499 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Refreshing network info cache for port 10614c52-453a-42bc-a61e-19263126e133 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.879 182939 DEBUG nova.compute.manager [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.880 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.881 182939 INFO nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Creating image(s)
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.881 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "/var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.881 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "/var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.882 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "/var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.894 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.956 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.957 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.958 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.969 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:56 compute-0 nova_compute[182935]: 2026-01-21 23:50:56.990 182939 DEBUG nova.policy [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8da2db8893d4442aaaada7d43ff2500f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bdcd24bf916b4c3aa2e173bea9dd7202', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.027 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.027 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.066 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.067 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.068 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.127 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.128 182939 DEBUG nova.virt.disk.api [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Checking if we can resize image /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.129 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.196 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.198 182939 DEBUG nova.virt.disk.api [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Cannot resize image /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.198 182939 DEBUG nova.objects.instance [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lazy-loading 'migration_context' on Instance uuid f36ee268-83c4-4567-bae3-ee40afbb7882 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.222 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.223 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Ensure instance console log exists: /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.223 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.224 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:50:57 compute-0 nova_compute[182935]: 2026-01-21 23:50:57.224 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:50:58 compute-0 nova_compute[182935]: 2026-01-21 23:50:58.562 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:50:58 compute-0 nova_compute[182935]: 2026-01-21 23:50:58.577 182939 DEBUG nova.network.neutron [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Successfully created port: 022cfedf-447c-4fd1-9013-480f624fe044 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:50:59 compute-0 nova_compute[182935]: 2026-01-21 23:50:59.471 182939 DEBUG nova.network.neutron [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Successfully updated port: 022cfedf-447c-4fd1-9013-480f624fe044 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:50:59 compute-0 nova_compute[182935]: 2026-01-21 23:50:59.496 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:59 compute-0 nova_compute[182935]: 2026-01-21 23:50:59.496 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquired lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:50:59 compute-0 nova_compute[182935]: 2026-01-21 23:50:59.496 182939 DEBUG nova.network.neutron [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:50:59 compute-0 nova_compute[182935]: 2026-01-21 23:50:59.790 182939 DEBUG nova.network.neutron [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:50:59 compute-0 nova_compute[182935]: 2026-01-21 23:50:59.943 182939 DEBUG nova.compute.manager [req-fed78a81-7cf2-4825-9225-68764ed9b049 req-6c58208d-4f40-48b4-a9c0-dc6ebf741c13 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Received event network-changed-022cfedf-447c-4fd1-9013-480f624fe044 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:50:59 compute-0 nova_compute[182935]: 2026-01-21 23:50:59.943 182939 DEBUG nova.compute.manager [req-fed78a81-7cf2-4825-9225-68764ed9b049 req-6c58208d-4f40-48b4-a9c0-dc6ebf741c13 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Refreshing instance network info cache due to event network-changed-022cfedf-447c-4fd1-9013-480f624fe044. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:50:59 compute-0 nova_compute[182935]: 2026-01-21 23:50:59.944 182939 DEBUG oslo_concurrency.lockutils [req-fed78a81-7cf2-4825-9225-68764ed9b049 req-6c58208d-4f40-48b4-a9c0-dc6ebf741c13 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:50:59 compute-0 nova_compute[182935]: 2026-01-21 23:50:59.944 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.097 182939 DEBUG nova.network.neutron [req-82847e51-5bed-48bb-9cce-5f0bb96f5dd2 req-59bca607-6ff0-46e5-81c5-600b6f1ef499 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Updated VIF entry in instance network info cache for port 10614c52-453a-42bc-a61e-19263126e133. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.097 182939 DEBUG nova.network.neutron [req-82847e51-5bed-48bb-9cce-5f0bb96f5dd2 req-59bca607-6ff0-46e5-81c5-600b6f1ef499 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Updating instance_info_cache with network_info: [{"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.124 182939 DEBUG oslo_concurrency.lockutils [req-82847e51-5bed-48bb-9cce-5f0bb96f5dd2 req-59bca607-6ff0-46e5-81c5-600b6f1ef499 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:00 compute-0 podman[217100]: 2026-01-21 23:51:00.699527847 +0000 UTC m=+0.061476363 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.848 182939 DEBUG nova.network.neutron [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Updating instance_info_cache with network_info: [{"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.895 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Releasing lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.896 182939 DEBUG nova.compute.manager [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Instance network_info: |[{"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.897 182939 DEBUG oslo_concurrency.lockutils [req-fed78a81-7cf2-4825-9225-68764ed9b049 req-6c58208d-4f40-48b4-a9c0-dc6ebf741c13 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.898 182939 DEBUG nova.network.neutron [req-fed78a81-7cf2-4825-9225-68764ed9b049 req-6c58208d-4f40-48b4-a9c0-dc6ebf741c13 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Refreshing network info cache for port 022cfedf-447c-4fd1-9013-480f624fe044 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.900 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Start _get_guest_xml network_info=[{"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.905 182939 WARNING nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.912 182939 DEBUG nova.virt.libvirt.host [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.913 182939 DEBUG nova.virt.libvirt.host [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.917 182939 DEBUG nova.virt.libvirt.host [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.918 182939 DEBUG nova.virt.libvirt.host [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.919 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.919 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.920 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.920 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.920 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.921 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.921 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.921 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.921 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.921 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.922 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.922 182939 DEBUG nova.virt.hardware [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.926 182939 DEBUG nova.virt.libvirt.vif [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1618416311',display_name='tempest-FloatingIPsAssociationTestJSON-server-1618416311',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1618416311',id=39,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bdcd24bf916b4c3aa2e173bea9dd7202',ramdisk_id='',reservation_id='r-nda2hwq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1164348821',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1164348821-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:50:56Z,user_data=None,user_id='8da2db8893d4442aaaada7d43ff2500f',uuid=f36ee268-83c4-4567-bae3-ee40afbb7882,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.926 182939 DEBUG nova.network.os_vif_util [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converting VIF {"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.927 182939 DEBUG nova.network.os_vif_util [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:af:5b,bridge_name='br-int',has_traffic_filtering=True,id=022cfedf-447c-4fd1-9013-480f624fe044,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap022cfedf-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.928 182939 DEBUG nova.objects.instance [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lazy-loading 'pci_devices' on Instance uuid f36ee268-83c4-4567-bae3-ee40afbb7882 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.948 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:51:00 compute-0 nova_compute[182935]:   <uuid>f36ee268-83c4-4567-bae3-ee40afbb7882</uuid>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   <name>instance-00000027</name>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <nova:name>tempest-FloatingIPsAssociationTestJSON-server-1618416311</nova:name>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:51:00</nova:creationTime>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:51:00 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:51:00 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:51:00 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:51:00 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:51:00 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:51:00 compute-0 nova_compute[182935]:         <nova:user uuid="8da2db8893d4442aaaada7d43ff2500f">tempest-FloatingIPsAssociationTestJSON-1164348821-project-member</nova:user>
Jan 21 23:51:00 compute-0 nova_compute[182935]:         <nova:project uuid="bdcd24bf916b4c3aa2e173bea9dd7202">tempest-FloatingIPsAssociationTestJSON-1164348821</nova:project>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:51:00 compute-0 nova_compute[182935]:         <nova:port uuid="022cfedf-447c-4fd1-9013-480f624fe044">
Jan 21 23:51:00 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <system>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <entry name="serial">f36ee268-83c4-4567-bae3-ee40afbb7882</entry>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <entry name="uuid">f36ee268-83c4-4567-bae3-ee40afbb7882</entry>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     </system>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   <os>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   </os>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   <features>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   </features>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk.config"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:a7:af:5b"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <target dev="tap022cfedf-44"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/console.log" append="off"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <video>
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     </video>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:51:00 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:51:00 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:51:00 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:51:00 compute-0 nova_compute[182935]: </domain>
Jan 21 23:51:00 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.951 182939 DEBUG nova.compute.manager [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Preparing to wait for external event network-vif-plugged-022cfedf-447c-4fd1-9013-480f624fe044 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.952 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.952 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.953 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.954 182939 DEBUG nova.virt.libvirt.vif [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1618416311',display_name='tempest-FloatingIPsAssociationTestJSON-server-1618416311',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1618416311',id=39,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bdcd24bf916b4c3aa2e173bea9dd7202',ramdisk_id='',reservation_id='r-nda2hwq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1164348821',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1164348821-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:50:56Z,user_data=None,user_id='8da2db8893d4442aaaada7d43ff2500f',uuid=f36ee268-83c4-4567-bae3-ee40afbb7882,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.954 182939 DEBUG nova.network.os_vif_util [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converting VIF {"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.955 182939 DEBUG nova.network.os_vif_util [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:af:5b,bridge_name='br-int',has_traffic_filtering=True,id=022cfedf-447c-4fd1-9013-480f624fe044,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap022cfedf-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.956 182939 DEBUG os_vif [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:af:5b,bridge_name='br-int',has_traffic_filtering=True,id=022cfedf-447c-4fd1-9013-480f624fe044,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap022cfedf-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.957 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.957 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.958 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.961 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.962 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap022cfedf-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.962 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap022cfedf-44, col_values=(('external_ids', {'iface-id': '022cfedf-447c-4fd1-9013-480f624fe044', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a7:af:5b', 'vm-uuid': 'f36ee268-83c4-4567-bae3-ee40afbb7882'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.964 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:00 compute-0 NetworkManager[55139]: <info>  [1769039460.9654] manager: (tap022cfedf-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.970 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.972 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:00 compute-0 nova_compute[182935]: 2026-01-21 23:51:00.973 182939 INFO os_vif [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:af:5b,bridge_name='br-int',has_traffic_filtering=True,id=022cfedf-447c-4fd1-9013-480f624fe044,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap022cfedf-44')
Jan 21 23:51:01 compute-0 nova_compute[182935]: 2026-01-21 23:51:01.056 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:51:01 compute-0 nova_compute[182935]: 2026-01-21 23:51:01.057 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:51:01 compute-0 nova_compute[182935]: 2026-01-21 23:51:01.057 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] No VIF found with MAC fa:16:3e:a7:af:5b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:51:01 compute-0 nova_compute[182935]: 2026-01-21 23:51:01.057 182939 INFO nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Using config drive
Jan 21 23:51:01 compute-0 nova_compute[182935]: 2026-01-21 23:51:01.688 182939 INFO nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Creating config drive at /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk.config
Jan 21 23:51:01 compute-0 nova_compute[182935]: 2026-01-21 23:51:01.695 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpohj6gmqs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:01 compute-0 nova_compute[182935]: 2026-01-21 23:51:01.823 182939 DEBUG oslo_concurrency.processutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpohj6gmqs" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:01 compute-0 kernel: tap022cfedf-44: entered promiscuous mode
Jan 21 23:51:01 compute-0 NetworkManager[55139]: <info>  [1769039461.8880] manager: (tap022cfedf-44): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Jan 21 23:51:01 compute-0 ovn_controller[95047]: 2026-01-21T23:51:01Z|00151|binding|INFO|Claiming lport 022cfedf-447c-4fd1-9013-480f624fe044 for this chassis.
Jan 21 23:51:01 compute-0 ovn_controller[95047]: 2026-01-21T23:51:01Z|00152|binding|INFO|022cfedf-447c-4fd1-9013-480f624fe044: Claiming fa:16:3e:a7:af:5b 10.100.0.8
Jan 21 23:51:01 compute-0 nova_compute[182935]: 2026-01-21 23:51:01.889 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:01 compute-0 ovn_controller[95047]: 2026-01-21T23:51:01Z|00153|binding|INFO|Setting lport 022cfedf-447c-4fd1-9013-480f624fe044 up in Southbound
Jan 21 23:51:01 compute-0 ovn_controller[95047]: 2026-01-21T23:51:01Z|00154|binding|INFO|Setting lport 022cfedf-447c-4fd1-9013-480f624fe044 ovn-installed in OVS
Jan 21 23:51:01 compute-0 nova_compute[182935]: 2026-01-21 23:51:01.919 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:01 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:01.920 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:af:5b 10.100.0.8'], port_security=['fa:16:3e:a7:af:5b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f36ee268-83c4-4567-bae3-ee40afbb7882', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdcd24bf916b4c3aa2e173bea9dd7202', 'neutron:revision_number': '2', 'neutron:security_group_ids': '288c5115-ef70-4922-9c68-a1234762984e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e09721f1-c960-4ea4-8636-beb23b3dfb25, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=022cfedf-447c-4fd1-9013-480f624fe044) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:51:01 compute-0 nova_compute[182935]: 2026-01-21 23:51:01.923 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:01 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:01.927 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 022cfedf-447c-4fd1-9013-480f624fe044 in datapath b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 bound to our chassis
Jan 21 23:51:01 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:01.932 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b94414b2-c7ed-4d1b-b462-f41cb84cbcd8
Jan 21 23:51:01 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:01.951 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[30d35702-5cf4-4248-b06c-7fce98621ca2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:01 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:01.952 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb94414b2-c1 in ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:51:01 compute-0 systemd-udevd[217146]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:51:01 compute-0 systemd-machined[154182]: New machine qemu-23-instance-00000027.
Jan 21 23:51:01 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:01.957 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb94414b2-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:51:01 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:01.957 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[10fde3e3-abf6-4ca2-bc48-35ba09d61a73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:01 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:01.958 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[724f93b4-4e54-43be-aed5-0a9a5e5c1c40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:01 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-00000027.
Jan 21 23:51:01 compute-0 NetworkManager[55139]: <info>  [1769039461.9776] device (tap022cfedf-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:51:01 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:01.976 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[c5c46128-00f1-4639-bb34-46ae1716fd1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:01 compute-0 NetworkManager[55139]: <info>  [1769039461.9792] device (tap022cfedf-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.006 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e0f4b512-a839-4c66-8b5e-465034957a32]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.043 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[0854bb05-1b05-417d-9c01-a81414c8ed66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 systemd-udevd[217149]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:51:02 compute-0 NetworkManager[55139]: <info>  [1769039462.0535] manager: (tapb94414b2-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.052 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1976a032-c70e-4866-9b68-ddf36a5e5891]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.098 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[57a66ef0-2c63-4ed2-bd22-506f71fd243e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.104 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4a880e-d4fd-40ee-a86c-e43d6ddb5317]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 NetworkManager[55139]: <info>  [1769039462.1447] device (tapb94414b2-c0): carrier: link connected
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.154 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[28089732-689f-4dd7-99c3-a441638ce837]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.181 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4077789a-1d71-448d-984f-e6818bb4b0fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb94414b2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:82:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396488, 'reachable_time': 17937, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217184, 'error': None, 'target': 'ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.209 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d16e255b-a223-4826-a25b-8011cfd2c9e4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedf:8231'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396488, 'tstamp': 396488}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217185, 'error': None, 'target': 'ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.234 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ada2a780-7a17-43fe-ae45-824d5983b3d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb94414b2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:82:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396488, 'reachable_time': 17937, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217187, 'error': None, 'target': 'ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.241 182939 DEBUG nova.compute.manager [req-c6875c98-04c0-468e-9f29-4622dfffa664 req-78329ff8-3744-4c31-9258-1c5b9c9a10f3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Received event network-vif-plugged-022cfedf-447c-4fd1-9013-480f624fe044 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.242 182939 DEBUG oslo_concurrency.lockutils [req-c6875c98-04c0-468e-9f29-4622dfffa664 req-78329ff8-3744-4c31-9258-1c5b9c9a10f3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.242 182939 DEBUG oslo_concurrency.lockutils [req-c6875c98-04c0-468e-9f29-4622dfffa664 req-78329ff8-3744-4c31-9258-1c5b9c9a10f3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.242 182939 DEBUG oslo_concurrency.lockutils [req-c6875c98-04c0-468e-9f29-4622dfffa664 req-78329ff8-3744-4c31-9258-1c5b9c9a10f3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.243 182939 DEBUG nova.compute.manager [req-c6875c98-04c0-468e-9f29-4622dfffa664 req-78329ff8-3744-4c31-9258-1c5b9c9a10f3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Processing event network-vif-plugged-022cfedf-447c-4fd1-9013-480f624fe044 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.248 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039462.247578, f36ee268-83c4-4567-bae3-ee40afbb7882 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.248 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] VM Started (Lifecycle Event)
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.252 182939 DEBUG nova.compute.manager [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.257 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.262 182939 INFO nova.virt.libvirt.driver [-] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Instance spawned successfully.
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.262 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.267 182939 DEBUG nova.network.neutron [req-fed78a81-7cf2-4825-9225-68764ed9b049 req-6c58208d-4f40-48b4-a9c0-dc6ebf741c13 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Updated VIF entry in instance network info cache for port 022cfedf-447c-4fd1-9013-480f624fe044. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.267 182939 DEBUG nova.network.neutron [req-fed78a81-7cf2-4825-9225-68764ed9b049 req-6c58208d-4f40-48b4-a9c0-dc6ebf741c13 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Updating instance_info_cache with network_info: [{"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.282 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8a28e2b7-077b-44ca-a156-e011c976e142]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.310 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.315 182939 DEBUG oslo_concurrency.lockutils [req-fed78a81-7cf2-4825-9225-68764ed9b049 req-6c58208d-4f40-48b4-a9c0-dc6ebf741c13 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.320 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.324 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.324 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.325 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.325 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.325 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.326 182939 DEBUG nova.virt.libvirt.driver [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.359 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.359 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039462.2478664, f36ee268-83c4-4567-bae3-ee40afbb7882 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.359 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] VM Paused (Lifecycle Event)
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.368 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[efb75c06-ee17-4bc1-9d48-67a7d32ec040]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.370 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb94414b2-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.370 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.371 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb94414b2-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.373 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:02 compute-0 NetworkManager[55139]: <info>  [1769039462.3744] manager: (tapb94414b2-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Jan 21 23:51:02 compute-0 kernel: tapb94414b2-c0: entered promiscuous mode
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.377 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb94414b2-c0, col_values=(('external_ids', {'iface-id': 'e1559aec-27c5-46a3-81fc-ddeb80ee3759'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:02 compute-0 ovn_controller[95047]: 2026-01-21T23:51:02Z|00155|binding|INFO|Releasing lport e1559aec-27c5-46a3-81fc-ddeb80ee3759 from this chassis (sb_readonly=0)
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.379 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.384 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b94414b2-c7ed-4d1b-b462-f41cb84cbcd8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b94414b2-c7ed-4d1b-b462-f41cb84cbcd8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.385 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9998c355-ad7b-4234-82dd-c3a098803515]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.387 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/b94414b2-c7ed-4d1b-b462-f41cb84cbcd8.pid.haproxy
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID b94414b2-c7ed-4d1b-b462-f41cb84cbcd8
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.389 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'env', 'PROCESS_TAG=haproxy-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b94414b2-c7ed-4d1b-b462-f41cb84cbcd8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.394 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.406 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.410 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039462.2555447, f36ee268-83c4-4567-bae3-ee40afbb7882 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.411 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] VM Resumed (Lifecycle Event)
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.435 182939 INFO nova.compute.manager [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Took 5.56 seconds to spawn the instance on the hypervisor.
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.437 182939 DEBUG nova.compute.manager [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.440 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.450 182939 DEBUG oslo_concurrency.lockutils [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Acquiring lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.450 182939 DEBUG oslo_concurrency.lockutils [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.451 182939 DEBUG oslo_concurrency.lockutils [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Acquiring lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.454 182939 DEBUG oslo_concurrency.lockutils [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.454 182939 DEBUG oslo_concurrency.lockutils [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.457 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.466 182939 INFO nova.compute.manager [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Terminating instance
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.486 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.487 182939 DEBUG nova.compute.manager [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:51:02 compute-0 kernel: tap10614c52-45 (unregistering): left promiscuous mode
Jan 21 23:51:02 compute-0 NetworkManager[55139]: <info>  [1769039462.5194] device (tap10614c52-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:51:02 compute-0 ovn_controller[95047]: 2026-01-21T23:51:02Z|00156|binding|INFO|Releasing lport 10614c52-453a-42bc-a61e-19263126e133 from this chassis (sb_readonly=0)
Jan 21 23:51:02 compute-0 ovn_controller[95047]: 2026-01-21T23:51:02Z|00157|binding|INFO|Setting lport 10614c52-453a-42bc-a61e-19263126e133 down in Southbound
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.543 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.544 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:02 compute-0 ovn_controller[95047]: 2026-01-21T23:51:02Z|00158|binding|INFO|Removing iface tap10614c52-45 ovn-installed in OVS
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.555 182939 INFO nova.compute.manager [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Took 6.55 seconds to build instance.
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.560 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.562 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:f1:ba 10.100.0.8'], port_security=['fa:16:3e:66:f1:ba 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3f09ea6e-f851-4c27-9ebd-0d4eec5bd236', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6f05553fbb5e485ca9ac59d61797526f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '270dc604-cd9d-4305-95b2-895f7ed99803', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b5d73aa-d6c3-48e2-89f6-48f3b8f3a8a9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=10614c52-453a-42bc-a61e-19263126e133) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:51:02 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000025.scope: Deactivated successfully.
Jan 21 23:51:02 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000025.scope: Consumed 13.727s CPU time.
Jan 21 23:51:02 compute-0 systemd-machined[154182]: Machine qemu-22-instance-00000025 terminated.
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.590 182939 DEBUG oslo_concurrency.lockutils [None req-bea23169-d5f3-4599-88c9-33595a32b5a6 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.754s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.714 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.722 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.752 182939 INFO nova.virt.libvirt.driver [-] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Instance destroyed successfully.
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.753 182939 DEBUG nova.objects.instance [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lazy-loading 'resources' on Instance uuid 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.775 182939 DEBUG nova.virt.libvirt.vif [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:50:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-2118866858',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-2118866858',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-211886685',id=37,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:50:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6f05553fbb5e485ca9ac59d61797526f',ramdisk_id='',reservation_id='r-grzq9ecw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-100954466',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-100954466-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:50:35Z,user_data=None,user_id='782706408079485b86dfcb2709fe4bb9',uuid=3f09ea6e-f851-4c27-9ebd-0d4eec5bd236,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.776 182939 DEBUG nova.network.os_vif_util [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Converting VIF {"id": "10614c52-453a-42bc-a61e-19263126e133", "address": "fa:16:3e:66:f1:ba", "network": {"id": "9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1863023570-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6f05553fbb5e485ca9ac59d61797526f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10614c52-45", "ovs_interfaceid": "10614c52-453a-42bc-a61e-19263126e133", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.777 182939 DEBUG nova.network.os_vif_util [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:66:f1:ba,bridge_name='br-int',has_traffic_filtering=True,id=10614c52-453a-42bc-a61e-19263126e133,network=Network(9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10614c52-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.778 182939 DEBUG os_vif [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:f1:ba,bridge_name='br-int',has_traffic_filtering=True,id=10614c52-453a-42bc-a61e-19263126e133,network=Network(9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10614c52-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.780 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.781 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap10614c52-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.782 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.785 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.787 182939 INFO os_vif [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:f1:ba,bridge_name='br-int',has_traffic_filtering=True,id=10614c52-453a-42bc-a61e-19263126e133,network=Network(9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10614c52-45')
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.788 182939 INFO nova.virt.libvirt.driver [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Deleting instance files /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236_del
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.789 182939 INFO nova.virt.libvirt.driver [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Deletion of /var/lib/nova/instances/3f09ea6e-f851-4c27-9ebd-0d4eec5bd236_del complete
Jan 21 23:51:02 compute-0 podman[217223]: 2026-01-21 23:51:02.788924325 +0000 UTC m=+0.056649550 container create 1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 21 23:51:02 compute-0 systemd[1]: Started libpod-conmon-1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea.scope.
Jan 21 23:51:02 compute-0 podman[217223]: 2026-01-21 23:51:02.758821874 +0000 UTC m=+0.026547109 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:51:02 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:51:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af3097b8cf6d9ebcaf96149fe3e4d6a0586454705455182f48918e0d115cc45a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:51:02 compute-0 podman[217223]: 2026-01-21 23:51:02.883371274 +0000 UTC m=+0.151096489 container init 1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.884 182939 INFO nova.compute.manager [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.885 182939 DEBUG oslo.service.loopingcall [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.885 182939 DEBUG nova.compute.manager [-] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:51:02 compute-0 nova_compute[182935]: 2026-01-21 23:51:02.886 182939 DEBUG nova.network.neutron [-] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:51:02 compute-0 podman[217223]: 2026-01-21 23:51:02.890919699 +0000 UTC m=+0.158644904 container start 1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:51:02 compute-0 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[217248]: [NOTICE]   (217252) : New worker (217254) forked
Jan 21 23:51:02 compute-0 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[217248]: [NOTICE]   (217252) : Loading success.
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.967 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 10614c52-453a-42bc-a61e-19263126e133 in datapath 9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398 unbound from our chassis
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.969 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.970 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3966763a-f6a2-44d8-8f4f-367b5792f34d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:02.970 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398 namespace which is not needed anymore
Jan 21 23:51:03 compute-0 neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398[216944]: [NOTICE]   (216948) : haproxy version is 2.8.14-c23fe91
Jan 21 23:51:03 compute-0 neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398[216944]: [NOTICE]   (216948) : path to executable is /usr/sbin/haproxy
Jan 21 23:51:03 compute-0 neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398[216944]: [WARNING]  (216948) : Exiting Master process...
Jan 21 23:51:03 compute-0 neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398[216944]: [ALERT]    (216948) : Current worker (216950) exited with code 143 (Terminated)
Jan 21 23:51:03 compute-0 neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398[216944]: [WARNING]  (216948) : All workers exited. Exiting... (0)
Jan 21 23:51:03 compute-0 systemd[1]: libpod-df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa.scope: Deactivated successfully.
Jan 21 23:51:03 compute-0 podman[217280]: 2026-01-21 23:51:03.136034896 +0000 UTC m=+0.050721772 container died df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 21 23:51:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa-userdata-shm.mount: Deactivated successfully.
Jan 21 23:51:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-81f9ac58177fd34756e3dc4c6984a3d90d095baab5f4b47d3a69e330f5d40aa0-merged.mount: Deactivated successfully.
Jan 21 23:51:03 compute-0 podman[217280]: 2026-01-21 23:51:03.178827803 +0000 UTC m=+0.093514659 container cleanup df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.186 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.186 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.187 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:03 compute-0 systemd[1]: libpod-conmon-df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa.scope: Deactivated successfully.
Jan 21 23:51:03 compute-0 podman[217308]: 2026-01-21 23:51:03.248761351 +0000 UTC m=+0.045425768 container remove df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.254 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[13f685da-b599-47c3-9862-e0a1629add66]: (4, ('Wed Jan 21 11:51:03 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398 (df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa)\ndf56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa\nWed Jan 21 11:51:03 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398 (df56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa)\ndf56c82afac9dcb13a7d729a197474a513aa00b453a95f2b3beea761323353aa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.257 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f2aa35ba-4f9d-4c28-b6a8-c3e60049244e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.258 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9dd7a7ea-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:03 compute-0 kernel: tap9dd7a7ea-e0: left promiscuous mode
Jan 21 23:51:03 compute-0 nova_compute[182935]: 2026-01-21 23:51:03.260 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:03 compute-0 nova_compute[182935]: 2026-01-21 23:51:03.274 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.279 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0e642206-4f83-4b88-9c12-5c1b9fc5cfc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.304 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[241e141b-c8d5-42e4-802b-4a4d37986b35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.306 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[582d0483-e23c-4b3d-b394-2c408b87958a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.328 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[46fa735d-e15b-40d5-b283-3f731b6d56f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393247, 'reachable_time': 20743, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217323, 'error': None, 'target': 'ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.330 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9dd7a7ea-efa4-4cb2-95c0-860bfa3ad398 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:03.330 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[bbce3d6b-36d3-4dbc-aa0a-8d1773b74e10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d9dd7a7ea\x2defa4\x2d4cb2\x2d95c0\x2d860bfa3ad398.mount: Deactivated successfully.
Jan 21 23:51:03 compute-0 nova_compute[182935]: 2026-01-21 23:51:03.510 182939 DEBUG nova.network.neutron [-] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:03 compute-0 nova_compute[182935]: 2026-01-21 23:51:03.540 182939 INFO nova.compute.manager [-] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Took 0.65 seconds to deallocate network for instance.
Jan 21 23:51:03 compute-0 nova_compute[182935]: 2026-01-21 23:51:03.561 182939 DEBUG nova.compute.manager [req-fcbc00e0-5e19-4d9a-953b-ab2b276d8cf1 req-9808ed78-5975-449e-8a2c-2f0777c80048 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Received event network-vif-deleted-10614c52-453a-42bc-a61e-19263126e133 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:03 compute-0 nova_compute[182935]: 2026-01-21 23:51:03.796 182939 DEBUG oslo_concurrency.lockutils [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:03 compute-0 nova_compute[182935]: 2026-01-21 23:51:03.798 182939 DEBUG oslo_concurrency.lockutils [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:03 compute-0 nova_compute[182935]: 2026-01-21 23:51:03.935 182939 DEBUG nova.compute.provider_tree [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:51:03 compute-0 nova_compute[182935]: 2026-01-21 23:51:03.958 182939 DEBUG nova.scheduler.client.report [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.002 182939 DEBUG oslo_concurrency.lockutils [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.026 182939 INFO nova.scheduler.client.report [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Deleted allocations for instance 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.144 182939 DEBUG oslo_concurrency.lockutils [None req-8c59e732-4a78-452b-89ad-8c50fe516fa1 782706408079485b86dfcb2709fe4bb9 6f05553fbb5e485ca9ac59d61797526f - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.420 182939 DEBUG nova.compute.manager [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Received event network-vif-plugged-022cfedf-447c-4fd1-9013-480f624fe044 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.421 182939 DEBUG oslo_concurrency.lockutils [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.421 182939 DEBUG oslo_concurrency.lockutils [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.422 182939 DEBUG oslo_concurrency.lockutils [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.422 182939 DEBUG nova.compute.manager [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] No waiting events found dispatching network-vif-plugged-022cfedf-447c-4fd1-9013-480f624fe044 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.422 182939 WARNING nova.compute.manager [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Received unexpected event network-vif-plugged-022cfedf-447c-4fd1-9013-480f624fe044 for instance with vm_state active and task_state None.
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.422 182939 DEBUG nova.compute.manager [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Received event network-vif-unplugged-10614c52-453a-42bc-a61e-19263126e133 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.422 182939 DEBUG oslo_concurrency.lockutils [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.423 182939 DEBUG oslo_concurrency.lockutils [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.423 182939 DEBUG oslo_concurrency.lockutils [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.423 182939 DEBUG nova.compute.manager [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] No waiting events found dispatching network-vif-unplugged-10614c52-453a-42bc-a61e-19263126e133 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.423 182939 WARNING nova.compute.manager [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Received unexpected event network-vif-unplugged-10614c52-453a-42bc-a61e-19263126e133 for instance with vm_state deleted and task_state None.
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.424 182939 DEBUG nova.compute.manager [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Received event network-vif-plugged-10614c52-453a-42bc-a61e-19263126e133 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.424 182939 DEBUG oslo_concurrency.lockutils [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.424 182939 DEBUG oslo_concurrency.lockutils [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.424 182939 DEBUG oslo_concurrency.lockutils [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f09ea6e-f851-4c27-9ebd-0d4eec5bd236-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.424 182939 DEBUG nova.compute.manager [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] No waiting events found dispatching network-vif-plugged-10614c52-453a-42bc-a61e-19263126e133 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.425 182939 WARNING nova.compute.manager [req-924ebd2c-0f2b-44a8-b77a-5c06f348c1f0 req-ddcfacae-df31-45b6-8ea9-d08bbcea0324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Received unexpected event network-vif-plugged-10614c52-453a-42bc-a61e-19263126e133 for instance with vm_state deleted and task_state None.
Jan 21 23:51:04 compute-0 nova_compute[182935]: 2026-01-21 23:51:04.945 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:05 compute-0 podman[217324]: 2026-01-21 23:51:05.709403751 +0000 UTC m=+0.075157950 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:51:05 compute-0 ovn_controller[95047]: 2026-01-21T23:51:05Z|00159|binding|INFO|Releasing lport e1559aec-27c5-46a3-81fc-ddeb80ee3759 from this chassis (sb_readonly=0)
Jan 21 23:51:06 compute-0 nova_compute[182935]: 2026-01-21 23:51:06.065 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:07 compute-0 nova_compute[182935]: 2026-01-21 23:51:07.784 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:08 compute-0 ovn_controller[95047]: 2026-01-21T23:51:08Z|00160|binding|INFO|Releasing lport e1559aec-27c5-46a3-81fc-ddeb80ee3759 from this chassis (sb_readonly=0)
Jan 21 23:51:08 compute-0 nova_compute[182935]: 2026-01-21 23:51:08.081 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:10 compute-0 nova_compute[182935]: 2026-01-21 23:51:10.592 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:12 compute-0 podman[217346]: 2026-01-21 23:51:12.692070688 +0000 UTC m=+0.061114264 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:51:12 compute-0 podman[217345]: 2026-01-21 23:51:12.722442425 +0000 UTC m=+0.091619034 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, version=9.6, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7)
Jan 21 23:51:12 compute-0 nova_compute[182935]: 2026-01-21 23:51:12.823 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.451 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "49bdd247-3853-4f47-8620-33c45e08fd56" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.452 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "49bdd247-3853-4f47-8620-33c45e08fd56" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.491 182939 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.621 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.622 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.630 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.630 182939 INFO nova.compute.claims [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.855 182939 DEBUG nova.compute.provider_tree [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.876 182939 DEBUG nova.scheduler.client.report [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.903 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.917 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "e07971b1-069f-4494-9b4a-04c296f1e891" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.917 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "e07971b1-069f-4494-9b4a-04c296f1e891" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.944 182939 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] No node specified, defaulting to compute-0.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.992 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "e07971b1-069f-4494-9b4a-04c296f1e891" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:13 compute-0 nova_compute[182935]: 2026-01-21 23:51:13.993 182939 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.047 182939 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.048 182939 DEBUG nova.network.neutron [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.067 182939 INFO nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.090 182939 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.209 182939 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.212 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.213 182939 INFO nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Creating image(s)
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.214 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "/var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.214 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "/var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.216 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "/var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.241 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.302 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.304 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.305 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.317 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.369 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.370 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.400 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.401 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.402 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.454 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.455 182939 DEBUG nova.virt.disk.api [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Checking if we can resize image /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.455 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:14 compute-0 ovn_controller[95047]: 2026-01-21T23:51:14Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a7:af:5b 10.100.0.8
Jan 21 23:51:14 compute-0 ovn_controller[95047]: 2026-01-21T23:51:14Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a7:af:5b 10.100.0.8
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.551 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.552 182939 DEBUG nova.virt.disk.api [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Cannot resize image /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.553 182939 DEBUG nova.objects.instance [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 49bdd247-3853-4f47-8620-33c45e08fd56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.573 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.573 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Ensure instance console log exists: /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.574 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.574 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.574 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.796 182939 DEBUG nova.compute.manager [req-211f4e63-9691-449a-9e24-440589a07d3f req-98c126ed-a4e0-488c-aa07-f7b14f881618 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Received event network-changed-022cfedf-447c-4fd1-9013-480f624fe044 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.796 182939 DEBUG nova.compute.manager [req-211f4e63-9691-449a-9e24-440589a07d3f req-98c126ed-a4e0-488c-aa07-f7b14f881618 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Refreshing instance network info cache due to event network-changed-022cfedf-447c-4fd1-9013-480f624fe044. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.797 182939 DEBUG oslo_concurrency.lockutils [req-211f4e63-9691-449a-9e24-440589a07d3f req-98c126ed-a4e0-488c-aa07-f7b14f881618 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.797 182939 DEBUG oslo_concurrency.lockutils [req-211f4e63-9691-449a-9e24-440589a07d3f req-98c126ed-a4e0-488c-aa07-f7b14f881618 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:14 compute-0 nova_compute[182935]: 2026-01-21 23:51:14.797 182939 DEBUG nova.network.neutron [req-211f4e63-9691-449a-9e24-440589a07d3f req-98c126ed-a4e0-488c-aa07-f7b14f881618 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Refreshing network info cache for port 022cfedf-447c-4fd1-9013-480f624fe044 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.032 182939 DEBUG nova.network.neutron [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.033 182939 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.034 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.039 182939 WARNING nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.044 182939 DEBUG nova.virt.libvirt.host [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.044 182939 DEBUG nova.virt.libvirt.host [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.048 182939 DEBUG nova.virt.libvirt.host [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.049 182939 DEBUG nova.virt.libvirt.host [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.050 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.050 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.051 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.051 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.051 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.052 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.052 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.052 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.053 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.053 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.053 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.053 182939 DEBUG nova.virt.hardware [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.057 182939 DEBUG nova.objects.instance [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 49bdd247-3853-4f47-8620-33c45e08fd56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.084 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:51:15 compute-0 nova_compute[182935]:   <uuid>49bdd247-3853-4f47-8620-33c45e08fd56</uuid>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   <name>instance-0000002b</name>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersOnMultiNodesTest-server-689613698-1</nova:name>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:51:15</nova:creationTime>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:51:15 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:51:15 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:51:15 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:51:15 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:51:15 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:51:15 compute-0 nova_compute[182935]:         <nova:user uuid="d0c4727b6f6e46339b56a8168cf80a7b">tempest-ServersOnMultiNodesTest-1927863391-project-member</nova:user>
Jan 21 23:51:15 compute-0 nova_compute[182935]:         <nova:project uuid="c1e85e2b0f934b719d3ad4076dc719f2">tempest-ServersOnMultiNodesTest-1927863391</nova:project>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <system>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <entry name="serial">49bdd247-3853-4f47-8620-33c45e08fd56</entry>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <entry name="uuid">49bdd247-3853-4f47-8620-33c45e08fd56</entry>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     </system>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   <os>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   </os>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   <features>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   </features>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk.config"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/console.log" append="off"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <video>
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     </video>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:51:15 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:51:15 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:51:15 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:51:15 compute-0 nova_compute[182935]: </domain>
Jan 21 23:51:15 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.168 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.169 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.169 182939 INFO nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Using config drive
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.400 182939 INFO nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Creating config drive at /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk.config
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.406 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_rjc_nop execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.534 182939 DEBUG oslo_concurrency.processutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_rjc_nop" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:15 compute-0 nova_compute[182935]: 2026-01-21 23:51:15.593 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:15 compute-0 systemd-machined[154182]: New machine qemu-24-instance-0000002b.
Jan 21 23:51:15 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-0000002b.
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.152 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039476.1520443, 49bdd247-3853-4f47-8620-33c45e08fd56 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.154 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] VM Resumed (Lifecycle Event)
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.156 182939 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.156 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.160 182939 INFO nova.virt.libvirt.driver [-] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Instance spawned successfully.
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.161 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.175 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.182 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.186 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.186 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.187 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.187 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.188 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.188 182939 DEBUG nova.virt.libvirt.driver [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.211 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.212 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039476.1525972, 49bdd247-3853-4f47-8620-33c45e08fd56 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.213 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] VM Started (Lifecycle Event)
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.238 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.243 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.283 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.286 182939 INFO nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Took 2.08 seconds to spawn the instance on the hypervisor.
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.287 182939 DEBUG nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.418 182939 INFO nova.compute.manager [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Took 2.84 seconds to build instance.
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.463 182939 DEBUG oslo_concurrency.lockutils [None req-d2489ffc-0686-4414-ae85-79f8ea32295b d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "49bdd247-3853-4f47-8620-33c45e08fd56" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.782 182939 DEBUG nova.network.neutron [req-211f4e63-9691-449a-9e24-440589a07d3f req-98c126ed-a4e0-488c-aa07-f7b14f881618 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Updated VIF entry in instance network info cache for port 022cfedf-447c-4fd1-9013-480f624fe044. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.784 182939 DEBUG nova.network.neutron [req-211f4e63-9691-449a-9e24-440589a07d3f req-98c126ed-a4e0-488c-aa07-f7b14f881618 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Updating instance_info_cache with network_info: [{"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.819 182939 DEBUG oslo_concurrency.lockutils [req-211f4e63-9691-449a-9e24-440589a07d3f req-98c126ed-a4e0-488c-aa07-f7b14f881618 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:16 compute-0 nova_compute[182935]: 2026-01-21 23:51:16.926 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:17 compute-0 nova_compute[182935]: 2026-01-21 23:51:17.754 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039462.7499585, 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:17 compute-0 nova_compute[182935]: 2026-01-21 23:51:17.755 182939 INFO nova.compute.manager [-] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] VM Stopped (Lifecycle Event)
Jan 21 23:51:17 compute-0 nova_compute[182935]: 2026-01-21 23:51:17.783 182939 DEBUG nova.compute.manager [None req-248c5ea3-61ef-4e0a-b9ff-10736c559de1 - - - - - -] [instance: 3f09ea6e-f851-4c27-9ebd-0d4eec5bd236] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:17 compute-0 nova_compute[182935]: 2026-01-21 23:51:17.825 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.040 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "49bdd247-3853-4f47-8620-33c45e08fd56" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.041 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "49bdd247-3853-4f47-8620-33c45e08fd56" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.041 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "49bdd247-3853-4f47-8620-33c45e08fd56-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.041 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "49bdd247-3853-4f47-8620-33c45e08fd56-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.042 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "49bdd247-3853-4f47-8620-33c45e08fd56-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.056 182939 INFO nova.compute.manager [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Terminating instance
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.065 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "refresh_cache-49bdd247-3853-4f47-8620-33c45e08fd56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.066 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquired lock "refresh_cache-49bdd247-3853-4f47-8620-33c45e08fd56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.066 182939 DEBUG nova.network.neutron [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.343 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.434 182939 DEBUG nova.network.neutron [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:51:20 compute-0 nova_compute[182935]: 2026-01-21 23:51:20.595 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:21 compute-0 nova_compute[182935]: 2026-01-21 23:51:21.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:21 compute-0 nova_compute[182935]: 2026-01-21 23:51:21.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:51:21 compute-0 nova_compute[182935]: 2026-01-21 23:51:21.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:51:21 compute-0 nova_compute[182935]: 2026-01-21 23:51:21.820 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.189 182939 DEBUG nova.network.neutron [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.218 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.218 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.219 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.219 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f36ee268-83c4-4567-bae3-ee40afbb7882 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.227 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Releasing lock "refresh_cache-49bdd247-3853-4f47-8620-33c45e08fd56" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.228 182939 DEBUG nova.compute.manager [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:51:22 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Jan 21 23:51:22 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002b.scope: Consumed 6.685s CPU time.
Jan 21 23:51:22 compute-0 systemd-machined[154182]: Machine qemu-24-instance-0000002b terminated.
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.474 182939 INFO nova.virt.libvirt.driver [-] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Instance destroyed successfully.
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.475 182939 DEBUG nova.objects.instance [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lazy-loading 'resources' on Instance uuid 49bdd247-3853-4f47-8620-33c45e08fd56 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.490 182939 INFO nova.virt.libvirt.driver [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Deleting instance files /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56_del
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.491 182939 INFO nova.virt.libvirt.driver [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Deletion of /var/lib/nova/instances/49bdd247-3853-4f47-8620-33c45e08fd56_del complete
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.507 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.575 182939 INFO nova.compute.manager [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.576 182939 DEBUG oslo.service.loopingcall [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.576 182939 DEBUG nova.compute.manager [-] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.576 182939 DEBUG nova.network.neutron [-] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.782 182939 DEBUG nova.network.neutron [-] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.805 182939 DEBUG nova.network.neutron [-] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.826 182939 INFO nova.compute.manager [-] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Took 0.25 seconds to deallocate network for instance.
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.827 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.951 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.952 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.965 182939 DEBUG nova.compute.manager [req-5f05da9b-27b0-4eb4-871a-811eeeff6baa req-21f1e056-4ac1-4d69-8abf-a9460297ffb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Received event network-changed-022cfedf-447c-4fd1-9013-480f624fe044 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.965 182939 DEBUG nova.compute.manager [req-5f05da9b-27b0-4eb4-871a-811eeeff6baa req-21f1e056-4ac1-4d69-8abf-a9460297ffb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Refreshing instance network info cache due to event network-changed-022cfedf-447c-4fd1-9013-480f624fe044. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:51:22 compute-0 nova_compute[182935]: 2026-01-21 23:51:22.965 182939 DEBUG oslo_concurrency.lockutils [req-5f05da9b-27b0-4eb4-871a-811eeeff6baa req-21f1e056-4ac1-4d69-8abf-a9460297ffb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:23 compute-0 nova_compute[182935]: 2026-01-21 23:51:23.047 182939 DEBUG nova.compute.provider_tree [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:51:23 compute-0 nova_compute[182935]: 2026-01-21 23:51:23.063 182939 DEBUG nova.scheduler.client.report [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:51:23 compute-0 nova_compute[182935]: 2026-01-21 23:51:23.094 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:23 compute-0 nova_compute[182935]: 2026-01-21 23:51:23.125 182939 INFO nova.scheduler.client.report [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Deleted allocations for instance 49bdd247-3853-4f47-8620-33c45e08fd56
Jan 21 23:51:23 compute-0 nova_compute[182935]: 2026-01-21 23:51:23.216 182939 DEBUG oslo_concurrency.lockutils [None req-917f65a4-0c0a-40ee-9e90-ca36f9aab4b9 d0c4727b6f6e46339b56a8168cf80a7b c1e85e2b0f934b719d3ad4076dc719f2 - - default default] Lock "49bdd247-3853-4f47-8620-33c45e08fd56" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:23 compute-0 podman[217458]: 2026-01-21 23:51:23.712354291 +0000 UTC m=+0.066874468 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:51:23 compute-0 podman[217457]: 2026-01-21 23:51:23.740531827 +0000 UTC m=+0.100651004 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.117 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.117 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.143 182939 DEBUG nova.compute.manager [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.163 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Updating instance_info_cache with network_info: [{"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.203 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.204 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.206 182939 DEBUG oslo_concurrency.lockutils [req-5f05da9b-27b0-4eb4-871a-811eeeff6baa req-21f1e056-4ac1-4d69-8abf-a9460297ffb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.206 182939 DEBUG nova.network.neutron [req-5f05da9b-27b0-4eb4-871a-811eeeff6baa req-21f1e056-4ac1-4d69-8abf-a9460297ffb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Refreshing network info cache for port 022cfedf-447c-4fd1-9013-480f624fe044 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.208 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.208 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.209 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.245 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.246 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.246 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.246 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.248 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.249 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.255 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.255 182939 INFO nova.compute.claims [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.359 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.432 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.433 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.458 182939 DEBUG nova.compute.provider_tree [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.477 182939 DEBUG nova.scheduler.client.report [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.492 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.504 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.505 182939 DEBUG nova.compute.manager [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.583 182939 DEBUG nova.compute.manager [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.584 182939 DEBUG nova.network.neutron [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.627 182939 INFO nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.660 182939 DEBUG nova.compute.manager [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.682 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.683 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5548MB free_disk=73.24670028686523GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.684 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.684 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.692 182939 DEBUG oslo_concurrency.lockutils [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "f36ee268-83c4-4567-bae3-ee40afbb7882" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.692 182939 DEBUG oslo_concurrency.lockutils [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.693 182939 DEBUG oslo_concurrency.lockutils [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.693 182939 DEBUG oslo_concurrency.lockutils [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.693 182939 DEBUG oslo_concurrency.lockutils [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.704 182939 INFO nova.compute.manager [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Terminating instance
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.713 182939 DEBUG nova.compute.manager [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:51:24 compute-0 kernel: tap022cfedf-44 (unregistering): left promiscuous mode
Jan 21 23:51:24 compute-0 NetworkManager[55139]: <info>  [1769039484.7339] device (tap022cfedf-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:51:24 compute-0 ovn_controller[95047]: 2026-01-21T23:51:24Z|00161|binding|INFO|Releasing lport 022cfedf-447c-4fd1-9013-480f624fe044 from this chassis (sb_readonly=0)
Jan 21 23:51:24 compute-0 ovn_controller[95047]: 2026-01-21T23:51:24Z|00162|binding|INFO|Setting lport 022cfedf-447c-4fd1-9013-480f624fe044 down in Southbound
Jan 21 23:51:24 compute-0 ovn_controller[95047]: 2026-01-21T23:51:24Z|00163|binding|INFO|Removing iface tap022cfedf-44 ovn-installed in OVS
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.741 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.745 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:24.761 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:af:5b 10.100.0.8'], port_security=['fa:16:3e:a7:af:5b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f36ee268-83c4-4567-bae3-ee40afbb7882', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdcd24bf916b4c3aa2e173bea9dd7202', 'neutron:revision_number': '4', 'neutron:security_group_ids': '288c5115-ef70-4922-9c68-a1234762984e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e09721f1-c960-4ea4-8636-beb23b3dfb25, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=022cfedf-447c-4fd1-9013-480f624fe044) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:51:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:24.762 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 022cfedf-447c-4fd1-9013-480f624fe044 in datapath b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 unbound from our chassis
Jan 21 23:51:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:24.763 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:51:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:24.766 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca74765-1a37-4642-b13f-bec085a67228]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:24.768 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 namespace which is not needed anymore
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.769 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:24 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000027.scope: Deactivated successfully.
Jan 21 23:51:24 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000027.scope: Consumed 13.199s CPU time.
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.813 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance f36ee268-83c4-4567-bae3-ee40afbb7882 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.813 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 07512c08-85ed-4cd4-8f13-bb1698a30b8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.814 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.814 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:51:24 compute-0 systemd-machined[154182]: Machine qemu-23-instance-00000027 terminated.
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.841 182939 DEBUG nova.compute.manager [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.844 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.844 182939 INFO nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Creating image(s)
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.845 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.845 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.846 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.861 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:24 compute-0 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[217248]: [NOTICE]   (217252) : haproxy version is 2.8.14-c23fe91
Jan 21 23:51:24 compute-0 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[217248]: [NOTICE]   (217252) : path to executable is /usr/sbin/haproxy
Jan 21 23:51:24 compute-0 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[217248]: [WARNING]  (217252) : Exiting Master process...
Jan 21 23:51:24 compute-0 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[217248]: [ALERT]    (217252) : Current worker (217254) exited with code 143 (Terminated)
Jan 21 23:51:24 compute-0 neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8[217248]: [WARNING]  (217252) : All workers exited. Exiting... (0)
Jan 21 23:51:24 compute-0 systemd[1]: libpod-1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea.scope: Deactivated successfully.
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.917 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:51:24 compute-0 podman[217536]: 2026-01-21 23:51:24.920844758 +0000 UTC m=+0.046474703 container died 1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.924 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.925 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.926 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:24 compute-0 NetworkManager[55139]: <info>  [1769039484.9357] manager: (tap022cfedf-44): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.937 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea-userdata-shm.mount: Deactivated successfully.
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.961 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-af3097b8cf6d9ebcaf96149fe3e4d6a0586454705455182f48918e0d115cc45a-merged.mount: Deactivated successfully.
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.967 182939 DEBUG nova.policy [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.971 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:51:24 compute-0 podman[217536]: 2026-01-21 23:51:24.974899497 +0000 UTC m=+0.100529442 container cleanup 1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:51:24 compute-0 systemd[1]: libpod-conmon-1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea.scope: Deactivated successfully.
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.989 182939 INFO nova.virt.libvirt.driver [-] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Instance destroyed successfully.
Jan 21 23:51:24 compute-0 nova_compute[182935]: 2026-01-21 23:51:24.990 182939 DEBUG nova.objects.instance [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lazy-loading 'resources' on Instance uuid f36ee268-83c4-4567-bae3-ee40afbb7882 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.003 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.004 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.004 182939 DEBUG nova.virt.libvirt.vif [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1618416311',display_name='tempest-FloatingIPsAssociationTestJSON-server-1618416311',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1618416311',id=39,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:51:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bdcd24bf916b4c3aa2e173bea9dd7202',ramdisk_id='',reservation_id='r-nda2hwq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-1164348821',owner_user_name='tempest-FloatingIPsAssociationTestJSON-1164348821-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:51:02Z,user_data=None,user_id='8da2db8893d4442aaaada7d43ff2500f',uuid=f36ee268-83c4-4567-bae3-ee40afbb7882,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.005 182939 DEBUG nova.network.os_vif_util [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converting VIF {"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.006 182939 DEBUG nova.network.os_vif_util [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a7:af:5b,bridge_name='br-int',has_traffic_filtering=True,id=022cfedf-447c-4fd1-9013-480f624fe044,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap022cfedf-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.006 182939 DEBUG os_vif [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:af:5b,bridge_name='br-int',has_traffic_filtering=True,id=022cfedf-447c-4fd1-9013-480f624fe044,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap022cfedf-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.009 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.009 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap022cfedf-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.013 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.015 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.018 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.018 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:25 compute-0 podman[217588]: 2026-01-21 23:51:25.039932032 +0000 UTC m=+0.042134393 container remove 1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.045 182939 INFO os_vif [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:af:5b,bridge_name='br-int',has_traffic_filtering=True,id=022cfedf-447c-4fd1-9013-480f624fe044,network=Network(b94414b2-c7ed-4d1b-b462-f41cb84cbcd8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap022cfedf-44')
Jan 21 23:51:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:25.046 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[85335722-0b76-437e-bf9f-c3567273992d]: (4, ('Wed Jan 21 11:51:24 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 (1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea)\n1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea\nWed Jan 21 11:51:24 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 (1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea)\n1f8f5849afafc470841e8c93c25279ed6ba77a8d68331d77ea409bce37ea54ea\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.048 182939 INFO nova.virt.libvirt.driver [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Deleting instance files /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882_del
Jan 21 23:51:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:25.048 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b140aeab-fe4f-4f8e-a939-ea3a76978980]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.049 182939 INFO nova.virt.libvirt.driver [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Deletion of /var/lib/nova/instances/f36ee268-83c4-4567-bae3-ee40afbb7882_del complete
Jan 21 23:51:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:25.049 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb94414b2-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:25 compute-0 kernel: tapb94414b2-c0: left promiscuous mode
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.054 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.063 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.066 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.067 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.067 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:25.067 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a344b56a-c626-49b4-8c00-b6a6ce92b641]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:25.079 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[abd76a31-4227-4d7e-a179-9f43b609157b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:25.080 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a6714416-4cc0-4aec-8fd2-375243c0c7e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:25.101 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[de0e917d-dc51-4bbd-864f-b78f9e274485]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396477, 'reachable_time': 23970, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217610, 'error': None, 'target': 'ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:25.104 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b94414b2-c7ed-4d1b-b462-f41cb84cbcd8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:51:25 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:25.104 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[6a3598f8-a73e-4125-aaa2-f44bc095ebb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:25 compute-0 systemd[1]: run-netns-ovnmeta\x2db94414b2\x2dc7ed\x2d4d1b\x2db462\x2df41cb84cbcd8.mount: Deactivated successfully.
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.133 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.134 182939 DEBUG nova.virt.disk.api [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Checking if we can resize image /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.134 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.182 182939 INFO nova.compute.manager [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.183 182939 DEBUG oslo.service.loopingcall [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.183 182939 DEBUG nova.compute.manager [-] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.184 182939 DEBUG nova.network.neutron [-] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.193 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.202 182939 DEBUG nova.virt.disk.api [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Cannot resize image /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.202 182939 DEBUG nova.objects.instance [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'migration_context' on Instance uuid 07512c08-85ed-4cd4-8f13-bb1698a30b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.231 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.231 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Ensure instance console log exists: /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.232 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.232 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.232 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.597 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.599 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.603 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.603 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.603 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:51:25 compute-0 nova_compute[182935]: 2026-01-21 23:51:25.867 182939 DEBUG nova.network.neutron [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Successfully created port: cd6e5db4-4773-4806-9d60-534f8bd105be _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.540 182939 DEBUG nova.network.neutron [-] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.592 182939 INFO nova.compute.manager [-] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Took 1.41 seconds to deallocate network for instance.
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.716 182939 DEBUG oslo_concurrency.lockutils [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.717 182939 DEBUG oslo_concurrency.lockutils [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.732 182939 DEBUG nova.compute.manager [req-364d832b-f722-4ed6-9fed-49b6fdd2d23a req-d63569d7-6fc5-4e48-93e9-5cecc789f922 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Received event network-vif-deleted-022cfedf-447c-4fd1-9013-480f624fe044 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.738 182939 DEBUG nova.network.neutron [req-5f05da9b-27b0-4eb4-871a-811eeeff6baa req-21f1e056-4ac1-4d69-8abf-a9460297ffb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Updated VIF entry in instance network info cache for port 022cfedf-447c-4fd1-9013-480f624fe044. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.739 182939 DEBUG nova.network.neutron [req-5f05da9b-27b0-4eb4-871a-811eeeff6baa req-21f1e056-4ac1-4d69-8abf-a9460297ffb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Updating instance_info_cache with network_info: [{"id": "022cfedf-447c-4fd1-9013-480f624fe044", "address": "fa:16:3e:a7:af:5b", "network": {"id": "b94414b2-c7ed-4d1b-b462-f41cb84cbcd8", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-823172346-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bdcd24bf916b4c3aa2e173bea9dd7202", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap022cfedf-44", "ovs_interfaceid": "022cfedf-447c-4fd1-9013-480f624fe044", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.797 182939 DEBUG oslo_concurrency.lockutils [req-5f05da9b-27b0-4eb4-871a-811eeeff6baa req-21f1e056-4ac1-4d69-8abf-a9460297ffb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f36ee268-83c4-4567-bae3-ee40afbb7882" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.841 182939 DEBUG nova.compute.provider_tree [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.860 182939 DEBUG nova.scheduler.client.report [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.892 182939 DEBUG oslo_concurrency.lockutils [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:26 compute-0 nova_compute[182935]: 2026-01-21 23:51:26.920 182939 INFO nova.scheduler.client.report [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Deleted allocations for instance f36ee268-83c4-4567-bae3-ee40afbb7882
Jan 21 23:51:27 compute-0 nova_compute[182935]: 2026-01-21 23:51:27.023 182939 DEBUG oslo_concurrency.lockutils [None req-1d10abdc-a08d-4946-8916-baa55cf40fb4 8da2db8893d4442aaaada7d43ff2500f bdcd24bf916b4c3aa2e173bea9dd7202 - - default default] Lock "f36ee268-83c4-4567-bae3-ee40afbb7882" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.330s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:27 compute-0 nova_compute[182935]: 2026-01-21 23:51:27.041 182939 DEBUG nova.network.neutron [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Successfully updated port: cd6e5db4-4773-4806-9d60-534f8bd105be _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:51:27 compute-0 nova_compute[182935]: 2026-01-21 23:51:27.093 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "refresh_cache-07512c08-85ed-4cd4-8f13-bb1698a30b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:27 compute-0 nova_compute[182935]: 2026-01-21 23:51:27.094 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquired lock "refresh_cache-07512c08-85ed-4cd4-8f13-bb1698a30b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:27 compute-0 nova_compute[182935]: 2026-01-21 23:51:27.094 182939 DEBUG nova.network.neutron [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:51:27 compute-0 nova_compute[182935]: 2026-01-21 23:51:27.105 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:27 compute-0 nova_compute[182935]: 2026-01-21 23:51:27.227 182939 DEBUG nova.compute.manager [req-65f4ff5f-09e2-44c5-9da8-59d675fb96b8 req-a5bb1aff-372b-4311-84dc-a08303491ae7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received event network-changed-cd6e5db4-4773-4806-9d60-534f8bd105be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:27 compute-0 nova_compute[182935]: 2026-01-21 23:51:27.227 182939 DEBUG nova.compute.manager [req-65f4ff5f-09e2-44c5-9da8-59d675fb96b8 req-a5bb1aff-372b-4311-84dc-a08303491ae7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Refreshing instance network info cache due to event network-changed-cd6e5db4-4773-4806-9d60-534f8bd105be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:51:27 compute-0 nova_compute[182935]: 2026-01-21 23:51:27.228 182939 DEBUG oslo_concurrency.lockutils [req-65f4ff5f-09e2-44c5-9da8-59d675fb96b8 req-a5bb1aff-372b-4311-84dc-a08303491ae7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-07512c08-85ed-4cd4-8f13-bb1698a30b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:51:27 compute-0 nova_compute[182935]: 2026-01-21 23:51:27.837 182939 DEBUG nova.network.neutron [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.864 182939 DEBUG nova.network.neutron [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Updating instance_info_cache with network_info: [{"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.892 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Releasing lock "refresh_cache-07512c08-85ed-4cd4-8f13-bb1698a30b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.893 182939 DEBUG nova.compute.manager [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Instance network_info: |[{"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.893 182939 DEBUG oslo_concurrency.lockutils [req-65f4ff5f-09e2-44c5-9da8-59d675fb96b8 req-a5bb1aff-372b-4311-84dc-a08303491ae7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-07512c08-85ed-4cd4-8f13-bb1698a30b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.894 182939 DEBUG nova.network.neutron [req-65f4ff5f-09e2-44c5-9da8-59d675fb96b8 req-a5bb1aff-372b-4311-84dc-a08303491ae7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Refreshing network info cache for port cd6e5db4-4773-4806-9d60-534f8bd105be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.897 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Start _get_guest_xml network_info=[{"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.903 182939 WARNING nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.908 182939 DEBUG nova.virt.libvirt.host [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.908 182939 DEBUG nova.virt.libvirt.host [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.912 182939 DEBUG nova.virt.libvirt.host [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.912 182939 DEBUG nova.virt.libvirt.host [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.913 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.914 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.914 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.914 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.915 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.915 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.915 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.915 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.916 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.916 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.916 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.916 182939 DEBUG nova.virt.hardware [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.920 182939 DEBUG nova.virt.libvirt.vif [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:51:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-774587708',display_name='tempest-ServerDiskConfigTestJSON-server-774587708',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-774587708',id=45,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-kzhvn2tf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskCon
figTestJSON-1417790226-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:51:24Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=07512c08-85ed-4cd4-8f13-bb1698a30b8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.921 182939 DEBUG nova.network.os_vif_util [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.921 182939 DEBUG nova.network.os_vif_util [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.922 182939 DEBUG nova.objects.instance [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 07512c08-85ed-4cd4-8f13-bb1698a30b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.942 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:51:28 compute-0 nova_compute[182935]:   <uuid>07512c08-85ed-4cd4-8f13-bb1698a30b8c</uuid>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   <name>instance-0000002d</name>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-774587708</nova:name>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:51:28</nova:creationTime>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:51:28 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:51:28 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:51:28 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:51:28 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:51:28 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:51:28 compute-0 nova_compute[182935]:         <nova:user uuid="a7fb6bdd938b4fcdb749b0bc4f86f97e">tempest-ServerDiskConfigTestJSON-1417790226-project-member</nova:user>
Jan 21 23:51:28 compute-0 nova_compute[182935]:         <nova:project uuid="c09a5cf201e249f69f57cd4a632d1e2b">tempest-ServerDiskConfigTestJSON-1417790226</nova:project>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:51:28 compute-0 nova_compute[182935]:         <nova:port uuid="cd6e5db4-4773-4806-9d60-534f8bd105be">
Jan 21 23:51:28 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <system>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <entry name="serial">07512c08-85ed-4cd4-8f13-bb1698a30b8c</entry>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <entry name="uuid">07512c08-85ed-4cd4-8f13-bb1698a30b8c</entry>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     </system>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   <os>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   </os>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   <features>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   </features>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.config"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:18:a3:b9"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <target dev="tapcd6e5db4-47"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/console.log" append="off"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <video>
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     </video>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:51:28 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:51:28 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:51:28 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:51:28 compute-0 nova_compute[182935]: </domain>
Jan 21 23:51:28 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.943 182939 DEBUG nova.compute.manager [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Preparing to wait for external event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.943 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.944 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.944 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.945 182939 DEBUG nova.virt.libvirt.vif [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:51:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-774587708',display_name='tempest-ServerDiskConfigTestJSON-server-774587708',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-774587708',id=45,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-kzhvn2tf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:51:24Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=07512c08-85ed-4cd4-8f13-bb1698a30b8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.945 182939 DEBUG nova.network.os_vif_util [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.946 182939 DEBUG nova.network.os_vif_util [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.946 182939 DEBUG os_vif [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.947 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.947 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.948 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.955 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.956 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd6e5db4-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.957 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd6e5db4-47, col_values=(('external_ids', {'iface-id': 'cd6e5db4-4773-4806-9d60-534f8bd105be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:18:a3:b9', 'vm-uuid': '07512c08-85ed-4cd4-8f13-bb1698a30b8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:28 compute-0 NetworkManager[55139]: <info>  [1769039488.9619] manager: (tapcd6e5db4-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.960 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.964 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.971 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:28 compute-0 nova_compute[182935]: 2026-01-21 23:51:28.973 182939 INFO os_vif [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47')
Jan 21 23:51:29 compute-0 nova_compute[182935]: 2026-01-21 23:51:29.057 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:51:29 compute-0 nova_compute[182935]: 2026-01-21 23:51:29.058 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:51:29 compute-0 nova_compute[182935]: 2026-01-21 23:51:29.058 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No VIF found with MAC fa:16:3e:18:a3:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:51:29 compute-0 nova_compute[182935]: 2026-01-21 23:51:29.058 182939 INFO nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Using config drive
Jan 21 23:51:29 compute-0 nova_compute[182935]: 2026-01-21 23:51:29.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:51:29 compute-0 nova_compute[182935]: 2026-01-21 23:51:29.842 182939 INFO nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Creating config drive at /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.config
Jan 21 23:51:29 compute-0 nova_compute[182935]: 2026-01-21 23:51:29.854 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpni8phh0e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:29 compute-0 nova_compute[182935]: 2026-01-21 23:51:29.995 182939 DEBUG oslo_concurrency.processutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpni8phh0e" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:30 compute-0 kernel: tapcd6e5db4-47: entered promiscuous mode
Jan 21 23:51:30 compute-0 NetworkManager[55139]: <info>  [1769039490.0780] manager: (tapcd6e5db4-47): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Jan 21 23:51:30 compute-0 ovn_controller[95047]: 2026-01-21T23:51:30Z|00164|binding|INFO|Claiming lport cd6e5db4-4773-4806-9d60-534f8bd105be for this chassis.
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.079 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:30 compute-0 ovn_controller[95047]: 2026-01-21T23:51:30Z|00165|binding|INFO|cd6e5db4-4773-4806-9d60-534f8bd105be: Claiming fa:16:3e:18:a3:b9 10.100.0.9
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.091 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:a3:b9 10.100.0.9'], port_security=['fa:16:3e:18:a3:b9 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '07512c08-85ed-4cd4-8f13-bb1698a30b8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=cd6e5db4-4773-4806-9d60-534f8bd105be) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.093 104408 INFO neutron.agent.ovn.metadata.agent [-] Port cd6e5db4-4773-4806-9d60-534f8bd105be in datapath 7b586c54-3322-410f-9bc9-972a63b8deff bound to our chassis
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.097 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:51:30 compute-0 ovn_controller[95047]: 2026-01-21T23:51:30Z|00166|binding|INFO|Setting lport cd6e5db4-4773-4806-9d60-534f8bd105be up in Southbound
Jan 21 23:51:30 compute-0 ovn_controller[95047]: 2026-01-21T23:51:30Z|00167|binding|INFO|Setting lport cd6e5db4-4773-4806-9d60-534f8bd105be ovn-installed in OVS
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.112 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.114 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.114 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[eeaf7460-41b4-444e-a6c4-2e89d62a6fba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.116 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b586c54-31 in ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.118 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b586c54-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.118 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fda1f4be-9503-498c-a12a-8aab84111ac3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.119 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.121 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[304a76e1-e29e-48f6-ad94-72e33a0be312]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 systemd-udevd[217638]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:51:30 compute-0 systemd-machined[154182]: New machine qemu-25-instance-0000002d.
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.139 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[82b1c01e-054e-4913-b550-626b984f41b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 NetworkManager[55139]: <info>  [1769039490.1424] device (tapcd6e5db4-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:51:30 compute-0 NetworkManager[55139]: <info>  [1769039490.1431] device (tapcd6e5db4-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:51:30 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-0000002d.
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.172 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bb56a9d7-ead8-4664-bb41-58b87a67842d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.218 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b3c1797d-dc35-41c6-9242-f6680f5f4100]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.226 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[68f907cc-2d55-4808-bf68-bb344ea0cb3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 systemd-udevd[217641]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:51:30 compute-0 NetworkManager[55139]: <info>  [1769039490.2285] manager: (tap7b586c54-30): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.268 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[a66ab945-adc6-4a5e-a8ea-cf54226bda2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.272 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[15096d93-8574-4b2b-a588-bb51674d70f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 NetworkManager[55139]: <info>  [1769039490.2992] device (tap7b586c54-30): carrier: link connected
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.305 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[eb80acdd-a56f-4b3e-b8ed-5246f5d6682b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.326 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[26eb33e3-e380-4b34-b107-9e4e781df51f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399303, 'reachable_time': 38813, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217670, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.346 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3a1c5a90-3a4d-4fa2-b842-8800e5ff05df]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:a9f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399303, 'tstamp': 399303}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217671, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.373 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b9a74a5d-e60a-472f-84c1-36fa9d04ece5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399303, 'reachable_time': 38813, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217672, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.423 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[adc01fae-f347-4a64-9246-fe0a26db9c43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.496 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039490.4957888, 07512c08-85ed-4cd4-8f13-bb1698a30b8c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.497 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] VM Started (Lifecycle Event)
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.510 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8e020a7c-0aa0-4f01-903e-cb56b1b813c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.512 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.512 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.512 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b586c54-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:30 compute-0 kernel: tap7b586c54-30: entered promiscuous mode
Jan 21 23:51:30 compute-0 NetworkManager[55139]: <info>  [1769039490.5155] manager: (tap7b586c54-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.515 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.519 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b586c54-30, col_values=(('external_ids', {'iface-id': '52e5d5d5-be78-49fa-86d7-24ac4adf40c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:30 compute-0 ovn_controller[95047]: 2026-01-21T23:51:30Z|00168|binding|INFO|Releasing lport 52e5d5d5-be78-49fa-86d7-24ac4adf40c1 from this chassis (sb_readonly=0)
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.521 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.522 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.523 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.524 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[70a9c318-4ed4-418c-8711-4c580fbf7d3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.526 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:51:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:30.528 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'env', 'PROCESS_TAG=haproxy-7b586c54-3322-410f-9bc9-972a63b8deff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b586c54-3322-410f-9bc9-972a63b8deff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.527 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039490.4959462, 07512c08-85ed-4cd4-8f13-bb1698a30b8c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.528 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] VM Paused (Lifecycle Event)
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.548 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.552 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.573 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:51:30 compute-0 nova_compute[182935]: 2026-01-21 23:51:30.601 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:30 compute-0 podman[217710]: 2026-01-21 23:51:30.927269625 +0000 UTC m=+0.062752021 container create e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 21 23:51:30 compute-0 systemd[1]: Started libpod-conmon-e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c.scope.
Jan 21 23:51:30 compute-0 podman[217710]: 2026-01-21 23:51:30.889196388 +0000 UTC m=+0.024678824 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:51:31 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:51:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2c85cd8695d6bc44cac8e60e719474f0471eea69a3938c54842fc900e6a5154/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:51:31 compute-0 podman[217710]: 2026-01-21 23:51:31.027046549 +0000 UTC m=+0.162528975 container init e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:51:31 compute-0 podman[217710]: 2026-01-21 23:51:31.038160137 +0000 UTC m=+0.173642533 container start e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 21 23:51:31 compute-0 podman[217723]: 2026-01-21 23:51:31.060304242 +0000 UTC m=+0.088270826 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:51:31 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217726]: [NOTICE]   (217744) : New worker (217753) forked
Jan 21 23:51:31 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217726]: [NOTICE]   (217744) : Loading success.
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.182 182939 DEBUG nova.compute.manager [req-c6825734-c613-43cf-990e-c20872d6300d req-b14e56ad-ba55-44c1-b767-9d1537d000b3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.184 182939 DEBUG oslo_concurrency.lockutils [req-c6825734-c613-43cf-990e-c20872d6300d req-b14e56ad-ba55-44c1-b767-9d1537d000b3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.185 182939 DEBUG oslo_concurrency.lockutils [req-c6825734-c613-43cf-990e-c20872d6300d req-b14e56ad-ba55-44c1-b767-9d1537d000b3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.186 182939 DEBUG oslo_concurrency.lockutils [req-c6825734-c613-43cf-990e-c20872d6300d req-b14e56ad-ba55-44c1-b767-9d1537d000b3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.186 182939 DEBUG nova.compute.manager [req-c6825734-c613-43cf-990e-c20872d6300d req-b14e56ad-ba55-44c1-b767-9d1537d000b3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Processing event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.188 182939 DEBUG nova.compute.manager [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.193 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039491.1935992, 07512c08-85ed-4cd4-8f13-bb1698a30b8c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.194 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] VM Resumed (Lifecycle Event)
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.196 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.200 182939 INFO nova.virt.libvirt.driver [-] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Instance spawned successfully.
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.201 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.224 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.232 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.234 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.235 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.235 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.236 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.236 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.237 182939 DEBUG nova.virt.libvirt.driver [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.299 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.347 182939 INFO nova.compute.manager [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Took 6.50 seconds to spawn the instance on the hypervisor.
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.348 182939 DEBUG nova.compute.manager [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.437 182939 INFO nova.compute.manager [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Took 7.23 seconds to build instance.
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.456 182939 DEBUG oslo_concurrency.lockutils [None req-04d03bac-e27c-40aa-a60a-34c8b3e78620 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.901 182939 DEBUG nova.network.neutron [req-65f4ff5f-09e2-44c5-9da8-59d675fb96b8 req-a5bb1aff-372b-4311-84dc-a08303491ae7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Updated VIF entry in instance network info cache for port cd6e5db4-4773-4806-9d60-534f8bd105be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.901 182939 DEBUG nova.network.neutron [req-65f4ff5f-09e2-44c5-9da8-59d675fb96b8 req-a5bb1aff-372b-4311-84dc-a08303491ae7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Updating instance_info_cache with network_info: [{"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:31 compute-0 nova_compute[182935]: 2026-01-21 23:51:31.928 182939 DEBUG oslo_concurrency.lockutils [req-65f4ff5f-09e2-44c5-9da8-59d675fb96b8 req-a5bb1aff-372b-4311-84dc-a08303491ae7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-07512c08-85ed-4cd4-8f13-bb1698a30b8c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:51:33 compute-0 nova_compute[182935]: 2026-01-21 23:51:33.329 182939 DEBUG nova.compute.manager [req-72df0ef1-91d2-4d03-a55d-23b3b844fa9a req-7b1ff178-f100-4cb0-bd09-1fdb67e95c5a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:33 compute-0 nova_compute[182935]: 2026-01-21 23:51:33.329 182939 DEBUG oslo_concurrency.lockutils [req-72df0ef1-91d2-4d03-a55d-23b3b844fa9a req-7b1ff178-f100-4cb0-bd09-1fdb67e95c5a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:33 compute-0 nova_compute[182935]: 2026-01-21 23:51:33.330 182939 DEBUG oslo_concurrency.lockutils [req-72df0ef1-91d2-4d03-a55d-23b3b844fa9a req-7b1ff178-f100-4cb0-bd09-1fdb67e95c5a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:33 compute-0 nova_compute[182935]: 2026-01-21 23:51:33.330 182939 DEBUG oslo_concurrency.lockutils [req-72df0ef1-91d2-4d03-a55d-23b3b844fa9a req-7b1ff178-f100-4cb0-bd09-1fdb67e95c5a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:33 compute-0 nova_compute[182935]: 2026-01-21 23:51:33.330 182939 DEBUG nova.compute.manager [req-72df0ef1-91d2-4d03-a55d-23b3b844fa9a req-7b1ff178-f100-4cb0-bd09-1fdb67e95c5a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] No waiting events found dispatching network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:51:33 compute-0 nova_compute[182935]: 2026-01-21 23:51:33.331 182939 WARNING nova.compute.manager [req-72df0ef1-91d2-4d03-a55d-23b3b844fa9a req-7b1ff178-f100-4cb0-bd09-1fdb67e95c5a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received unexpected event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be for instance with vm_state active and task_state None.
Jan 21 23:51:33 compute-0 nova_compute[182935]: 2026-01-21 23:51:33.961 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:34 compute-0 sshd-session[217762]: Invalid user weblogic from 188.166.69.60 port 60688
Jan 21 23:51:34 compute-0 sshd-session[217762]: Connection closed by invalid user weblogic 188.166.69.60 port 60688 [preauth]
Jan 21 23:51:35 compute-0 nova_compute[182935]: 2026-01-21 23:51:35.213 182939 INFO nova.compute.manager [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Rebuilding instance
Jan 21 23:51:35 compute-0 nova_compute[182935]: 2026-01-21 23:51:35.604 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:35 compute-0 nova_compute[182935]: 2026-01-21 23:51:35.698 182939 DEBUG nova.compute.manager [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:35 compute-0 nova_compute[182935]: 2026-01-21 23:51:35.809 182939 DEBUG nova.objects.instance [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_requests' on Instance uuid 07512c08-85ed-4cd4-8f13-bb1698a30b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:35 compute-0 nova_compute[182935]: 2026-01-21 23:51:35.826 182939 DEBUG nova.objects.instance [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 07512c08-85ed-4cd4-8f13-bb1698a30b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:35 compute-0 nova_compute[182935]: 2026-01-21 23:51:35.842 182939 DEBUG nova.objects.instance [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'resources' on Instance uuid 07512c08-85ed-4cd4-8f13-bb1698a30b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:35 compute-0 nova_compute[182935]: 2026-01-21 23:51:35.855 182939 DEBUG nova.objects.instance [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'migration_context' on Instance uuid 07512c08-85ed-4cd4-8f13-bb1698a30b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:35 compute-0 nova_compute[182935]: 2026-01-21 23:51:35.869 182939 DEBUG nova.objects.instance [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:51:35 compute-0 nova_compute[182935]: 2026-01-21 23:51:35.873 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:51:36 compute-0 podman[217764]: 2026-01-21 23:51:36.706789829 +0000 UTC m=+0.073914262 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 21 23:51:37 compute-0 nova_compute[182935]: 2026-01-21 23:51:37.474 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039482.4726477, 49bdd247-3853-4f47-8620-33c45e08fd56 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:37 compute-0 nova_compute[182935]: 2026-01-21 23:51:37.475 182939 INFO nova.compute.manager [-] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] VM Stopped (Lifecycle Event)
Jan 21 23:51:37 compute-0 nova_compute[182935]: 2026-01-21 23:51:37.496 182939 DEBUG nova.compute.manager [None req-3d5c9dc7-1307-4fd7-bb95-655ec350fad3 - - - - - -] [instance: 49bdd247-3853-4f47-8620-33c45e08fd56] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:38 compute-0 nova_compute[182935]: 2026-01-21 23:51:38.964 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:39 compute-0 nova_compute[182935]: 2026-01-21 23:51:39.982 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039484.9810152, f36ee268-83c4-4567-bae3-ee40afbb7882 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:39 compute-0 nova_compute[182935]: 2026-01-21 23:51:39.982 182939 INFO nova.compute.manager [-] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] VM Stopped (Lifecycle Event)
Jan 21 23:51:40 compute-0 nova_compute[182935]: 2026-01-21 23:51:40.005 182939 DEBUG nova.compute.manager [None req-a6efd6c0-b065-45c7-97b5-d72eefc7f31d - - - - - -] [instance: f36ee268-83c4-4567-bae3-ee40afbb7882] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:40 compute-0 nova_compute[182935]: 2026-01-21 23:51:40.606 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:43 compute-0 podman[217799]: 2026-01-21 23:51:43.701686451 +0000 UTC m=+0.069951030 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64)
Jan 21 23:51:43 compute-0 podman[217800]: 2026-01-21 23:51:43.7115206 +0000 UTC m=+0.073464481 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 21 23:51:43 compute-0 nova_compute[182935]: 2026-01-21 23:51:43.966 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:44 compute-0 ovn_controller[95047]: 2026-01-21T23:51:44Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:18:a3:b9 10.100.0.9
Jan 21 23:51:44 compute-0 ovn_controller[95047]: 2026-01-21T23:51:44Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:18:a3:b9 10.100.0.9
Jan 21 23:51:45 compute-0 nova_compute[182935]: 2026-01-21 23:51:45.608 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:45 compute-0 nova_compute[182935]: 2026-01-21 23:51:45.922 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 23:51:47 compute-0 ovn_controller[95047]: 2026-01-21T23:51:47Z|00169|binding|INFO|Releasing lport 52e5d5d5-be78-49fa-86d7-24ac4adf40c1 from this chassis (sb_readonly=0)
Jan 21 23:51:47 compute-0 nova_compute[182935]: 2026-01-21 23:51:47.068 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:47 compute-0 ovn_controller[95047]: 2026-01-21T23:51:47Z|00170|binding|INFO|Releasing lport 52e5d5d5-be78-49fa-86d7-24ac4adf40c1 from this chassis (sb_readonly=0)
Jan 21 23:51:47 compute-0 nova_compute[182935]: 2026-01-21 23:51:47.182 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:47 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:47.921 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:51:47 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:47.923 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:51:47 compute-0 nova_compute[182935]: 2026-01-21 23:51:47.955 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:48 compute-0 kernel: tapcd6e5db4-47 (unregistering): left promiscuous mode
Jan 21 23:51:48 compute-0 NetworkManager[55139]: <info>  [1769039508.1646] device (tapcd6e5db4-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.176 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:48 compute-0 ovn_controller[95047]: 2026-01-21T23:51:48Z|00171|binding|INFO|Releasing lport cd6e5db4-4773-4806-9d60-534f8bd105be from this chassis (sb_readonly=0)
Jan 21 23:51:48 compute-0 ovn_controller[95047]: 2026-01-21T23:51:48Z|00172|binding|INFO|Setting lport cd6e5db4-4773-4806-9d60-534f8bd105be down in Southbound
Jan 21 23:51:48 compute-0 ovn_controller[95047]: 2026-01-21T23:51:48Z|00173|binding|INFO|Removing iface tapcd6e5db4-47 ovn-installed in OVS
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.181 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.194 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:a3:b9 10.100.0.9'], port_security=['fa:16:3e:18:a3:b9 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '07512c08-85ed-4cd4-8f13-bb1698a30b8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=cd6e5db4-4773-4806-9d60-534f8bd105be) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.195 104408 INFO neutron.agent.ovn.metadata.agent [-] Port cd6e5db4-4773-4806-9d60-534f8bd105be in datapath 7b586c54-3322-410f-9bc9-972a63b8deff unbound from our chassis
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.197 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b586c54-3322-410f-9bc9-972a63b8deff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.198 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[18c732ea-2dcf-4cd1-bdb5-94fcb311b89f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.199 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace which is not needed anymore
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.203 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:48 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Jan 21 23:51:48 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002d.scope: Consumed 14.428s CPU time.
Jan 21 23:51:48 compute-0 systemd-machined[154182]: Machine qemu-25-instance-0000002d terminated.
Jan 21 23:51:48 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217726]: [NOTICE]   (217744) : haproxy version is 2.8.14-c23fe91
Jan 21 23:51:48 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217726]: [NOTICE]   (217744) : path to executable is /usr/sbin/haproxy
Jan 21 23:51:48 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217726]: [WARNING]  (217744) : Exiting Master process...
Jan 21 23:51:48 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217726]: [ALERT]    (217744) : Current worker (217753) exited with code 143 (Terminated)
Jan 21 23:51:48 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[217726]: [WARNING]  (217744) : All workers exited. Exiting... (0)
Jan 21 23:51:48 compute-0 systemd[1]: libpod-e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c.scope: Deactivated successfully.
Jan 21 23:51:48 compute-0 podman[217865]: 2026-01-21 23:51:48.367326051 +0000 UTC m=+0.051701155 container died e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:51:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c-userdata-shm.mount: Deactivated successfully.
Jan 21 23:51:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-a2c85cd8695d6bc44cac8e60e719474f0471eea69a3938c54842fc900e6a5154-merged.mount: Deactivated successfully.
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.404 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.412 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:48 compute-0 podman[217865]: 2026-01-21 23:51:48.414393687 +0000 UTC m=+0.098768761 container cleanup e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:51:48 compute-0 systemd[1]: libpod-conmon-e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c.scope: Deactivated successfully.
Jan 21 23:51:48 compute-0 podman[217907]: 2026-01-21 23:51:48.483960447 +0000 UTC m=+0.041645771 container remove e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.491 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e1d84198-f5c9-4b52-95e6-c6f077311b6b]: (4, ('Wed Jan 21 11:51:48 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c)\ne67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c\nWed Jan 21 11:51:48 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (e67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c)\ne67f252230faad41dd6fbdab39faa09b099f8734b3683c3af64038769cb23a1c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.494 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[caa5e0fc-ad7a-4e72-be3d-ea127ce7cb66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.496 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.498 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:48 compute-0 kernel: tap7b586c54-30: left promiscuous mode
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.524 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.529 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a9544d3d-9ada-4981-b263-10dc0b4f0df3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.545 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[27b65271-bb7d-419f-9da1-46eb55c57da2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.548 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dd42555a-9ad1-4563-a2a9-3d0758115ea7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.568 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b823f4af-5e29-448a-afaf-87691fd7e5d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399294, 'reachable_time': 41972, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217930, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.572 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:51:48 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:48.572 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[b2188c61-85b3-4e4b-a7e9-a34b56ff78ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:48 compute-0 systemd[1]: run-netns-ovnmeta\x2d7b586c54\x2d3322\x2d410f\x2d9bc9\x2d972a63b8deff.mount: Deactivated successfully.
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.641 182939 DEBUG nova.compute.manager [req-7dae8e7d-2ed5-44c6-a73e-0390be4de527 req-fcaed026-0490-4d91-a2b5-86b80660f8af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received event network-vif-unplugged-cd6e5db4-4773-4806-9d60-534f8bd105be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.642 182939 DEBUG oslo_concurrency.lockutils [req-7dae8e7d-2ed5-44c6-a73e-0390be4de527 req-fcaed026-0490-4d91-a2b5-86b80660f8af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.643 182939 DEBUG oslo_concurrency.lockutils [req-7dae8e7d-2ed5-44c6-a73e-0390be4de527 req-fcaed026-0490-4d91-a2b5-86b80660f8af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.643 182939 DEBUG oslo_concurrency.lockutils [req-7dae8e7d-2ed5-44c6-a73e-0390be4de527 req-fcaed026-0490-4d91-a2b5-86b80660f8af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.643 182939 DEBUG nova.compute.manager [req-7dae8e7d-2ed5-44c6-a73e-0390be4de527 req-fcaed026-0490-4d91-a2b5-86b80660f8af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] No waiting events found dispatching network-vif-unplugged-cd6e5db4-4773-4806-9d60-534f8bd105be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.644 182939 WARNING nova.compute.manager [req-7dae8e7d-2ed5-44c6-a73e-0390be4de527 req-fcaed026-0490-4d91-a2b5-86b80660f8af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received unexpected event network-vif-unplugged-cd6e5db4-4773-4806-9d60-534f8bd105be for instance with vm_state active and task_state rebuilding.
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.944 182939 INFO nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Instance shutdown successfully after 13 seconds.
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.952 182939 INFO nova.virt.libvirt.driver [-] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Instance destroyed successfully.
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.960 182939 INFO nova.virt.libvirt.driver [-] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Instance destroyed successfully.
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.962 182939 DEBUG nova.virt.libvirt.vif [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:51:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-774587708',display_name='tempest-ServerDiskConfigTestJSON-server-774587708',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-774587708',id=45,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:51:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-kzhvn2tf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:51:34Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=07512c08-85ed-4cd4-8f13-bb1698a30b8c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.962 182939 DEBUG nova.network.os_vif_util [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.963 182939 DEBUG nova.network.os_vif_util [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.964 182939 DEBUG os_vif [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.965 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:48 compute-0 nova_compute[182935]: 2026-01-21 23:51:48.966 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd6e5db4-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.000 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.003 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.006 182939 INFO os_vif [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47')
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.007 182939 INFO nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Deleting instance files /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c_del
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.008 182939 INFO nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Deletion of /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c_del complete
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.250 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.251 182939 INFO nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Creating image(s)
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.251 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.252 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.252 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.265 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.357 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.358 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.359 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.376 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.443 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.445 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.481 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.482 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.483 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.552 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.553 182939 DEBUG nova.virt.disk.api [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Checking if we can resize image /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.554 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.617 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.618 182939 DEBUG nova.virt.disk.api [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Cannot resize image /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.619 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.619 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Ensure instance console log exists: /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.620 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.620 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.621 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.623 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Start _get_guest_xml network_info=[{"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.629 182939 WARNING nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.639 182939 DEBUG nova.virt.libvirt.host [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.640 182939 DEBUG nova.virt.libvirt.host [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.644 182939 DEBUG nova.virt.libvirt.host [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.645 182939 DEBUG nova.virt.libvirt.host [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.647 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.648 182939 DEBUG nova.virt.hardware [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.649 182939 DEBUG nova.virt.hardware [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.649 182939 DEBUG nova.virt.hardware [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.650 182939 DEBUG nova.virt.hardware [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.650 182939 DEBUG nova.virt.hardware [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.651 182939 DEBUG nova.virt.hardware [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.651 182939 DEBUG nova.virt.hardware [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.652 182939 DEBUG nova.virt.hardware [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.652 182939 DEBUG nova.virt.hardware [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.652 182939 DEBUG nova.virt.hardware [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.653 182939 DEBUG nova.virt.hardware [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.653 182939 DEBUG nova.objects.instance [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 07512c08-85ed-4cd4-8f13-bb1698a30b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.686 182939 DEBUG nova.virt.libvirt.vif [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:51:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-774587708',display_name='tempest-ServerDiskConfigTestJSON-server-774587708',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-774587708',id=45,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:51:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-kzhvn2tf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:51:49Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=07512c08-85ed-4cd4-8f13-bb1698a30b8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.687 182939 DEBUG nova.network.os_vif_util [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.689 182939 DEBUG nova.network.os_vif_util [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.692 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:51:49 compute-0 nova_compute[182935]:   <uuid>07512c08-85ed-4cd4-8f13-bb1698a30b8c</uuid>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   <name>instance-0000002d</name>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-774587708</nova:name>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:51:49</nova:creationTime>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:51:49 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:51:49 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:51:49 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:51:49 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:51:49 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:51:49 compute-0 nova_compute[182935]:         <nova:user uuid="a7fb6bdd938b4fcdb749b0bc4f86f97e">tempest-ServerDiskConfigTestJSON-1417790226-project-member</nova:user>
Jan 21 23:51:49 compute-0 nova_compute[182935]:         <nova:project uuid="c09a5cf201e249f69f57cd4a632d1e2b">tempest-ServerDiskConfigTestJSON-1417790226</nova:project>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="3e1dda74-3c6a-4d29-8792-32134d1c36c5"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:51:49 compute-0 nova_compute[182935]:         <nova:port uuid="cd6e5db4-4773-4806-9d60-534f8bd105be">
Jan 21 23:51:49 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <system>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <entry name="serial">07512c08-85ed-4cd4-8f13-bb1698a30b8c</entry>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <entry name="uuid">07512c08-85ed-4cd4-8f13-bb1698a30b8c</entry>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     </system>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   <os>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   </os>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   <features>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   </features>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.config"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:18:a3:b9"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <target dev="tapcd6e5db4-47"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/console.log" append="off"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <video>
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     </video>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:51:49 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:51:49 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:51:49 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:51:49 compute-0 nova_compute[182935]: </domain>
Jan 21 23:51:49 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.694 182939 DEBUG nova.compute.manager [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Preparing to wait for external event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.696 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.696 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.697 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.698 182939 DEBUG nova.virt.libvirt.vif [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:51:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-774587708',display_name='tempest-ServerDiskConfigTestJSON-server-774587708',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-774587708',id=45,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:51:31Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-kzhvn2tf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:51:49Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=07512c08-85ed-4cd4-8f13-bb1698a30b8c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.698 182939 DEBUG nova.network.os_vif_util [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.699 182939 DEBUG nova.network.os_vif_util [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.699 182939 DEBUG os_vif [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.700 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.700 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.701 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.705 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.705 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd6e5db4-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.706 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd6e5db4-47, col_values=(('external_ids', {'iface-id': 'cd6e5db4-4773-4806-9d60-534f8bd105be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:18:a3:b9', 'vm-uuid': '07512c08-85ed-4cd4-8f13-bb1698a30b8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.708 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:49 compute-0 NetworkManager[55139]: <info>  [1769039509.7088] manager: (tapcd6e5db4-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.711 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.714 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.715 182939 INFO os_vif [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47')
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.785 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.785 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.786 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No VIF found with MAC fa:16:3e:18:a3:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.786 182939 INFO nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Using config drive
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.806 182939 DEBUG nova.objects.instance [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'ec2_ids' on Instance uuid 07512c08-85ed-4cd4-8f13-bb1698a30b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:49 compute-0 nova_compute[182935]: 2026-01-21 23:51:49.850 182939 DEBUG nova.objects.instance [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'keypairs' on Instance uuid 07512c08-85ed-4cd4-8f13-bb1698a30b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:50 compute-0 nova_compute[182935]: 2026-01-21 23:51:50.610 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:50 compute-0 nova_compute[182935]: 2026-01-21 23:51:50.799 182939 DEBUG nova.compute.manager [req-d76b7f4a-cf58-4c74-a09d-64a609739d38 req-79fa2a43-165c-41e9-ac68-c20afcab4cbb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:50 compute-0 nova_compute[182935]: 2026-01-21 23:51:50.799 182939 DEBUG oslo_concurrency.lockutils [req-d76b7f4a-cf58-4c74-a09d-64a609739d38 req-79fa2a43-165c-41e9-ac68-c20afcab4cbb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:50 compute-0 nova_compute[182935]: 2026-01-21 23:51:50.799 182939 DEBUG oslo_concurrency.lockutils [req-d76b7f4a-cf58-4c74-a09d-64a609739d38 req-79fa2a43-165c-41e9-ac68-c20afcab4cbb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:50 compute-0 nova_compute[182935]: 2026-01-21 23:51:50.800 182939 DEBUG oslo_concurrency.lockutils [req-d76b7f4a-cf58-4c74-a09d-64a609739d38 req-79fa2a43-165c-41e9-ac68-c20afcab4cbb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:50 compute-0 nova_compute[182935]: 2026-01-21 23:51:50.800 182939 DEBUG nova.compute.manager [req-d76b7f4a-cf58-4c74-a09d-64a609739d38 req-79fa2a43-165c-41e9-ac68-c20afcab4cbb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Processing event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:51:50 compute-0 nova_compute[182935]: 2026-01-21 23:51:50.860 182939 INFO nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Creating config drive at /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.config
Jan 21 23:51:50 compute-0 nova_compute[182935]: 2026-01-21 23:51:50.865 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt7vxqg7d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.000 182939 DEBUG oslo_concurrency.processutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt7vxqg7d" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:51:51 compute-0 kernel: tapcd6e5db4-47: entered promiscuous mode
Jan 21 23:51:51 compute-0 NetworkManager[55139]: <info>  [1769039511.0745] manager: (tapcd6e5db4-47): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Jan 21 23:51:51 compute-0 ovn_controller[95047]: 2026-01-21T23:51:51Z|00174|binding|INFO|Claiming lport cd6e5db4-4773-4806-9d60-534f8bd105be for this chassis.
Jan 21 23:51:51 compute-0 systemd-udevd[217844]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.074 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:51 compute-0 ovn_controller[95047]: 2026-01-21T23:51:51Z|00175|binding|INFO|cd6e5db4-4773-4806-9d60-534f8bd105be: Claiming fa:16:3e:18:a3:b9 10.100.0.9
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.083 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:a3:b9 10.100.0.9'], port_security=['fa:16:3e:18:a3:b9 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '07512c08-85ed-4cd4-8f13-bb1698a30b8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=cd6e5db4-4773-4806-9d60-534f8bd105be) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.085 104408 INFO neutron.agent.ovn.metadata.agent [-] Port cd6e5db4-4773-4806-9d60-534f8bd105be in datapath 7b586c54-3322-410f-9bc9-972a63b8deff bound to our chassis
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.085 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:51:51 compute-0 ovn_controller[95047]: 2026-01-21T23:51:51Z|00176|binding|INFO|Setting lport cd6e5db4-4773-4806-9d60-534f8bd105be ovn-installed in OVS
Jan 21 23:51:51 compute-0 ovn_controller[95047]: 2026-01-21T23:51:51Z|00177|binding|INFO|Setting lport cd6e5db4-4773-4806-9d60-534f8bd105be up in Southbound
Jan 21 23:51:51 compute-0 NetworkManager[55139]: <info>  [1769039511.0901] device (tapcd6e5db4-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:51:51 compute-0 NetworkManager[55139]: <info>  [1769039511.0915] device (tapcd6e5db4-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.095 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.102 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a0dc1be3-17a9-4dc5-8a46-286573f8261a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.103 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b586c54-31 in ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.105 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b586c54-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.105 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3a5590-82c1-4edc-9e67-e9cf6c23c3f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.106 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b0edf8-3f9a-453d-ac25-452c15a11b17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.121 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[a460fc52-2e07-46de-8211-6a2680882ec2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 systemd-machined[154182]: New machine qemu-26-instance-0000002d.
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.139 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[079ffc4e-1ee8-48fd-a2d4-d3a96343066b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-0000002d.
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.175 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[605721ce-b93e-412f-a72a-27d9bb413101]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 NetworkManager[55139]: <info>  [1769039511.1834] manager: (tap7b586c54-30): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.182 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7bdb63a2-360f-4e2a-9069-1031ffafea6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 systemd-udevd[217978]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.232 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d05a095c-1849-48ef-b14d-ff64a7c8af3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.239 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[e1142cfd-af9d-4f47-ad19-aa23875e41ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 NetworkManager[55139]: <info>  [1769039511.2693] device (tap7b586c54-30): carrier: link connected
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.275 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[3334b013-8f8e-45dd-8d54-b4d2a84ba9b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.296 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[255e924b-c52c-44bd-b10c-2aae3f664ff6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401400, 'reachable_time': 25322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218000, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.311 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9fea15a7-f6a1-40e6-b208-63d378b1bf27]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:a9f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 401400, 'tstamp': 401400}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218001, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.331 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c82fae8f-2a3c-4d9f-b78e-5cc1a7233317]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401400, 'reachable_time': 25322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218002, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.369 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d0339018-437d-4807-9910-86562c86a050]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.430 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f22a8c5d-b39e-48a2-9e17-a149271eb28b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.432 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.432 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.432 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b586c54-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:51 compute-0 kernel: tap7b586c54-30: entered promiscuous mode
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.434 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:51 compute-0 NetworkManager[55139]: <info>  [1769039511.4394] manager: (tap7b586c54-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.442 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b586c54-30, col_values=(('external_ids', {'iface-id': '52e5d5d5-be78-49fa-86d7-24ac4adf40c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.443 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:51 compute-0 ovn_controller[95047]: 2026-01-21T23:51:51Z|00178|binding|INFO|Releasing lport 52e5d5d5-be78-49fa-86d7-24ac4adf40c1 from this chassis (sb_readonly=0)
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.455 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.456 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.457 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b27a138c-c147-4da0-81e9-9ee9f8d51467]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.457 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:51:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:51.458 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'env', 'PROCESS_TAG=haproxy-7b586c54-3322-410f-9bc9-972a63b8deff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b586c54-3322-410f-9bc9-972a63b8deff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:51:51 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.838 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for 07512c08-85ed-4cd4-8f13-bb1698a30b8c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.838 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039511.8372536, 07512c08-85ed-4cd4-8f13-bb1698a30b8c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.839 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] VM Started (Lifecycle Event)
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.842 182939 DEBUG nova.compute.manager [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.846 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:51:51 compute-0 podman[218042]: 2026-01-21 23:51:51.847884489 +0000 UTC m=+0.059752043 container create 8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.851 182939 INFO nova.virt.libvirt.driver [-] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Instance spawned successfully.
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.851 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:51:51 compute-0 systemd[1]: Started libpod-conmon-8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129.scope.
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.890 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.896 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.899 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.899 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.899 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.900 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.900 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:51 compute-0 nova_compute[182935]: 2026-01-21 23:51:51.901 182939 DEBUG nova.virt.libvirt.driver [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:51:51 compute-0 podman[218042]: 2026-01-21 23:51:51.81570281 +0000 UTC m=+0.027570384 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:51:51 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:51:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/afe4dbe32ac876b0a7a58da1a6166a12f97db442d86f2fd4d84b54ede75ae594/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:51:51 compute-0 podman[218042]: 2026-01-21 23:51:51.943944266 +0000 UTC m=+0.155811840 container init 8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 23:51:51 compute-0 podman[218042]: 2026-01-21 23:51:51.950039988 +0000 UTC m=+0.161907542 container start 8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:51:51 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218057]: [NOTICE]   (218061) : New worker (218063) forked
Jan 21 23:51:51 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218057]: [NOTICE]   (218061) : Loading success.
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.179 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.180 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039511.8386996, 07512c08-85ed-4cd4-8f13-bb1698a30b8c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.180 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] VM Paused (Lifecycle Event)
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.209 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.213 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039511.845261, 07512c08-85ed-4cd4-8f13-bb1698a30b8c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.214 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] VM Resumed (Lifecycle Event)
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.239 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.243 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.250 182939 DEBUG nova.compute.manager [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.278 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.403 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.404 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.405 182939 DEBUG nova.objects.instance [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.519 182939 DEBUG oslo_concurrency.lockutils [None req-4ef31ecf-da05-4597-891c-1666f0cf811c a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.904 182939 DEBUG nova.compute.manager [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.905 182939 DEBUG oslo_concurrency.lockutils [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.906 182939 DEBUG oslo_concurrency.lockutils [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.907 182939 DEBUG oslo_concurrency.lockutils [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.907 182939 DEBUG nova.compute.manager [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] No waiting events found dispatching network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.907 182939 WARNING nova.compute.manager [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received unexpected event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be for instance with vm_state active and task_state None.
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.908 182939 DEBUG nova.compute.manager [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.908 182939 DEBUG oslo_concurrency.lockutils [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.909 182939 DEBUG oslo_concurrency.lockutils [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.909 182939 DEBUG oslo_concurrency.lockutils [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.910 182939 DEBUG nova.compute.manager [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] No waiting events found dispatching network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:51:52 compute-0 nova_compute[182935]: 2026-01-21 23:51:52.910 182939 WARNING nova.compute.manager [req-c4715bcf-f4c3-41f3-97f9-4ae3daada626 req-7a12c9f1-9439-4753-af55-239b68461af9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received unexpected event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be for instance with vm_state active and task_state None.
Jan 21 23:51:54 compute-0 podman[218073]: 2026-01-21 23:51:54.703152738 +0000 UTC m=+0.065170838 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:51:54 compute-0 nova_compute[182935]: 2026-01-21 23:51:54.708 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:54 compute-0 podman[218072]: 2026-01-21 23:51:54.733653658 +0000 UTC m=+0.102054567 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.122 182939 DEBUG oslo_concurrency.lockutils [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.123 182939 DEBUG oslo_concurrency.lockutils [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.124 182939 DEBUG oslo_concurrency.lockutils [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.124 182939 DEBUG oslo_concurrency.lockutils [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.125 182939 DEBUG oslo_concurrency.lockutils [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.137 182939 INFO nova.compute.manager [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Terminating instance
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.154 182939 DEBUG nova.compute.manager [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:51:55 compute-0 kernel: tapcd6e5db4-47 (unregistering): left promiscuous mode
Jan 21 23:51:55 compute-0 NetworkManager[55139]: <info>  [1769039515.1777] device (tapcd6e5db4-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:51:55 compute-0 ovn_controller[95047]: 2026-01-21T23:51:55Z|00179|binding|INFO|Releasing lport cd6e5db4-4773-4806-9d60-534f8bd105be from this chassis (sb_readonly=0)
Jan 21 23:51:55 compute-0 ovn_controller[95047]: 2026-01-21T23:51:55Z|00180|binding|INFO|Setting lport cd6e5db4-4773-4806-9d60-534f8bd105be down in Southbound
Jan 21 23:51:55 compute-0 ovn_controller[95047]: 2026-01-21T23:51:55Z|00181|binding|INFO|Removing iface tapcd6e5db4-47 ovn-installed in OVS
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.186 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.188 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.195 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:a3:b9 10.100.0.9'], port_security=['fa:16:3e:18:a3:b9 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '07512c08-85ed-4cd4-8f13-bb1698a30b8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=cd6e5db4-4773-4806-9d60-534f8bd105be) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.197 104408 INFO neutron.agent.ovn.metadata.agent [-] Port cd6e5db4-4773-4806-9d60-534f8bd105be in datapath 7b586c54-3322-410f-9bc9-972a63b8deff unbound from our chassis
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.198 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b586c54-3322-410f-9bc9-972a63b8deff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.200 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5cda7d-56e6-45e9-8d96-8ccdfc0c1465]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.200 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace which is not needed anymore
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.214 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:55 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Jan 21 23:51:55 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000002d.scope: Consumed 3.910s CPU time.
Jan 21 23:51:55 compute-0 systemd-machined[154182]: Machine qemu-26-instance-0000002d terminated.
Jan 21 23:51:55 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218057]: [NOTICE]   (218061) : haproxy version is 2.8.14-c23fe91
Jan 21 23:51:55 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218057]: [NOTICE]   (218061) : path to executable is /usr/sbin/haproxy
Jan 21 23:51:55 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218057]: [WARNING]  (218061) : Exiting Master process...
Jan 21 23:51:55 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218057]: [ALERT]    (218061) : Current worker (218063) exited with code 143 (Terminated)
Jan 21 23:51:55 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218057]: [WARNING]  (218061) : All workers exited. Exiting... (0)
Jan 21 23:51:55 compute-0 systemd[1]: libpod-8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129.scope: Deactivated successfully.
Jan 21 23:51:55 compute-0 podman[218150]: 2026-01-21 23:51:55.373214469 +0000 UTC m=+0.058968514 container died 8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:51:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129-userdata-shm.mount: Deactivated successfully.
Jan 21 23:51:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-afe4dbe32ac876b0a7a58da1a6166a12f97db442d86f2fd4d84b54ede75ae594-merged.mount: Deactivated successfully.
Jan 21 23:51:55 compute-0 podman[218150]: 2026-01-21 23:51:55.425309992 +0000 UTC m=+0.111064007 container cleanup 8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.440 182939 INFO nova.virt.libvirt.driver [-] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Instance destroyed successfully.
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.441 182939 DEBUG nova.objects.instance [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'resources' on Instance uuid 07512c08-85ed-4cd4-8f13-bb1698a30b8c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:51:55 compute-0 systemd[1]: libpod-conmon-8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129.scope: Deactivated successfully.
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.468 182939 DEBUG nova.virt.libvirt.vif [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:51:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-774587708',display_name='tempest-ServerDiskConfigTestJSON-server-774587708',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-774587708',id=45,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:51:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-kzhvn2tf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:51:52Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=07512c08-85ed-4cd4-8f13-bb1698a30b8c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.469 182939 DEBUG nova.network.os_vif_util [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "cd6e5db4-4773-4806-9d60-534f8bd105be", "address": "fa:16:3e:18:a3:b9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd6e5db4-47", "ovs_interfaceid": "cd6e5db4-4773-4806-9d60-534f8bd105be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.470 182939 DEBUG nova.network.os_vif_util [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.471 182939 DEBUG os_vif [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.473 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.473 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd6e5db4-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.513 182939 DEBUG nova.compute.manager [req-3189c656-e8fa-4f4e-a0a3-12ca018c461f req-5313430e-c6cd-47e4-bcc3-8ae2070b646d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received event network-vif-unplugged-cd6e5db4-4773-4806-9d60-534f8bd105be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.514 182939 DEBUG oslo_concurrency.lockutils [req-3189c656-e8fa-4f4e-a0a3-12ca018c461f req-5313430e-c6cd-47e4-bcc3-8ae2070b646d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.514 182939 DEBUG oslo_concurrency.lockutils [req-3189c656-e8fa-4f4e-a0a3-12ca018c461f req-5313430e-c6cd-47e4-bcc3-8ae2070b646d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.515 182939 DEBUG oslo_concurrency.lockutils [req-3189c656-e8fa-4f4e-a0a3-12ca018c461f req-5313430e-c6cd-47e4-bcc3-8ae2070b646d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.515 182939 DEBUG nova.compute.manager [req-3189c656-e8fa-4f4e-a0a3-12ca018c461f req-5313430e-c6cd-47e4-bcc3-8ae2070b646d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] No waiting events found dispatching network-vif-unplugged-cd6e5db4-4773-4806-9d60-534f8bd105be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.515 182939 DEBUG nova.compute.manager [req-3189c656-e8fa-4f4e-a0a3-12ca018c461f req-5313430e-c6cd-47e4-bcc3-8ae2070b646d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received event network-vif-unplugged-cd6e5db4-4773-4806-9d60-534f8bd105be for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.516 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.519 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.524 182939 INFO os_vif [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:18:a3:b9,bridge_name='br-int',has_traffic_filtering=True,id=cd6e5db4-4773-4806-9d60-534f8bd105be,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd6e5db4-47')
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.525 182939 INFO nova.virt.libvirt.driver [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Deleting instance files /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c_del
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.526 182939 INFO nova.virt.libvirt.driver [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Deletion of /var/lib/nova/instances/07512c08-85ed-4cd4-8f13-bb1698a30b8c_del complete
Jan 21 23:51:55 compute-0 podman[218194]: 2026-01-21 23:51:55.553850635 +0000 UTC m=+0.089230799 container remove 8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.561 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[12247e78-e3ad-4197-8f15-1e2dc8eabe8e]: (4, ('Wed Jan 21 11:51:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129)\n8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129\nWed Jan 21 11:51:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129)\n8923732be07c325d14b840c2f431a297916bb387e632166448f520c7af566129\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.563 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[332bf7c9-da29-4d9c-9e05-fcbc07bc9241]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.565 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.567 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:55 compute-0 kernel: tap7b586c54-30: left promiscuous mode
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.593 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.596 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[da3c8284-65a7-46fd-aa7c-cd616407b18d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.611 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.616 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[046249cc-10b1-440c-9632-2ea6dc525fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.617 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7981b009-c30c-470c-92fb-5bd2a157d254]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.627 182939 INFO nova.compute.manager [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.628 182939 DEBUG oslo.service.loopingcall [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.629 182939 DEBUG nova.compute.manager [-] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:51:55 compute-0 nova_compute[182935]: 2026-01-21 23:51:55.629 182939 DEBUG nova.network.neutron [-] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.643 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3f99eb67-773a-4f06-931a-c43fd70fe6c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 401390, 'reachable_time': 31844, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218211, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.647 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:51:55 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:55.647 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[cb7d9617-8a34-4d1f-960b-caa9be48ddb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:51:55 compute-0 systemd[1]: run-netns-ovnmeta\x2d7b586c54\x2d3322\x2d410f\x2d9bc9\x2d972a63b8deff.mount: Deactivated successfully.
Jan 21 23:51:56 compute-0 nova_compute[182935]: 2026-01-21 23:51:56.526 182939 DEBUG nova.network.neutron [-] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:51:56 compute-0 nova_compute[182935]: 2026-01-21 23:51:56.547 182939 INFO nova.compute.manager [-] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Took 0.92 seconds to deallocate network for instance.
Jan 21 23:51:56 compute-0 nova_compute[182935]: 2026-01-21 23:51:56.641 182939 DEBUG oslo_concurrency.lockutils [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:56 compute-0 nova_compute[182935]: 2026-01-21 23:51:56.642 182939 DEBUG oslo_concurrency.lockutils [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:56 compute-0 nova_compute[182935]: 2026-01-21 23:51:56.713 182939 DEBUG nova.compute.provider_tree [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:51:56 compute-0 nova_compute[182935]: 2026-01-21 23:51:56.738 182939 DEBUG nova.scheduler.client.report [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:51:56 compute-0 nova_compute[182935]: 2026-01-21 23:51:56.788 182939 DEBUG oslo_concurrency.lockutils [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:56 compute-0 nova_compute[182935]: 2026-01-21 23:51:56.839 182939 INFO nova.scheduler.client.report [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Deleted allocations for instance 07512c08-85ed-4cd4-8f13-bb1698a30b8c
Jan 21 23:51:56 compute-0 nova_compute[182935]: 2026-01-21 23:51:56.884 182939 DEBUG nova.compute.manager [req-1da55ec3-cb37-4fcc-bc13-34b48dcf90b2 req-b5573b7f-9802-4554-891a-71ac4e0de18a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received event network-vif-deleted-cd6e5db4-4773-4806-9d60-534f8bd105be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:56 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:51:56.925 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:51:56 compute-0 nova_compute[182935]: 2026-01-21 23:51:56.945 182939 DEBUG oslo_concurrency.lockutils [None req-e5645f3b-363f-4e53-9067-f88c03b3af92 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.821s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:57 compute-0 nova_compute[182935]: 2026-01-21 23:51:57.653 182939 DEBUG nova.compute.manager [req-be018a2b-74d1-44ea-99d9-ab7e3ade4946 req-f7b4c6e1-465b-4775-a532-e7914caadf32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:51:57 compute-0 nova_compute[182935]: 2026-01-21 23:51:57.654 182939 DEBUG oslo_concurrency.lockutils [req-be018a2b-74d1-44ea-99d9-ab7e3ade4946 req-f7b4c6e1-465b-4775-a532-e7914caadf32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:57 compute-0 nova_compute[182935]: 2026-01-21 23:51:57.654 182939 DEBUG oslo_concurrency.lockutils [req-be018a2b-74d1-44ea-99d9-ab7e3ade4946 req-f7b4c6e1-465b-4775-a532-e7914caadf32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:57 compute-0 nova_compute[182935]: 2026-01-21 23:51:57.654 182939 DEBUG oslo_concurrency.lockutils [req-be018a2b-74d1-44ea-99d9-ab7e3ade4946 req-f7b4c6e1-465b-4775-a532-e7914caadf32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07512c08-85ed-4cd4-8f13-bb1698a30b8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:57 compute-0 nova_compute[182935]: 2026-01-21 23:51:57.655 182939 DEBUG nova.compute.manager [req-be018a2b-74d1-44ea-99d9-ab7e3ade4946 req-f7b4c6e1-465b-4775-a532-e7914caadf32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] No waiting events found dispatching network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:51:57 compute-0 nova_compute[182935]: 2026-01-21 23:51:57.655 182939 WARNING nova.compute.manager [req-be018a2b-74d1-44ea-99d9-ab7e3ade4946 req-f7b4c6e1-465b-4775-a532-e7914caadf32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Received unexpected event network-vif-plugged-cd6e5db4-4773-4806-9d60-534f8bd105be for instance with vm_state deleted and task_state None.
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.340 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.341 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.358 182939 DEBUG nova.compute.manager [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.486 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.487 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.495 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.495 182939 INFO nova.compute.claims [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.666 182939 DEBUG nova.compute.provider_tree [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.689 182939 DEBUG nova.scheduler.client.report [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.719 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.719 182939 DEBUG nova.compute.manager [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.810 182939 DEBUG nova.compute.manager [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.811 182939 DEBUG nova.network.neutron [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.835 182939 INFO nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:51:59 compute-0 nova_compute[182935]: 2026-01-21 23:51:59.855 182939 DEBUG nova.compute.manager [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.021 182939 DEBUG nova.compute.manager [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.023 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.023 182939 INFO nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Creating image(s)
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.024 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.024 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.025 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.045 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.110 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.111 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.113 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.124 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.150 182939 DEBUG nova.policy [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.199 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.200 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.254 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.255 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.255 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.312 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.315 182939 DEBUG nova.virt.disk.api [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Checking if we can resize image /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.316 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.407 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.409 182939 DEBUG nova.virt.disk.api [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Cannot resize image /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.410 182939 DEBUG nova.objects.instance [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'migration_context' on Instance uuid 3b0c2900-8225-474d-ba1c-da5edb1a0058 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.436 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.437 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Ensure instance console log exists: /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.437 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.438 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.439 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.560 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:00 compute-0 nova_compute[182935]: 2026-01-21 23:52:00.614 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:01 compute-0 nova_compute[182935]: 2026-01-21 23:52:01.615 182939 DEBUG nova.network.neutron [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Successfully created port: 9b2b9286-20f4-4015-8abc-720cb546283c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:52:01 compute-0 podman[218227]: 2026-01-21 23:52:01.727460054 +0000 UTC m=+0.082476481 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 21 23:52:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:03.186 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:03.187 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:03.187 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:04 compute-0 nova_compute[182935]: 2026-01-21 23:52:04.203 182939 DEBUG nova.network.neutron [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Successfully updated port: 9b2b9286-20f4-4015-8abc-720cb546283c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:52:04 compute-0 nova_compute[182935]: 2026-01-21 23:52:04.219 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "refresh_cache-3b0c2900-8225-474d-ba1c-da5edb1a0058" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:52:04 compute-0 nova_compute[182935]: 2026-01-21 23:52:04.219 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquired lock "refresh_cache-3b0c2900-8225-474d-ba1c-da5edb1a0058" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:52:04 compute-0 nova_compute[182935]: 2026-01-21 23:52:04.220 182939 DEBUG nova.network.neutron [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:52:04 compute-0 nova_compute[182935]: 2026-01-21 23:52:04.520 182939 DEBUG nova.network.neutron [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:52:04 compute-0 nova_compute[182935]: 2026-01-21 23:52:04.712 182939 DEBUG nova.compute.manager [req-577c0130-7f2c-4b7a-8dc5-38c933d9851c req-014eb76b-1578-4a8f-81cf-03b05e3b1806 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received event network-changed-9b2b9286-20f4-4015-8abc-720cb546283c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:04 compute-0 nova_compute[182935]: 2026-01-21 23:52:04.712 182939 DEBUG nova.compute.manager [req-577c0130-7f2c-4b7a-8dc5-38c933d9851c req-014eb76b-1578-4a8f-81cf-03b05e3b1806 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Refreshing instance network info cache due to event network-changed-9b2b9286-20f4-4015-8abc-720cb546283c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:52:04 compute-0 nova_compute[182935]: 2026-01-21 23:52:04.713 182939 DEBUG oslo_concurrency.lockutils [req-577c0130-7f2c-4b7a-8dc5-38c933d9851c req-014eb76b-1578-4a8f-81cf-03b05e3b1806 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3b0c2900-8225-474d-ba1c-da5edb1a0058" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.565 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.616 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.764 182939 DEBUG nova.network.neutron [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Updating instance_info_cache with network_info: [{"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.794 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Releasing lock "refresh_cache-3b0c2900-8225-474d-ba1c-da5edb1a0058" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.795 182939 DEBUG nova.compute.manager [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Instance network_info: |[{"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.796 182939 DEBUG oslo_concurrency.lockutils [req-577c0130-7f2c-4b7a-8dc5-38c933d9851c req-014eb76b-1578-4a8f-81cf-03b05e3b1806 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3b0c2900-8225-474d-ba1c-da5edb1a0058" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.797 182939 DEBUG nova.network.neutron [req-577c0130-7f2c-4b7a-8dc5-38c933d9851c req-014eb76b-1578-4a8f-81cf-03b05e3b1806 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Refreshing network info cache for port 9b2b9286-20f4-4015-8abc-720cb546283c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.803 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Start _get_guest_xml network_info=[{"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.813 182939 WARNING nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.820 182939 DEBUG nova.virt.libvirt.host [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.821 182939 DEBUG nova.virt.libvirt.host [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.825 182939 DEBUG nova.virt.libvirt.host [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.826 182939 DEBUG nova.virt.libvirt.host [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.827 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.828 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.828 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.828 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.828 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.829 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.829 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.829 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.829 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.829 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.830 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.830 182939 DEBUG nova.virt.hardware [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.835 182939 DEBUG nova.virt.libvirt.vif [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:51:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1106281103',display_name='tempest-ServerDiskConfigTestJSON-server-1106281103',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1106281103',id=47,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-t9b97jsk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:51:59Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=3b0c2900-8225-474d-ba1c-da5edb1a0058,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.835 182939 DEBUG nova.network.os_vif_util [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.836 182939 DEBUG nova.network.os_vif_util [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.838 182939 DEBUG nova.objects.instance [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b0c2900-8225-474d-ba1c-da5edb1a0058 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.855 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:52:05 compute-0 nova_compute[182935]:   <uuid>3b0c2900-8225-474d-ba1c-da5edb1a0058</uuid>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   <name>instance-0000002f</name>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-1106281103</nova:name>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:52:05</nova:creationTime>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:52:05 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:52:05 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:52:05 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:52:05 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:52:05 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:52:05 compute-0 nova_compute[182935]:         <nova:user uuid="a7fb6bdd938b4fcdb749b0bc4f86f97e">tempest-ServerDiskConfigTestJSON-1417790226-project-member</nova:user>
Jan 21 23:52:05 compute-0 nova_compute[182935]:         <nova:project uuid="c09a5cf201e249f69f57cd4a632d1e2b">tempest-ServerDiskConfigTestJSON-1417790226</nova:project>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:52:05 compute-0 nova_compute[182935]:         <nova:port uuid="9b2b9286-20f4-4015-8abc-720cb546283c">
Jan 21 23:52:05 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <system>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <entry name="serial">3b0c2900-8225-474d-ba1c-da5edb1a0058</entry>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <entry name="uuid">3b0c2900-8225-474d-ba1c-da5edb1a0058</entry>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     </system>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   <os>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   </os>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   <features>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   </features>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.config"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:28:8d:f1"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <target dev="tap9b2b9286-20"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/console.log" append="off"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <video>
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     </video>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:52:05 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:52:05 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:52:05 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:52:05 compute-0 nova_compute[182935]: </domain>
Jan 21 23:52:05 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.857 182939 DEBUG nova.compute.manager [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Preparing to wait for external event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.858 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.858 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.858 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.860 182939 DEBUG nova.virt.libvirt.vif [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:51:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1106281103',display_name='tempest-ServerDiskConfigTestJSON-server-1106281103',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1106281103',id=47,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-t9b97jsk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:51:59Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=3b0c2900-8225-474d-ba1c-da5edb1a0058,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.860 182939 DEBUG nova.network.os_vif_util [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.861 182939 DEBUG nova.network.os_vif_util [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.862 182939 DEBUG os_vif [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.863 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.864 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.864 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.868 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.868 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b2b9286-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.868 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9b2b9286-20, col_values=(('external_ids', {'iface-id': '9b2b9286-20f4-4015-8abc-720cb546283c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:8d:f1', 'vm-uuid': '3b0c2900-8225-474d-ba1c-da5edb1a0058'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:05 compute-0 NetworkManager[55139]: <info>  [1769039525.8710] manager: (tap9b2b9286-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.873 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.879 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.880 182939 INFO os_vif [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20')
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.946 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.947 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.948 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No VIF found with MAC fa:16:3e:28:8d:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:52:05 compute-0 nova_compute[182935]: 2026-01-21 23:52:05.948 182939 INFO nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Using config drive
Jan 21 23:52:06 compute-0 nova_compute[182935]: 2026-01-21 23:52:06.573 182939 INFO nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Creating config drive at /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.config
Jan 21 23:52:06 compute-0 nova_compute[182935]: 2026-01-21 23:52:06.578 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsueerl6l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:06 compute-0 nova_compute[182935]: 2026-01-21 23:52:06.708 182939 DEBUG oslo_concurrency.processutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsueerl6l" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:06 compute-0 kernel: tap9b2b9286-20: entered promiscuous mode
Jan 21 23:52:06 compute-0 NetworkManager[55139]: <info>  [1769039526.8069] manager: (tap9b2b9286-20): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Jan 21 23:52:06 compute-0 nova_compute[182935]: 2026-01-21 23:52:06.808 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:06 compute-0 ovn_controller[95047]: 2026-01-21T23:52:06Z|00182|binding|INFO|Claiming lport 9b2b9286-20f4-4015-8abc-720cb546283c for this chassis.
Jan 21 23:52:06 compute-0 ovn_controller[95047]: 2026-01-21T23:52:06Z|00183|binding|INFO|9b2b9286-20f4-4015-8abc-720cb546283c: Claiming fa:16:3e:28:8d:f1 10.100.0.6
Jan 21 23:52:06 compute-0 nova_compute[182935]: 2026-01-21 23:52:06.811 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.820 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:8d:f1 10.100.0.6'], port_security=['fa:16:3e:28:8d:f1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=9b2b9286-20f4-4015-8abc-720cb546283c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:52:06 compute-0 ovn_controller[95047]: 2026-01-21T23:52:06Z|00184|binding|INFO|Setting lport 9b2b9286-20f4-4015-8abc-720cb546283c ovn-installed in OVS
Jan 21 23:52:06 compute-0 ovn_controller[95047]: 2026-01-21T23:52:06Z|00185|binding|INFO|Setting lport 9b2b9286-20f4-4015-8abc-720cb546283c up in Southbound
Jan 21 23:52:06 compute-0 nova_compute[182935]: 2026-01-21 23:52:06.824 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.822 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 9b2b9286-20f4-4015-8abc-720cb546283c in datapath 7b586c54-3322-410f-9bc9-972a63b8deff bound to our chassis
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.825 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:52:06 compute-0 nova_compute[182935]: 2026-01-21 23:52:06.828 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:06 compute-0 systemd-udevd[218279]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.882 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4f308e90-6b71-45ba-8d39-8257551fa894]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.884 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b586c54-31 in ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.886 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b586c54-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.886 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[05eb2cc2-2d8e-47c4-a1c4-01cc3a79b1d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.887 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9a90c282-7bf7-45a0-a753-f6b012ce9afb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:06 compute-0 NetworkManager[55139]: <info>  [1769039526.8940] device (tap9b2b9286-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:52:06 compute-0 NetworkManager[55139]: <info>  [1769039526.8947] device (tap9b2b9286-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:52:06 compute-0 systemd-machined[154182]: New machine qemu-27-instance-0000002f.
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.909 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[03ad7d93-be01-4284-a46a-737e015e9ef4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:06 compute-0 systemd[1]: Started Virtual Machine qemu-27-instance-0000002f.
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.928 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[057872c0-55fd-448a-96f7-37f0ea7f4283]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:06 compute-0 podman[218264]: 2026-01-21 23:52:06.960858234 +0000 UTC m=+0.149264747 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.968 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c1294012-34af-45f0-8bb0-8c043a0603f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:06 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:06.975 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3fbd545e-f3ba-4778-8beb-c9df0500881c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:06 compute-0 NetworkManager[55139]: <info>  [1769039526.9765] manager: (tap7b586c54-30): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.023 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[1db9f2d3-4c9a-4c7b-9ce5-d07b8bab2904]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.026 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d222e125-99b9-4a38-8a01-bd595ca70db8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:07 compute-0 NetworkManager[55139]: <info>  [1769039527.0573] device (tap7b586c54-30): carrier: link connected
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.066 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[496cf375-2283-45f9-b2bd-3fe7e1704978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.090 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d5166bea-4225-4125-933f-d140c722eb5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402979, 'reachable_time': 17946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218321, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.113 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[aee8b64d-7c23-4e3a-b408-f25d3aa3e277]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:a9f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402979, 'tstamp': 402979}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218322, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.136 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[503c2730-38b9-4353-9905-4f1a33c4fa89]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402979, 'reachable_time': 17946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218323, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.188 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[03b8131b-9c5e-43e0-aec4-75bb8d7c8ec4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.241 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[af30c8c2-88a8-4c11-b17a-83ab3a052e9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.244 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.244 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.244 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b586c54-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:07 compute-0 NetworkManager[55139]: <info>  [1769039527.2473] manager: (tap7b586c54-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Jan 21 23:52:07 compute-0 kernel: tap7b586c54-30: entered promiscuous mode
Jan 21 23:52:07 compute-0 nova_compute[182935]: 2026-01-21 23:52:07.248 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.250 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b586c54-30, col_values=(('external_ids', {'iface-id': '52e5d5d5-be78-49fa-86d7-24ac4adf40c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:07 compute-0 nova_compute[182935]: 2026-01-21 23:52:07.251 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:07 compute-0 ovn_controller[95047]: 2026-01-21T23:52:07Z|00186|binding|INFO|Releasing lport 52e5d5d5-be78-49fa-86d7-24ac4adf40c1 from this chassis (sb_readonly=0)
Jan 21 23:52:07 compute-0 nova_compute[182935]: 2026-01-21 23:52:07.252 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.252 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.254 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1c5990d9-ed8d-412b-90fb-f8f96f69f2eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.255 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:52:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:07.256 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'env', 'PROCESS_TAG=haproxy-7b586c54-3322-410f-9bc9-972a63b8deff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b586c54-3322-410f-9bc9-972a63b8deff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:52:07 compute-0 nova_compute[182935]: 2026-01-21 23:52:07.263 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:07 compute-0 podman[218355]: 2026-01-21 23:52:07.666217257 +0000 UTC m=+0.073689257 container create 8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 23:52:07 compute-0 systemd[1]: Started libpod-conmon-8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d.scope.
Jan 21 23:52:07 compute-0 podman[218355]: 2026-01-21 23:52:07.630070315 +0000 UTC m=+0.037542335 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:52:07 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:52:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57a42220ef59ebbab55e30870c3365e673d975139e3705aaddf64d4157a58618/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:52:07 compute-0 podman[218355]: 2026-01-21 23:52:07.758631508 +0000 UTC m=+0.166103528 container init 8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:52:07 compute-0 podman[218355]: 2026-01-21 23:52:07.764890494 +0000 UTC m=+0.172362494 container start 8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:52:07 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218370]: [NOTICE]   (218374) : New worker (218376) forked
Jan 21 23:52:07 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218370]: [NOTICE]   (218374) : Loading success.
Jan 21 23:52:07 compute-0 nova_compute[182935]: 2026-01-21 23:52:07.988 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039527.987894, 3b0c2900-8225-474d-ba1c-da5edb1a0058 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:52:07 compute-0 nova_compute[182935]: 2026-01-21 23:52:07.989 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] VM Started (Lifecycle Event)
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.023 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.030 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039527.9935465, 3b0c2900-8225-474d-ba1c-da5edb1a0058 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.030 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] VM Paused (Lifecycle Event)
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.093 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.101 182939 DEBUG nova.compute.manager [req-104c72cf-8c0e-45bc-9462-bb6022cc866c req-4874051d-07de-4b02-ba0f-c4648da53c83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.101 182939 DEBUG oslo_concurrency.lockutils [req-104c72cf-8c0e-45bc-9462-bb6022cc866c req-4874051d-07de-4b02-ba0f-c4648da53c83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.102 182939 DEBUG oslo_concurrency.lockutils [req-104c72cf-8c0e-45bc-9462-bb6022cc866c req-4874051d-07de-4b02-ba0f-c4648da53c83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.102 182939 DEBUG oslo_concurrency.lockutils [req-104c72cf-8c0e-45bc-9462-bb6022cc866c req-4874051d-07de-4b02-ba0f-c4648da53c83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.103 182939 DEBUG nova.compute.manager [req-104c72cf-8c0e-45bc-9462-bb6022cc866c req-4874051d-07de-4b02-ba0f-c4648da53c83 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Processing event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.104 182939 DEBUG nova.compute.manager [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.105 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.110 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.115 182939 INFO nova.virt.libvirt.driver [-] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Instance spawned successfully.
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.115 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.134 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.135 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039528.1098623, 3b0c2900-8225-474d-ba1c-da5edb1a0058 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.135 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] VM Resumed (Lifecycle Event)
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.147 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.148 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.149 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.149 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.149 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.150 182939 DEBUG nova.virt.libvirt.driver [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.161 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.164 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.193 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.246 182939 INFO nova.compute.manager [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Took 8.22 seconds to spawn the instance on the hypervisor.
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.246 182939 DEBUG nova.compute.manager [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.380 182939 INFO nova.compute.manager [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Took 8.94 seconds to build instance.
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.412 182939 DEBUG oslo_concurrency.lockutils [None req-476cd8b9-d0ab-4abf-b8c8-3fabf50c6eb3 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.543 182939 DEBUG nova.network.neutron [req-577c0130-7f2c-4b7a-8dc5-38c933d9851c req-014eb76b-1578-4a8f-81cf-03b05e3b1806 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Updated VIF entry in instance network info cache for port 9b2b9286-20f4-4015-8abc-720cb546283c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.544 182939 DEBUG nova.network.neutron [req-577c0130-7f2c-4b7a-8dc5-38c933d9851c req-014eb76b-1578-4a8f-81cf-03b05e3b1806 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Updating instance_info_cache with network_info: [{"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:52:08 compute-0 nova_compute[182935]: 2026-01-21 23:52:08.568 182939 DEBUG oslo_concurrency.lockutils [req-577c0130-7f2c-4b7a-8dc5-38c933d9851c req-014eb76b-1578-4a8f-81cf-03b05e3b1806 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3b0c2900-8225-474d-ba1c-da5edb1a0058" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:52:10 compute-0 nova_compute[182935]: 2026-01-21 23:52:10.439 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039515.4374063, 07512c08-85ed-4cd4-8f13-bb1698a30b8c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:52:10 compute-0 nova_compute[182935]: 2026-01-21 23:52:10.439 182939 INFO nova.compute.manager [-] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] VM Stopped (Lifecycle Event)
Jan 21 23:52:10 compute-0 nova_compute[182935]: 2026-01-21 23:52:10.452 182939 DEBUG nova.compute.manager [req-22a2214e-2d8f-4aa1-a4c6-ac3fe95ed269 req-7b6cb518-b5e0-4d37-8111-8165ac968fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:10 compute-0 nova_compute[182935]: 2026-01-21 23:52:10.453 182939 DEBUG oslo_concurrency.lockutils [req-22a2214e-2d8f-4aa1-a4c6-ac3fe95ed269 req-7b6cb518-b5e0-4d37-8111-8165ac968fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:10 compute-0 nova_compute[182935]: 2026-01-21 23:52:10.453 182939 DEBUG oslo_concurrency.lockutils [req-22a2214e-2d8f-4aa1-a4c6-ac3fe95ed269 req-7b6cb518-b5e0-4d37-8111-8165ac968fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:10 compute-0 nova_compute[182935]: 2026-01-21 23:52:10.453 182939 DEBUG oslo_concurrency.lockutils [req-22a2214e-2d8f-4aa1-a4c6-ac3fe95ed269 req-7b6cb518-b5e0-4d37-8111-8165ac968fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:10 compute-0 nova_compute[182935]: 2026-01-21 23:52:10.454 182939 DEBUG nova.compute.manager [req-22a2214e-2d8f-4aa1-a4c6-ac3fe95ed269 req-7b6cb518-b5e0-4d37-8111-8165ac968fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] No waiting events found dispatching network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:52:10 compute-0 nova_compute[182935]: 2026-01-21 23:52:10.454 182939 WARNING nova.compute.manager [req-22a2214e-2d8f-4aa1-a4c6-ac3fe95ed269 req-7b6cb518-b5e0-4d37-8111-8165ac968fd9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received unexpected event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c for instance with vm_state active and task_state None.
Jan 21 23:52:10 compute-0 nova_compute[182935]: 2026-01-21 23:52:10.469 182939 DEBUG nova.compute.manager [None req-fd45e824-2df1-40cb-aec3-2d20dd480d70 - - - - - -] [instance: 07512c08-85ed-4cd4-8f13-bb1698a30b8c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:10 compute-0 nova_compute[182935]: 2026-01-21 23:52:10.620 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:10 compute-0 nova_compute[182935]: 2026-01-21 23:52:10.871 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:14 compute-0 podman[218392]: 2026-01-21 23:52:14.703581494 +0000 UTC m=+0.076485392 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, build-date=2025-08-20T13:12:41, architecture=x86_64, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Jan 21 23:52:14 compute-0 podman[218393]: 2026-01-21 23:52:14.708192485 +0000 UTC m=+0.077635158 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 21 23:52:14 compute-0 nova_compute[182935]: 2026-01-21 23:52:14.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:15 compute-0 nova_compute[182935]: 2026-01-21 23:52:15.623 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:15 compute-0 nova_compute[182935]: 2026-01-21 23:52:15.874 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:16 compute-0 nova_compute[182935]: 2026-01-21 23:52:16.543 182939 INFO nova.compute.manager [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Rebuilding instance
Jan 21 23:52:16 compute-0 sshd-session[218434]: Invalid user weblogic from 188.166.69.60 port 43132
Jan 21 23:52:17 compute-0 sshd-session[218434]: Connection closed by invalid user weblogic 188.166.69.60 port 43132 [preauth]
Jan 21 23:52:17 compute-0 nova_compute[182935]: 2026-01-21 23:52:17.235 182939 DEBUG nova.compute.manager [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:17 compute-0 nova_compute[182935]: 2026-01-21 23:52:17.403 182939 DEBUG nova.objects.instance [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_requests' on Instance uuid 3b0c2900-8225-474d-ba1c-da5edb1a0058 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:17 compute-0 nova_compute[182935]: 2026-01-21 23:52:17.568 182939 DEBUG nova.objects.instance [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b0c2900-8225-474d-ba1c-da5edb1a0058 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:20 compute-0 nova_compute[182935]: 2026-01-21 23:52:20.474 182939 DEBUG nova.objects.instance [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'resources' on Instance uuid 3b0c2900-8225-474d-ba1c-da5edb1a0058 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:20 compute-0 nova_compute[182935]: 2026-01-21 23:52:20.508 182939 DEBUG nova.objects.instance [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'migration_context' on Instance uuid 3b0c2900-8225-474d-ba1c-da5edb1a0058 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:20 compute-0 nova_compute[182935]: 2026-01-21 23:52:20.525 182939 DEBUG nova.objects.instance [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:52:20 compute-0 nova_compute[182935]: 2026-01-21 23:52:20.531 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:52:20 compute-0 nova_compute[182935]: 2026-01-21 23:52:20.625 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:20 compute-0 nova_compute[182935]: 2026-01-21 23:52:20.876 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:21 compute-0 ovn_controller[95047]: 2026-01-21T23:52:21Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:28:8d:f1 10.100.0.6
Jan 21 23:52:21 compute-0 ovn_controller[95047]: 2026-01-21T23:52:21Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:28:8d:f1 10.100.0.6
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.304 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'hostId': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.309 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.313 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3b0c2900-8225-474d-ba1c-da5edb1a0058 / tap9b2b9286-20 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.313 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '589650a9-9651-41ab-a372-443b1cc946b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-0000002f-3b0c2900-8225-474d-ba1c-da5edb1a0058-tap9b2b9286-20', 'timestamp': '2026-01-21T23:52:23.309423', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'tap9b2b9286-20', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:8d:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b2b9286-20'}, 'message_id': '3b3bcb28-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.103851526, 'message_signature': '824acb788b4f18c2cda7d8b905d7085738575a7a7c1834ba2afd24fb5fbe9c68'}]}, 'timestamp': '2026-01-21 23:52:23.315233', '_unique_id': 'e368a9118da2475cb4b7cee91a3b0fc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.323 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '074176e3-ef65-40cf-b9d9-2640aa0c810c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-0000002f-3b0c2900-8225-474d-ba1c-da5edb1a0058-tap9b2b9286-20', 'timestamp': '2026-01-21T23:52:23.323183', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'tap9b2b9286-20', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:8d:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b2b9286-20'}, 'message_id': '3b3d317a-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.103851526, 'message_signature': '6ca77e6bda3d871ad3f23f99665c182ac7f191a8030c3b0ac9f06a82700781be'}]}, 'timestamp': '2026-01-21 23:52:23.323733', '_unique_id': '13f542d91d034971bcb9ee6e40bb00a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.325 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.326 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.326 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc6a0c17-be45-4e5f-9dea-954402657f7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-0000002f-3b0c2900-8225-474d-ba1c-da5edb1a0058-tap9b2b9286-20', 'timestamp': '2026-01-21T23:52:23.326789', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'tap9b2b9286-20', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:8d:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b2b9286-20'}, 'message_id': '3b3dbfa0-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.103851526, 'message_signature': '4134d3fde86acb2305cbcd9af1019c89e61c48fe0e336f7a7285d4c1ad74bb29'}]}, 'timestamp': '2026-01-21 23:52:23.327421', '_unique_id': '7e1fc654ad03446580e6adc61b7a91b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.328 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.330 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.342 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.343 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2676a95f-d6e3-47d1-a164-50bd056aa8f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-vda', 'timestamp': '2026-01-21T23:52:23.330493', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3b402cc2-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.124839104, 'message_signature': '44a13b42585d0202e0634335f44317afac1bab7f087cd6e7ba17f386a8182f1a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-sda', 'timestamp': '2026-01-21T23:52:23.330493', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3b403fdc-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.124839104, 'message_signature': 'a083131b8d259ac848341866bd9d83d7f5173a5837718f8ca23a8a1c316da8f8'}]}, 'timestamp': '2026-01-21 23:52:23.343880', '_unique_id': '18af3930ac71471bba5eba9c3694bad1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.346 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.371 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.read.requests volume: 1082 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.371 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72cedc5c-b98e-4237-8ff6-aedcb1d62c1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1082, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-vda', 'timestamp': '2026-01-21T23:52:23.346738', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3b448380-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': 'e79ab21e65c5cfc4e0567ecd4cfef180d00a009ca5c04ff3d7a94e4b0690076b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-sda', 'timestamp': '2026-01-21T23:52:23.346738', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3b4498fc-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': '82372a093e3ac060dbfdc9cf948267a0923b383a14f25e9e1db70d507371fe53'}]}, 'timestamp': '2026-01-21 23:52:23.372225', '_unique_id': '47a12bcfb641488e9f9bcd689de3fe18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.373 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.374 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.375 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0a3ea55-aa44-4808-b316-19d42b07daf6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-0000002f-3b0c2900-8225-474d-ba1c-da5edb1a0058-tap9b2b9286-20', 'timestamp': '2026-01-21T23:52:23.375006', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'tap9b2b9286-20', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:8d:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b2b9286-20'}, 'message_id': '3b4518ea-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.103851526, 'message_signature': '334acc335c3b3ca5d5112eb8a6c9f0b99eb947166774f65d8a64268c9b0f88ab'}]}, 'timestamp': '2026-01-21 23:52:23.375553', '_unique_id': '52ebcc0e7abf47b49d070e411dc5147c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.376 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.378 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.378 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1319775-867c-4e44-be6a-148c58270fba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-0000002f-3b0c2900-8225-474d-ba1c-da5edb1a0058-tap9b2b9286-20', 'timestamp': '2026-01-21T23:52:23.378370', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'tap9b2b9286-20', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:8d:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b2b9286-20'}, 'message_id': '3b459f22-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.103851526, 'message_signature': 'd184545f2fa3b03e8059f6d107c0efe7d758f6aa61c3c58ee82be0195545537d'}]}, 'timestamp': '2026-01-21 23:52:23.379022', '_unique_id': 'fcddbe98e2f54a0fb34c4f753bbaeed8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.380 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.381 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.381 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.382 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1106281103>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1106281103>]
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.382 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.382 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.383 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1106281103>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1106281103>]
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.383 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.383 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.384 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49dd4e49-24f2-49e8-a941-7d8184ed1e4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-vda', 'timestamp': '2026-01-21T23:52:23.383581', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3b4668d0-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.124839104, 'message_signature': '2f925a33ba94d80f32c4f4049ad4050ca5db6b1eb190776f0eebbdb8961dd4d0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-sda', 'timestamp': '2026-01-21T23:52:23.383581', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3b467bea-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.124839104, 'message_signature': '32fc294dec774bc6c61246291fde889a22a7f56ae3d604cb37598d5fabde710c'}]}, 'timestamp': '2026-01-21 23:52:23.384658', '_unique_id': '5589d37291d349c1b63696a7f694d544'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.385 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.387 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.387 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/network.outgoing.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0001c65-58a2-435a-9a2b-cd9ecb3d427f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-0000002f-3b0c2900-8225-474d-ba1c-da5edb1a0058-tap9b2b9286-20', 'timestamp': '2026-01-21T23:52:23.387436', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'tap9b2b9286-20', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:8d:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b2b9286-20'}, 'message_id': '3b46fe3a-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.103851526, 'message_signature': 'fce4ba9f98d20d9c82ceac8611e7d51817e80e60fb66387e161731b41a09a14a'}]}, 'timestamp': '2026-01-21 23:52:23.388006', '_unique_id': '238c92427514448aab5ae232fd960ffd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.389 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.390 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.407 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de538212-c51a-4a68-9d5c-9df5837a90bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'timestamp': '2026-01-21T23:52:23.390694', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3b4a0076-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.201238601, 'message_signature': 'fe0cc639b10cda601fe5a8dd1e1263a72224a3718a7f49d583d370fba1fa0449'}]}, 'timestamp': '2026-01-21 23:52:23.407678', '_unique_id': '92ba701f2da848e3967a1afff3c067ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.409 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.410 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.410 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.411 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '726481d8-0f49-4bda-b94b-b8aadeb0edf8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-vda', 'timestamp': '2026-01-21T23:52:23.410705', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3b4a8c76-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.124839104, 'message_signature': '720be9c3e4ea12e50a7334cdd873443c7be703d69fbbf08abde1531d8a3164b5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-sda', 'timestamp': '2026-01-21T23:52:23.410705', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3b4a9e82-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.124839104, 'message_signature': 'f469b2e9f9a8220bab27398643acaadef2d9567268047adce367039016af50b2'}]}, 'timestamp': '2026-01-21 23:52:23.411678', '_unique_id': '7d9dee3838b64750b70ab14f17ca7fc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.415 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.415 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6813a927-097d-4498-b602-23e3c8fb01d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-0000002f-3b0c2900-8225-474d-ba1c-da5edb1a0058-tap9b2b9286-20', 'timestamp': '2026-01-21T23:52:23.415317', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'tap9b2b9286-20', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:8d:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b2b9286-20'}, 'message_id': '3b4b3f40-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.103851526, 'message_signature': 'ce774ba17a3fa17dde2ff3e551642497b74e8f24ca0c35ebed8486e290fad462'}]}, 'timestamp': '2026-01-21 23:52:23.415863', '_unique_id': '39bc1c57079147eaa68acb5cfcdded58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.418 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.418 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.418 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1106281103>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1106281103>]
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.419 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.419 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.419 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1106281103>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-1106281103>]
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.419 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.420 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.write.latency volume: 2887471761 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.420 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2776fe0-1003-4d2a-93b4-399bb51e9b91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2887471761, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-vda', 'timestamp': '2026-01-21T23:52:23.420008', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3b4bf64c-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': 'a792a9eab228247baf161afc3d5f1235d0616a8cf28359eea6013fe854b21261'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-sda', 'timestamp': '2026-01-21T23:52:23.420008', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3b4c07ea-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': '17e866d11f2f979fbbe68c6279fa550dcd4e498357f71336fe36d825a59b3c3a'}]}, 'timestamp': '2026-01-21 23:52:23.420984', '_unique_id': 'cbaa6b67321d4481b7b396194dc5ceab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.423 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.423 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.read.latency volume: 190337811 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.424 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.read.latency volume: 33310492 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8c36549-c11d-4cbd-8de6-4dd688d7ae51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 190337811, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-vda', 'timestamp': '2026-01-21T23:52:23.423685', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3b4c877e-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': 'a1d87db0134aa3141d204ae49fa0b33bf42b1451fb71a1e3bd7919c49b90a193'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33310492, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-sda', 'timestamp': '2026-01-21T23:52:23.423685', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3b4c996c-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': 'c775b3afa5098dfd65763ea7d89ce3e217ebf31eff8ebedf116bef2acc3c2567'}]}, 'timestamp': '2026-01-21 23:52:23.424659', '_unique_id': '3740b76e92e0439c9f7e257f9b71718d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.427 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.427 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '733eee7a-0bc7-4ef0-803a-d652b80dca76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-0000002f-3b0c2900-8225-474d-ba1c-da5edb1a0058-tap9b2b9286-20', 'timestamp': '2026-01-21T23:52:23.427465', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'tap9b2b9286-20', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:8d:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b2b9286-20'}, 'message_id': '3b4d19a0-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.103851526, 'message_signature': '7e5ca4685aa67f13f2e56b7abfe469037ecadb102469c9b82ca5dae91c9c9060'}]}, 'timestamp': '2026-01-21 23:52:23.428011', '_unique_id': '0ad4018eebc4497f9a8bb120f3ed74f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.430 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.430 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.write.bytes volume: 73125888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.431 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f16e76e0-df71-433a-a69f-64d0d662fb06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73125888, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-vda', 'timestamp': '2026-01-21T23:52:23.430488', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3b4d8f52-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': '1dd0c55f85c1fcfab18787e6beef5f1a43d4d48cd4fdd735c2df525a3d18612f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-sda', 'timestamp': '2026-01-21T23:52:23.430488', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3b4da2bc-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': '6656e5bc6de3a216f8a892d5360de45c7031e9825384112710046728fed97618'}]}, 'timestamp': '2026-01-21 23:52:23.431449', '_unique_id': 'a9a39a3cad5f4cf19f9ff8aa511f089c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.434 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.438 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.438 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/network.outgoing.bytes volume: 1172 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c87863a-7382-4677-929e-1a87ee3294ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1172, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-0000002f-3b0c2900-8225-474d-ba1c-da5edb1a0058-tap9b2b9286-20', 'timestamp': '2026-01-21T23:52:23.438920', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'tap9b2b9286-20', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:8d:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b2b9286-20'}, 'message_id': '3b4ee14a-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.103851526, 'message_signature': '3b16f38961e2222c70808837bd30247f2dc61d04f1b19e8729e72609e62a7821'}]}, 'timestamp': '2026-01-21 23:52:23.439646', '_unique_id': 'f33691f4119a4481be4d777caa22ef81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.440 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.441 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.441 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0515bfa9-14fd-472a-a9d4-33846ed11ee2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': 'instance-0000002f-3b0c2900-8225-474d-ba1c-da5edb1a0058-tap9b2b9286-20', 'timestamp': '2026-01-21T23:52:23.441939', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'tap9b2b9286-20', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:28:8d:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9b2b9286-20'}, 'message_id': '3b4f4bd0-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.103851526, 'message_signature': 'c520b3bd0881a9209747322083fe3ede4e42305ea67a62ccbe6ad9cde7873547'}]}, 'timestamp': '2026-01-21 23:52:23.442403', '_unique_id': '7d3bb9e319df48169405194a78afcaeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.443 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.444 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.444 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/cpu volume: 12050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 nova_compute[182935]: 2026-01-21 23:52:23.445 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:23 compute-0 nova_compute[182935]: 2026-01-21 23:52:23.445 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:52:23 compute-0 nova_compute[182935]: 2026-01-21 23:52:23.446 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83bb3d84-40ed-4fa3-8e1d-3995a0fd811f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12050000000, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'timestamp': '2026-01-21T23:52:23.444309', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3b4fad14-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.201238601, 'message_signature': 'e7fca0b56150f954ad7955546f1e6aa664392d9b99fcf690c607ed56cb20d4b6'}]}, 'timestamp': '2026-01-21 23:52:23.444855', '_unique_id': 'c5e24d9d758d43e9aea69b294ef14712'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.445 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.446 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.446 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.write.requests volume: 352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.447 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3667303d-fd5e-493e-937e-da2dd4190a80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 352, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-vda', 'timestamp': '2026-01-21T23:52:23.446787', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3b500d40-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': '9c97808028248caae25473dbb2c775edcc6b4daaf0fb621ba4d415a51ebb0d2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-sda', 'timestamp': '2026-01-21T23:52:23.446787', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3b501d3a-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': '8a4938f4ecdc77ec7b10fa450a4b5f8a3dff1f2f2ca3665eb50acca8f3bfd02c'}]}, 'timestamp': '2026-01-21 23:52:23.447651', '_unique_id': 'de773a3b3b3d4bfb9fb2ca8e410d428f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.448 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.449 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.449 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.read.bytes volume: 29948416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.449 12 DEBUG ceilometer.compute.pollsters [-] 3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bb54c39-0a2e-4d65-af17-2e4d29e23038', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29948416, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-vda', 'timestamp': '2026-01-21T23:52:23.449467', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3b50737a-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': 'a006085e858db5d1e25d45beee23fbfaf75b878f728f3dc80ee8d617b948bde0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a7fb6bdd938b4fcdb749b0bc4f86f97e', 'user_name': None, 'project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'project_name': None, 'resource_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058-sda', 'timestamp': '2026-01-21T23:52:23.449467', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-1106281103', 'name': 'instance-0000002f', 'instance_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'instance_type': 'm1.nano', 'host': '9767a289f20cb082e0e4c92a7386d38f6342789c5467e672af527e2b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3b5084be-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4046.141177508, 'message_signature': 'a0281c30445aec673de4b4bfb3c9512f3312be032625b434d5e5cb237b071afd'}]}, 'timestamp': '2026-01-21 23:52:23.450286', '_unique_id': '6957d5d3dee14f849a51214793854a00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:52:23.451 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:52:23 compute-0 nova_compute[182935]: 2026-01-21 23:52:23.464 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-3b0c2900-8225-474d-ba1c-da5edb1a0058" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:52:23 compute-0 nova_compute[182935]: 2026-01-21 23:52:23.465 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-3b0c2900-8225-474d-ba1c-da5edb1a0058" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:52:23 compute-0 nova_compute[182935]: 2026-01-21 23:52:23.465 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:52:23 compute-0 nova_compute[182935]: 2026-01-21 23:52:23.465 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3b0c2900-8225-474d-ba1c-da5edb1a0058 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:23 compute-0 kernel: tap9b2b9286-20 (unregistering): left promiscuous mode
Jan 21 23:52:23 compute-0 NetworkManager[55139]: <info>  [1769039543.8559] device (tap9b2b9286-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:52:23 compute-0 ovn_controller[95047]: 2026-01-21T23:52:23Z|00187|binding|INFO|Releasing lport 9b2b9286-20f4-4015-8abc-720cb546283c from this chassis (sb_readonly=0)
Jan 21 23:52:23 compute-0 nova_compute[182935]: 2026-01-21 23:52:23.870 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:23 compute-0 ovn_controller[95047]: 2026-01-21T23:52:23Z|00188|binding|INFO|Setting lport 9b2b9286-20f4-4015-8abc-720cb546283c down in Southbound
Jan 21 23:52:23 compute-0 ovn_controller[95047]: 2026-01-21T23:52:23Z|00189|binding|INFO|Removing iface tap9b2b9286-20 ovn-installed in OVS
Jan 21 23:52:23 compute-0 nova_compute[182935]: 2026-01-21 23:52:23.873 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:23.880 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:8d:f1 10.100.0.6'], port_security=['fa:16:3e:28:8d:f1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=9b2b9286-20f4-4015-8abc-720cb546283c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:52:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:23.882 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 9b2b9286-20f4-4015-8abc-720cb546283c in datapath 7b586c54-3322-410f-9bc9-972a63b8deff unbound from our chassis
Jan 21 23:52:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:23.883 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b586c54-3322-410f-9bc9-972a63b8deff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:52:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:23.885 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b47425e0-4b13-44ec-a6a6-ff2d7275c66b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:23.886 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace which is not needed anymore
Jan 21 23:52:23 compute-0 nova_compute[182935]: 2026-01-21 23:52:23.887 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:23 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Jan 21 23:52:23 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d0000002f.scope: Consumed 14.324s CPU time.
Jan 21 23:52:23 compute-0 systemd-machined[154182]: Machine qemu-27-instance-0000002f terminated.
Jan 21 23:52:24 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218370]: [NOTICE]   (218374) : haproxy version is 2.8.14-c23fe91
Jan 21 23:52:24 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218370]: [NOTICE]   (218374) : path to executable is /usr/sbin/haproxy
Jan 21 23:52:24 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218370]: [WARNING]  (218374) : Exiting Master process...
Jan 21 23:52:24 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218370]: [WARNING]  (218374) : Exiting Master process...
Jan 21 23:52:24 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218370]: [ALERT]    (218374) : Current worker (218376) exited with code 143 (Terminated)
Jan 21 23:52:24 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218370]: [WARNING]  (218374) : All workers exited. Exiting... (0)
Jan 21 23:52:24 compute-0 systemd[1]: libpod-8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d.scope: Deactivated successfully.
Jan 21 23:52:24 compute-0 podman[218482]: 2026-01-21 23:52:24.042179864 +0000 UTC m=+0.044552419 container died 8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 23:52:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d-userdata-shm.mount: Deactivated successfully.
Jan 21 23:52:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-57a42220ef59ebbab55e30870c3365e673d975139e3705aaddf64d4157a58618-merged.mount: Deactivated successfully.
Jan 21 23:52:24 compute-0 podman[218482]: 2026-01-21 23:52:24.091770073 +0000 UTC m=+0.094142618 container cleanup 8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:52:24 compute-0 systemd[1]: libpod-conmon-8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d.scope: Deactivated successfully.
Jan 21 23:52:24 compute-0 podman[218520]: 2026-01-21 23:52:24.153150718 +0000 UTC m=+0.039489166 container remove 8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 23:52:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:24.159 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c9cfc835-ee6c-47c3-806b-912bfd39a121]: (4, ('Wed Jan 21 11:52:23 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d)\n8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d\nWed Jan 21 11:52:24 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d)\n8a183646165d8048b00270b055636402260357ad9336b2cbeedf5e84f714b60d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:24.162 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[655246f0-f37e-49a3-abb7-3122f4dc265a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:24.163 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.165 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:24 compute-0 kernel: tap7b586c54-30: left promiscuous mode
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.176 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.180 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:24.184 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5d126c29-3da0-4c76-a6ec-a727c4fce11e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:24.199 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5810c9-38b8-4313-ba23-a45fdc68a58f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:24.200 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a1e79d31-15c9-409f-9932-4d2824808c14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:24.216 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[246dabac-2d41-4206-94ee-3beaff7955b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402969, 'reachable_time': 38687, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218546, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:24.218 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:52:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:24.218 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[da83021b-6950-41ff-854a-da188db550b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:24 compute-0 systemd[1]: run-netns-ovnmeta\x2d7b586c54\x2d3322\x2d410f\x2d9bc9\x2d972a63b8deff.mount: Deactivated successfully.
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.556 182939 INFO nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Instance shutdown successfully after 4 seconds.
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.562 182939 INFO nova.virt.libvirt.driver [-] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Instance destroyed successfully.
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.566 182939 INFO nova.virt.libvirt.driver [-] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Instance destroyed successfully.
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.567 182939 DEBUG nova.virt.libvirt.vif [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:51:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1106281103',display_name='tempest-ServerDiskConfigTestJSON-server-1106281103',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1106281103',id=47,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:52:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-t9b97jsk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-m
ember'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:52:15Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=3b0c2900-8225-474d-ba1c-da5edb1a0058,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.568 182939 DEBUG nova.network.os_vif_util [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.568 182939 DEBUG nova.network.os_vif_util [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.569 182939 DEBUG os_vif [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.570 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.570 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b2b9286-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.572 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.573 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.576 182939 INFO os_vif [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20')
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.577 182939 INFO nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Deleting instance files /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058_del
Jan 21 23:52:24 compute-0 nova_compute[182935]: 2026-01-21 23:52:24.578 182939 INFO nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Deletion of /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058_del complete
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.626 182939 DEBUG nova.compute.manager [req-ec7eeea3-2b30-4c71-93e5-b1900c9e326b req-573ed276-4364-4a48-9d9b-653d5e18010e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received event network-vif-unplugged-9b2b9286-20f4-4015-8abc-720cb546283c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.627 182939 DEBUG oslo_concurrency.lockutils [req-ec7eeea3-2b30-4c71-93e5-b1900c9e326b req-573ed276-4364-4a48-9d9b-653d5e18010e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.627 182939 DEBUG oslo_concurrency.lockutils [req-ec7eeea3-2b30-4c71-93e5-b1900c9e326b req-573ed276-4364-4a48-9d9b-653d5e18010e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.627 182939 DEBUG oslo_concurrency.lockutils [req-ec7eeea3-2b30-4c71-93e5-b1900c9e326b req-573ed276-4364-4a48-9d9b-653d5e18010e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.627 182939 DEBUG nova.compute.manager [req-ec7eeea3-2b30-4c71-93e5-b1900c9e326b req-573ed276-4364-4a48-9d9b-653d5e18010e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] No waiting events found dispatching network-vif-unplugged-9b2b9286-20f4-4015-8abc-720cb546283c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.628 182939 WARNING nova.compute.manager [req-ec7eeea3-2b30-4c71-93e5-b1900c9e326b req-573ed276-4364-4a48-9d9b-653d5e18010e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received unexpected event network-vif-unplugged-9b2b9286-20f4-4015-8abc-720cb546283c for instance with vm_state active and task_state rebuilding.
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.671 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:25 compute-0 podman[218548]: 2026-01-21 23:52:25.723947725 +0000 UTC m=+0.053057435 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:52:25 compute-0 podman[218547]: 2026-01-21 23:52:25.766944126 +0000 UTC m=+0.096273341 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.949 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.949 182939 INFO nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Creating image(s)
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.950 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.950 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.951 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:25 compute-0 nova_compute[182935]: 2026-01-21 23:52:25.963 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.041 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.043 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.043 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.067 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.122 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.123 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.162 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.163 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.164 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.224 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.225 182939 DEBUG nova.virt.disk.api [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Checking if we can resize image /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.226 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.282 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.284 182939 DEBUG nova.virt.disk.api [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Cannot resize image /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.285 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.285 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Ensure instance console log exists: /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.286 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.286 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.287 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.290 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Start _get_guest_xml network_info=[{"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.296 182939 WARNING nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.303 182939 DEBUG nova.virt.libvirt.host [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.304 182939 DEBUG nova.virt.libvirt.host [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.310 182939 DEBUG nova.virt.libvirt.host [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.311 182939 DEBUG nova.virt.libvirt.host [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.312 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.313 182939 DEBUG nova.virt.hardware [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.314 182939 DEBUG nova.virt.hardware [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.314 182939 DEBUG nova.virt.hardware [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.315 182939 DEBUG nova.virt.hardware [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.315 182939 DEBUG nova.virt.hardware [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.315 182939 DEBUG nova.virt.hardware [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.316 182939 DEBUG nova.virt.hardware [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.316 182939 DEBUG nova.virt.hardware [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.317 182939 DEBUG nova.virt.hardware [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.317 182939 DEBUG nova.virt.hardware [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.317 182939 DEBUG nova.virt.hardware [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.318 182939 DEBUG nova.objects.instance [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3b0c2900-8225-474d-ba1c-da5edb1a0058 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.342 182939 DEBUG nova.virt.libvirt.vif [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:51:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1106281103',display_name='tempest-ServerDiskConfigTestJSON-server-1106281103',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1106281103',id=47,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:52:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-t9b97jsk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest
-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:52:25Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=3b0c2900-8225-474d-ba1c-da5edb1a0058,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.343 182939 DEBUG nova.network.os_vif_util [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.344 182939 DEBUG nova.network.os_vif_util [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.347 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:52:26 compute-0 nova_compute[182935]:   <uuid>3b0c2900-8225-474d-ba1c-da5edb1a0058</uuid>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   <name>instance-0000002f</name>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-1106281103</nova:name>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:52:26</nova:creationTime>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:52:26 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:52:26 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:52:26 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:52:26 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:52:26 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:52:26 compute-0 nova_compute[182935]:         <nova:user uuid="a7fb6bdd938b4fcdb749b0bc4f86f97e">tempest-ServerDiskConfigTestJSON-1417790226-project-member</nova:user>
Jan 21 23:52:26 compute-0 nova_compute[182935]:         <nova:project uuid="c09a5cf201e249f69f57cd4a632d1e2b">tempest-ServerDiskConfigTestJSON-1417790226</nova:project>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="3e1dda74-3c6a-4d29-8792-32134d1c36c5"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:52:26 compute-0 nova_compute[182935]:         <nova:port uuid="9b2b9286-20f4-4015-8abc-720cb546283c">
Jan 21 23:52:26 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <system>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <entry name="serial">3b0c2900-8225-474d-ba1c-da5edb1a0058</entry>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <entry name="uuid">3b0c2900-8225-474d-ba1c-da5edb1a0058</entry>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     </system>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   <os>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   </os>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   <features>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   </features>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.config"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:28:8d:f1"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <target dev="tap9b2b9286-20"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/console.log" append="off"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <video>
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     </video>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:52:26 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:52:26 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:52:26 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:52:26 compute-0 nova_compute[182935]: </domain>
Jan 21 23:52:26 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.348 182939 DEBUG nova.compute.manager [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Preparing to wait for external event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.349 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.349 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.349 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.350 182939 DEBUG nova.virt.libvirt.vif [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:51:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1106281103',display_name='tempest-ServerDiskConfigTestJSON-server-1106281103',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1106281103',id=47,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:52:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-t9b97jsk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest
-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:52:25Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=3b0c2900-8225-474d-ba1c-da5edb1a0058,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.350 182939 DEBUG nova.network.os_vif_util [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.351 182939 DEBUG nova.network.os_vif_util [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.352 182939 DEBUG os_vif [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.352 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.353 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.353 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.360 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.361 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b2b9286-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.362 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9b2b9286-20, col_values=(('external_ids', {'iface-id': '9b2b9286-20f4-4015-8abc-720cb546283c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:8d:f1', 'vm-uuid': '3b0c2900-8225-474d-ba1c-da5edb1a0058'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.367 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:26 compute-0 NetworkManager[55139]: <info>  [1769039546.3680] manager: (tap9b2b9286-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.371 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.377 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:26 compute-0 nova_compute[182935]: 2026-01-21 23:52:26.379 182939 INFO os_vif [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20')
Jan 21 23:52:27 compute-0 nova_compute[182935]: 2026-01-21 23:52:27.962 182939 DEBUG nova.compute.manager [req-42f68e26-eaba-412d-8fb8-51c66a61a411 req-a770768c-4a60-46c8-a7c6-65cace70f64d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:27 compute-0 nova_compute[182935]: 2026-01-21 23:52:27.963 182939 DEBUG oslo_concurrency.lockutils [req-42f68e26-eaba-412d-8fb8-51c66a61a411 req-a770768c-4a60-46c8-a7c6-65cace70f64d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:27 compute-0 nova_compute[182935]: 2026-01-21 23:52:27.964 182939 DEBUG oslo_concurrency.lockutils [req-42f68e26-eaba-412d-8fb8-51c66a61a411 req-a770768c-4a60-46c8-a7c6-65cace70f64d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:27 compute-0 nova_compute[182935]: 2026-01-21 23:52:27.964 182939 DEBUG oslo_concurrency.lockutils [req-42f68e26-eaba-412d-8fb8-51c66a61a411 req-a770768c-4a60-46c8-a7c6-65cace70f64d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:27 compute-0 nova_compute[182935]: 2026-01-21 23:52:27.965 182939 DEBUG nova.compute.manager [req-42f68e26-eaba-412d-8fb8-51c66a61a411 req-a770768c-4a60-46c8-a7c6-65cace70f64d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Processing event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:52:27 compute-0 nova_compute[182935]: 2026-01-21 23:52:27.980 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:52:27 compute-0 nova_compute[182935]: 2026-01-21 23:52:27.981 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:52:27 compute-0 nova_compute[182935]: 2026-01-21 23:52:27.981 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No VIF found with MAC fa:16:3e:28:8d:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:52:27 compute-0 nova_compute[182935]: 2026-01-21 23:52:27.982 182939 INFO nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Using config drive
Jan 21 23:52:28 compute-0 nova_compute[182935]: 2026-01-21 23:52:28.017 182939 DEBUG nova.objects.instance [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'ec2_ids' on Instance uuid 3b0c2900-8225-474d-ba1c-da5edb1a0058 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:28 compute-0 nova_compute[182935]: 2026-01-21 23:52:28.119 182939 DEBUG nova.objects.instance [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'keypairs' on Instance uuid 3b0c2900-8225-474d-ba1c-da5edb1a0058 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:28 compute-0 nova_compute[182935]: 2026-01-21 23:52:28.745 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Updating instance_info_cache with network_info: [{"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.167 182939 INFO nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Creating config drive at /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.config
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.176 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2cewa547 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.312 182939 DEBUG oslo_concurrency.processutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2cewa547" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:29 compute-0 kernel: tap9b2b9286-20: entered promiscuous mode
Jan 21 23:52:29 compute-0 NetworkManager[55139]: <info>  [1769039549.4077] manager: (tap9b2b9286-20): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Jan 21 23:52:29 compute-0 ovn_controller[95047]: 2026-01-21T23:52:29Z|00190|binding|INFO|Claiming lport 9b2b9286-20f4-4015-8abc-720cb546283c for this chassis.
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.408 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:29 compute-0 ovn_controller[95047]: 2026-01-21T23:52:29Z|00191|binding|INFO|9b2b9286-20f4-4015-8abc-720cb546283c: Claiming fa:16:3e:28:8d:f1 10.100.0.6
Jan 21 23:52:29 compute-0 ovn_controller[95047]: 2026-01-21T23:52:29Z|00192|binding|INFO|Setting lport 9b2b9286-20f4-4015-8abc-720cb546283c ovn-installed in OVS
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.425 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.427 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.438 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:29 compute-0 systemd-udevd[218633]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:52:29 compute-0 systemd-machined[154182]: New machine qemu-28-instance-0000002f.
Jan 21 23:52:29 compute-0 systemd[1]: Started Virtual Machine qemu-28-instance-0000002f.
Jan 21 23:52:29 compute-0 NetworkManager[55139]: <info>  [1769039549.4675] device (tap9b2b9286-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:52:29 compute-0 NetworkManager[55139]: <info>  [1769039549.4694] device (tap9b2b9286-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.737 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for 3b0c2900-8225-474d-ba1c-da5edb1a0058 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.737 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039549.7363145, 3b0c2900-8225-474d-ba1c-da5edb1a0058 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.738 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] VM Started (Lifecycle Event)
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.743 182939 DEBUG nova.compute.manager [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.759 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.765 182939 INFO nova.virt.libvirt.driver [-] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Instance spawned successfully.
Jan 21 23:52:29 compute-0 nova_compute[182935]: 2026-01-21 23:52:29.765 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:52:29 compute-0 ovn_controller[95047]: 2026-01-21T23:52:29Z|00193|binding|INFO|Setting lport 9b2b9286-20f4-4015-8abc-720cb546283c up in Southbound
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.866 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:8d:f1 10.100.0.6'], port_security=['fa:16:3e:28:8d:f1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=9b2b9286-20f4-4015-8abc-720cb546283c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.867 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 9b2b9286-20f4-4015-8abc-720cb546283c in datapath 7b586c54-3322-410f-9bc9-972a63b8deff bound to our chassis
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.868 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.881 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9c317bd0-aa45-4c6d-93d9-fce934996434]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.882 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b586c54-31 in ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.884 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b586c54-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.884 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ce686c-1dfa-4304-bed6-944801322110]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.885 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[451a89df-b641-46c2-8cba-c8b3143059ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.897 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[f7df4c4e-3802-4eb2-bb3f-f92b07fe8263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.925 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4554c49b-86c0-44da-bb4e-83967aa6057d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.956 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[14642e75-e29f-4b41-8921-daf19c0eb8d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:29 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:29.963 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[58e481d9-1b3e-4e90-97cb-b04ce3179ed9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:29 compute-0 NetworkManager[55139]: <info>  [1769039549.9649] manager: (tap7b586c54-30): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.001 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[eb25e3f2-b1c9-46ce-8526-2b721641ed18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.004 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[6156b5f9-3d59-47c8-ad10-3c8b339d1201]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:30 compute-0 NetworkManager[55139]: <info>  [1769039550.0321] device (tap7b586c54-30): carrier: link connected
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.039 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[850ff5b8-0122-4060-887d-6b7609d4cade]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.061 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec0ae39-2c83-4869-a80f-10c0ffb4f0dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 405277, 'reachable_time': 19272, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218674, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.079 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ff99c702-b3c3-4c6b-bd6a-0e6c6d880e3f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:a9f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 405277, 'tstamp': 405277}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218675, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.097 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[123f343e-195e-45f0-9810-1731ad8aa27e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 405277, 'reachable_time': 19272, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218676, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.129 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-3b0c2900-8225-474d-ba1c-da5edb1a0058" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.130 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.130 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.131 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.131 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.131 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.131 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.132 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.132 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.132 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.139 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad67f76-1517-41dd-921a-d306b483ec40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.177 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.178 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.178 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.179 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.179 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.180 182939 DEBUG nova.virt.libvirt.driver [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.187 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.187 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.187 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.188 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.211 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.216 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.216 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cb9ce7a1-e47e-435f-aa9a-c5e206f8d126]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.218 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.219 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.219 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b586c54-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.260 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:30 compute-0 NetworkManager[55139]: <info>  [1769039550.2617] manager: (tap7b586c54-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Jan 21 23:52:30 compute-0 kernel: tap7b586c54-30: entered promiscuous mode
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.265 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.267 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b586c54-30, col_values=(('external_ids', {'iface-id': '52e5d5d5-be78-49fa-86d7-24ac4adf40c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:30 compute-0 ovn_controller[95047]: 2026-01-21T23:52:30Z|00194|binding|INFO|Releasing lport 52e5d5d5-be78-49fa-86d7-24ac4adf40c1 from this chassis (sb_readonly=0)
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.268 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.274 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.274 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039549.736769, 3b0c2900-8225-474d-ba1c-da5edb1a0058 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.274 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] VM Paused (Lifecycle Event)
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.283 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.285 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.287 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8d24ae85-ab70-4529-a037-db4d0cb167cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.288 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:52:30 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:30.289 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'env', 'PROCESS_TAG=haproxy-7b586c54-3322-410f-9bc9-972a63b8deff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b586c54-3322-410f-9bc9-972a63b8deff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.311 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.318 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039549.7469368, 3b0c2900-8225-474d-ba1c-da5edb1a0058 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.319 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] VM Resumed (Lifecycle Event)
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.324 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.348 182939 DEBUG nova.compute.manager [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.349 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.358 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.392 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.393 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.428 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.456 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.627 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.629 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5669MB free_disk=73.27459716796875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.630 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.630 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:30 compute-0 nova_compute[182935]: 2026-01-21 23:52:30.673 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:30 compute-0 podman[218716]: 2026-01-21 23:52:30.745056606 +0000 UTC m=+0.063946889 container create ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:52:30 compute-0 systemd[1]: Started libpod-conmon-ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08.scope.
Jan 21 23:52:30 compute-0 podman[218716]: 2026-01-21 23:52:30.709475325 +0000 UTC m=+0.028365628 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:52:30 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:52:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/955abcf34704b529ebab034d43e14bf2b24e617a87face2f3d6b0c444d72ed6d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:52:30 compute-0 podman[218716]: 2026-01-21 23:52:30.841183951 +0000 UTC m=+0.160074254 container init ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 23:52:30 compute-0 podman[218716]: 2026-01-21 23:52:30.851534861 +0000 UTC m=+0.170425144 container start ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 21 23:52:30 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218731]: [NOTICE]   (218735) : New worker (218737) forked
Jan 21 23:52:30 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218731]: [NOTICE]   (218735) : Loading success.
Jan 21 23:52:31 compute-0 nova_compute[182935]: 2026-01-21 23:52:31.366 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:32 compute-0 nova_compute[182935]: 2026-01-21 23:52:32.678 182939 DEBUG nova.compute.manager [req-a8fe5b6a-6b2a-409b-bc79-fc3282c95a11 req-5486c0ca-c5df-4614-ba3e-8294fd95899c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:32 compute-0 nova_compute[182935]: 2026-01-21 23:52:32.679 182939 DEBUG oslo_concurrency.lockutils [req-a8fe5b6a-6b2a-409b-bc79-fc3282c95a11 req-5486c0ca-c5df-4614-ba3e-8294fd95899c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:32 compute-0 nova_compute[182935]: 2026-01-21 23:52:32.679 182939 DEBUG oslo_concurrency.lockutils [req-a8fe5b6a-6b2a-409b-bc79-fc3282c95a11 req-5486c0ca-c5df-4614-ba3e-8294fd95899c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:32 compute-0 nova_compute[182935]: 2026-01-21 23:52:32.679 182939 DEBUG oslo_concurrency.lockutils [req-a8fe5b6a-6b2a-409b-bc79-fc3282c95a11 req-5486c0ca-c5df-4614-ba3e-8294fd95899c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:32 compute-0 nova_compute[182935]: 2026-01-21 23:52:32.680 182939 DEBUG nova.compute.manager [req-a8fe5b6a-6b2a-409b-bc79-fc3282c95a11 req-5486c0ca-c5df-4614-ba3e-8294fd95899c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] No waiting events found dispatching network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:52:32 compute-0 nova_compute[182935]: 2026-01-21 23:52:32.680 182939 WARNING nova.compute.manager [req-a8fe5b6a-6b2a-409b-bc79-fc3282c95a11 req-5486c0ca-c5df-4614-ba3e-8294fd95899c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received unexpected event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c for instance with vm_state active and task_state None.
Jan 21 23:52:32 compute-0 podman[218746]: 2026-01-21 23:52:32.690027134 +0000 UTC m=+0.059261274 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:52:32 compute-0 nova_compute[182935]: 2026-01-21 23:52:32.749 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.168 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 3b0c2900-8225-474d-ba1c-da5edb1a0058 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.169 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.169 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.500 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.547 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.623 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.624 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.994s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.624 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.625 182939 DEBUG nova.objects.instance [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.627 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.628 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.659 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.660 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.660 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 23:52:33 compute-0 nova_compute[182935]: 2026-01-21 23:52:33.773 182939 DEBUG oslo_concurrency.lockutils [None req-b6b83c46-2b40-4977-b618-2ac25ed26c29 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.173 182939 DEBUG oslo_concurrency.lockutils [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.174 182939 DEBUG oslo_concurrency.lockutils [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.175 182939 DEBUG oslo_concurrency.lockutils [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.175 182939 DEBUG oslo_concurrency.lockutils [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.176 182939 DEBUG oslo_concurrency.lockutils [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.191 182939 INFO nova.compute.manager [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Terminating instance
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.204 182939 DEBUG nova.compute.manager [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:52:34 compute-0 kernel: tap9b2b9286-20 (unregistering): left promiscuous mode
Jan 21 23:52:34 compute-0 NetworkManager[55139]: <info>  [1769039554.2321] device (tap9b2b9286-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.238 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:34 compute-0 ovn_controller[95047]: 2026-01-21T23:52:34Z|00195|binding|INFO|Releasing lport 9b2b9286-20f4-4015-8abc-720cb546283c from this chassis (sb_readonly=0)
Jan 21 23:52:34 compute-0 ovn_controller[95047]: 2026-01-21T23:52:34Z|00196|binding|INFO|Setting lport 9b2b9286-20f4-4015-8abc-720cb546283c down in Southbound
Jan 21 23:52:34 compute-0 ovn_controller[95047]: 2026-01-21T23:52:34Z|00197|binding|INFO|Removing iface tap9b2b9286-20 ovn-installed in OVS
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.241 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.250 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:8d:f1 10.100.0.6'], port_security=['fa:16:3e:28:8d:f1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3b0c2900-8225-474d-ba1c-da5edb1a0058', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=9b2b9286-20f4-4015-8abc-720cb546283c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.252 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 9b2b9286-20f4-4015-8abc-720cb546283c in datapath 7b586c54-3322-410f-9bc9-972a63b8deff unbound from our chassis
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.253 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b586c54-3322-410f-9bc9-972a63b8deff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.255 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.256 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5a62d862-0e76-4e3b-ae45-2fe87c6cf3cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.256 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace which is not needed anymore
Jan 21 23:52:34 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Jan 21 23:52:34 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000002f.scope: Consumed 4.779s CPU time.
Jan 21 23:52:34 compute-0 systemd-machined[154182]: Machine qemu-28-instance-0000002f terminated.
Jan 21 23:52:34 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218731]: [NOTICE]   (218735) : haproxy version is 2.8.14-c23fe91
Jan 21 23:52:34 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218731]: [NOTICE]   (218735) : path to executable is /usr/sbin/haproxy
Jan 21 23:52:34 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218731]: [WARNING]  (218735) : Exiting Master process...
Jan 21 23:52:34 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218731]: [ALERT]    (218735) : Current worker (218737) exited with code 143 (Terminated)
Jan 21 23:52:34 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[218731]: [WARNING]  (218735) : All workers exited. Exiting... (0)
Jan 21 23:52:34 compute-0 systemd[1]: libpod-ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08.scope: Deactivated successfully.
Jan 21 23:52:34 compute-0 podman[218794]: 2026-01-21 23:52:34.401160716 +0000 UTC m=+0.049071428 container died ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:52:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08-userdata-shm.mount: Deactivated successfully.
Jan 21 23:52:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-955abcf34704b529ebab034d43e14bf2b24e617a87face2f3d6b0c444d72ed6d-merged.mount: Deactivated successfully.
Jan 21 23:52:34 compute-0 podman[218794]: 2026-01-21 23:52:34.446390141 +0000 UTC m=+0.094300853 container cleanup ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:52:34 compute-0 systemd[1]: libpod-conmon-ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08.scope: Deactivated successfully.
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.473 182939 INFO nova.virt.libvirt.driver [-] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Instance destroyed successfully.
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.474 182939 DEBUG nova.objects.instance [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'resources' on Instance uuid 3b0c2900-8225-474d-ba1c-da5edb1a0058 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:52:34 compute-0 podman[218838]: 2026-01-21 23:52:34.507006197 +0000 UTC m=+0.039342402 container remove ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.508 182939 DEBUG nova.virt.libvirt.vif [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:51:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1106281103',display_name='tempest-ServerDiskConfigTestJSON-server-1106281103',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1106281103',id=47,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:52:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-t9b97jsk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:52:33Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=3b0c2900-8225-474d-ba1c-da5edb1a0058,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.509 182939 DEBUG nova.network.os_vif_util [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "9b2b9286-20f4-4015-8abc-720cb546283c", "address": "fa:16:3e:28:8d:f1", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9b2b9286-20", "ovs_interfaceid": "9b2b9286-20f4-4015-8abc-720cb546283c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.510 182939 DEBUG nova.network.os_vif_util [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.510 182939 DEBUG os_vif [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.514 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.513 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9a0c0992-c1d0-47b8-a089-1c863ff59470]: (4, ('Wed Jan 21 11:52:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08)\nba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08\nWed Jan 21 11:52:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (ba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08)\nba0373034844ac46d063bd7360e24622b23ee1e13ecc7b8d2274b87eb1440b08\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.514 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b2b9286-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.517 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.516 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d63bb1b6-5f38-49b4-a601-4a3f365d8877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.518 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.519 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:52:34 compute-0 kernel: tap7b586c54-30: left promiscuous mode
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.524 182939 INFO os_vif [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:8d:f1,bridge_name='br-int',has_traffic_filtering=True,id=9b2b9286-20f4-4015-8abc-720cb546283c,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9b2b9286-20')
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.525 182939 INFO nova.virt.libvirt.driver [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Deleting instance files /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058_del
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.525 182939 INFO nova.virt.libvirt.driver [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Deletion of /var/lib/nova/instances/3b0c2900-8225-474d-ba1c-da5edb1a0058_del complete
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.530 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.532 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.532 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8d846d16-7f4a-4c17-a6b0-fe46f9a41077]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.550 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5c04f44c-f284-40f3-9296-6bae851d0f14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.551 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cd9fe7ec-1faa-4161-b0d0-781268ca6094]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.568 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0a23b85b-791e-4457-8eeb-523c9303ecc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 405268, 'reachable_time': 41158, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218856, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d7b586c54\x2d3322\x2d410f\x2d9bc9\x2d972a63b8deff.mount: Deactivated successfully.
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.572 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:52:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:34.572 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6f6378-4619-43b3-9d82-39bbd9e30a86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.660 182939 INFO nova.compute.manager [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Took 0.46 seconds to destroy the instance on the hypervisor.
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.661 182939 DEBUG oslo.service.loopingcall [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.662 182939 DEBUG nova.compute.manager [-] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:52:34 compute-0 nova_compute[182935]: 2026-01-21 23:52:34.662 182939 DEBUG nova.network.neutron [-] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:52:35 compute-0 nova_compute[182935]: 2026-01-21 23:52:35.037 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:52:35 compute-0 nova_compute[182935]: 2026-01-21 23:52:35.676 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.689 182939 DEBUG nova.compute.manager [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.690 182939 DEBUG oslo_concurrency.lockutils [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.690 182939 DEBUG oslo_concurrency.lockutils [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.690 182939 DEBUG oslo_concurrency.lockutils [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.690 182939 DEBUG nova.compute.manager [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] No waiting events found dispatching network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.691 182939 WARNING nova.compute.manager [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received unexpected event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c for instance with vm_state active and task_state deleting.
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.691 182939 DEBUG nova.compute.manager [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received event network-vif-unplugged-9b2b9286-20f4-4015-8abc-720cb546283c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.691 182939 DEBUG oslo_concurrency.lockutils [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.691 182939 DEBUG oslo_concurrency.lockutils [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.691 182939 DEBUG oslo_concurrency.lockutils [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.692 182939 DEBUG nova.compute.manager [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] No waiting events found dispatching network-vif-unplugged-9b2b9286-20f4-4015-8abc-720cb546283c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.692 182939 DEBUG nova.compute.manager [req-a9013700-a941-4415-9392-8dd15d71b118 req-6f4c90eb-a7b6-4921-93b4-417dacf81959 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received event network-vif-unplugged-9b2b9286-20f4-4015-8abc-720cb546283c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:52:36 compute-0 nova_compute[182935]: 2026-01-21 23:52:36.977 182939 DEBUG nova.network.neutron [-] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:52:37 compute-0 nova_compute[182935]: 2026-01-21 23:52:37.151 182939 INFO nova.compute.manager [-] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Took 2.49 seconds to deallocate network for instance.
Jan 21 23:52:37 compute-0 nova_compute[182935]: 2026-01-21 23:52:37.423 182939 DEBUG oslo_concurrency.lockutils [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:37 compute-0 nova_compute[182935]: 2026-01-21 23:52:37.424 182939 DEBUG oslo_concurrency.lockutils [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:37 compute-0 nova_compute[182935]: 2026-01-21 23:52:37.547 182939 DEBUG nova.compute.provider_tree [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:52:37 compute-0 nova_compute[182935]: 2026-01-21 23:52:37.628 182939 DEBUG nova.scheduler.client.report [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:52:37 compute-0 podman[218857]: 2026-01-21 23:52:37.681496377 +0000 UTC m=+0.057235415 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:52:37 compute-0 nova_compute[182935]: 2026-01-21 23:52:37.703 182939 DEBUG oslo_concurrency.lockutils [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:37 compute-0 nova_compute[182935]: 2026-01-21 23:52:37.826 182939 INFO nova.scheduler.client.report [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Deleted allocations for instance 3b0c2900-8225-474d-ba1c-da5edb1a0058
Jan 21 23:52:38 compute-0 nova_compute[182935]: 2026-01-21 23:52:38.011 182939 DEBUG oslo_concurrency.lockutils [None req-16d417b3-5d2c-4f4a-b1b2-f800fcbff421 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.836s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:38 compute-0 nova_compute[182935]: 2026-01-21 23:52:38.903 182939 DEBUG nova.compute.manager [req-850f49d5-8d4e-4251-823c-c1e8bae2bc29 req-14141c13-e9bc-43fe-9a72-a91cf3368678 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:38 compute-0 nova_compute[182935]: 2026-01-21 23:52:38.904 182939 DEBUG oslo_concurrency.lockutils [req-850f49d5-8d4e-4251-823c-c1e8bae2bc29 req-14141c13-e9bc-43fe-9a72-a91cf3368678 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:52:38 compute-0 nova_compute[182935]: 2026-01-21 23:52:38.904 182939 DEBUG oslo_concurrency.lockutils [req-850f49d5-8d4e-4251-823c-c1e8bae2bc29 req-14141c13-e9bc-43fe-9a72-a91cf3368678 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:52:38 compute-0 nova_compute[182935]: 2026-01-21 23:52:38.904 182939 DEBUG oslo_concurrency.lockutils [req-850f49d5-8d4e-4251-823c-c1e8bae2bc29 req-14141c13-e9bc-43fe-9a72-a91cf3368678 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3b0c2900-8225-474d-ba1c-da5edb1a0058-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:52:38 compute-0 nova_compute[182935]: 2026-01-21 23:52:38.905 182939 DEBUG nova.compute.manager [req-850f49d5-8d4e-4251-823c-c1e8bae2bc29 req-14141c13-e9bc-43fe-9a72-a91cf3368678 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] No waiting events found dispatching network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:52:38 compute-0 nova_compute[182935]: 2026-01-21 23:52:38.905 182939 WARNING nova.compute.manager [req-850f49d5-8d4e-4251-823c-c1e8bae2bc29 req-14141c13-e9bc-43fe-9a72-a91cf3368678 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received unexpected event network-vif-plugged-9b2b9286-20f4-4015-8abc-720cb546283c for instance with vm_state deleted and task_state None.
Jan 21 23:52:38 compute-0 nova_compute[182935]: 2026-01-21 23:52:38.905 182939 DEBUG nova.compute.manager [req-850f49d5-8d4e-4251-823c-c1e8bae2bc29 req-14141c13-e9bc-43fe-9a72-a91cf3368678 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Received event network-vif-deleted-9b2b9286-20f4-4015-8abc-720cb546283c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:52:39 compute-0 nova_compute[182935]: 2026-01-21 23:52:39.519 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:40 compute-0 nova_compute[182935]: 2026-01-21 23:52:40.678 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:44 compute-0 nova_compute[182935]: 2026-01-21 23:52:44.523 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:45 compute-0 nova_compute[182935]: 2026-01-21 23:52:45.680 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:45 compute-0 podman[218878]: 2026-01-21 23:52:45.690740412 +0000 UTC m=+0.062550075 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, 
com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, config_id=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Jan 21 23:52:45 compute-0 podman[218879]: 2026-01-21 23:52:45.721567457 +0000 UTC m=+0.077472916 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 21 23:52:49 compute-0 nova_compute[182935]: 2026-01-21 23:52:49.472 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039554.470158, 3b0c2900-8225-474d-ba1c-da5edb1a0058 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:52:49 compute-0 nova_compute[182935]: 2026-01-21 23:52:49.472 182939 INFO nova.compute.manager [-] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] VM Stopped (Lifecycle Event)
Jan 21 23:52:49 compute-0 nova_compute[182935]: 2026-01-21 23:52:49.497 182939 DEBUG nova.compute.manager [None req-a39399ed-860b-4fa8-8605-44b4a84c813c - - - - - -] [instance: 3b0c2900-8225-474d-ba1c-da5edb1a0058] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:52:49 compute-0 nova_compute[182935]: 2026-01-21 23:52:49.524 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:50 compute-0 nova_compute[182935]: 2026-01-21 23:52:50.706 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:54 compute-0 nova_compute[182935]: 2026-01-21 23:52:54.119 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:54 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:54.120 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:52:54 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:54.122 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:52:54 compute-0 nova_compute[182935]: 2026-01-21 23:52:54.526 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:55 compute-0 nova_compute[182935]: 2026-01-21 23:52:55.708 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:52:56 compute-0 podman[218917]: 2026-01-21 23:52:56.699425702 +0000 UTC m=+0.053345422 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:52:56 compute-0 podman[218916]: 2026-01-21 23:52:56.763131644 +0000 UTC m=+0.130820257 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:52:58 compute-0 sshd-session[218963]: Invalid user weblogic from 188.166.69.60 port 50046
Jan 21 23:52:58 compute-0 sshd-session[218963]: Connection closed by invalid user weblogic 188.166.69.60 port 50046 [preauth]
Jan 21 23:52:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:52:59.124 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:52:59 compute-0 nova_compute[182935]: 2026-01-21 23:52:59.528 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:00 compute-0 nova_compute[182935]: 2026-01-21 23:53:00.710 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:03.187 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:03.189 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:03.190 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:03 compute-0 podman[218965]: 2026-01-21 23:53:03.690984229 +0000 UTC m=+0.065165258 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:53:04 compute-0 nova_compute[182935]: 2026-01-21 23:53:04.531 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:05 compute-0 nova_compute[182935]: 2026-01-21 23:53:05.713 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:08 compute-0 podman[218991]: 2026-01-21 23:53:08.907711951 +0000 UTC m=+0.270823872 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:53:09 compute-0 nova_compute[182935]: 2026-01-21 23:53:09.533 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:10 compute-0 nova_compute[182935]: 2026-01-21 23:53:10.715 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:14 compute-0 nova_compute[182935]: 2026-01-21 23:53:14.562 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:15 compute-0 nova_compute[182935]: 2026-01-21 23:53:15.760 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:16 compute-0 podman[219010]: 2026-01-21 23:53:16.697077056 +0000 UTC m=+0.070902867 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 23:53:16 compute-0 podman[219011]: 2026-01-21 23:53:16.719905138 +0000 UTC m=+0.079889954 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:53:16 compute-0 nova_compute[182935]: 2026-01-21 23:53:16.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:19 compute-0 nova_compute[182935]: 2026-01-21 23:53:19.563 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:20 compute-0 nova_compute[182935]: 2026-01-21 23:53:20.762 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:22 compute-0 sshd-session[219049]: Received disconnect from 91.224.92.54 port 53900:11:  [preauth]
Jan 21 23:53:22 compute-0 sshd-session[219049]: Disconnected from authenticating user root 91.224.92.54 port 53900 [preauth]
Jan 21 23:53:22 compute-0 nova_compute[182935]: 2026-01-21 23:53:22.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:23 compute-0 nova_compute[182935]: 2026-01-21 23:53:23.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:23 compute-0 nova_compute[182935]: 2026-01-21 23:53:23.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:53:23 compute-0 nova_compute[182935]: 2026-01-21 23:53:23.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:53:23 compute-0 nova_compute[182935]: 2026-01-21 23:53:23.812 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:53:23 compute-0 nova_compute[182935]: 2026-01-21 23:53:23.812 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:24 compute-0 nova_compute[182935]: 2026-01-21 23:53:24.564 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:24 compute-0 nova_compute[182935]: 2026-01-21 23:53:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:24 compute-0 nova_compute[182935]: 2026-01-21 23:53:24.826 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:24 compute-0 nova_compute[182935]: 2026-01-21 23:53:24.827 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:24 compute-0 nova_compute[182935]: 2026-01-21 23:53:24.827 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:24 compute-0 nova_compute[182935]: 2026-01-21 23:53:24.828 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:53:25 compute-0 nova_compute[182935]: 2026-01-21 23:53:25.011 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:53:25 compute-0 nova_compute[182935]: 2026-01-21 23:53:25.013 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5737MB free_disk=73.27544021606445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:53:25 compute-0 nova_compute[182935]: 2026-01-21 23:53:25.013 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:25 compute-0 nova_compute[182935]: 2026-01-21 23:53:25.013 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:25 compute-0 nova_compute[182935]: 2026-01-21 23:53:25.112 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:53:25 compute-0 nova_compute[182935]: 2026-01-21 23:53:25.112 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:53:25 compute-0 nova_compute[182935]: 2026-01-21 23:53:25.154 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:53:25 compute-0 nova_compute[182935]: 2026-01-21 23:53:25.187 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:53:25 compute-0 nova_compute[182935]: 2026-01-21 23:53:25.233 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:53:25 compute-0 nova_compute[182935]: 2026-01-21 23:53:25.234 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:25 compute-0 nova_compute[182935]: 2026-01-21 23:53:25.763 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:27 compute-0 nova_compute[182935]: 2026-01-21 23:53:27.229 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:27 compute-0 nova_compute[182935]: 2026-01-21 23:53:27.230 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:27 compute-0 nova_compute[182935]: 2026-01-21 23:53:27.230 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:27 compute-0 nova_compute[182935]: 2026-01-21 23:53:27.230 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:53:27 compute-0 podman[219053]: 2026-01-21 23:53:27.721574228 +0000 UTC m=+0.079938646 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 21 23:53:27 compute-0 podman[219052]: 2026-01-21 23:53:27.763782499 +0000 UTC m=+0.121662595 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.381 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Acquiring lock "250eda73-38c7-47bf-a88e-2ca54f772298" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.381 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.410 182939 DEBUG nova.compute.manager [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.596 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.596 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.605 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.605 182939 INFO nova.compute.claims [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.808 182939 DEBUG nova.compute.provider_tree [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.836 182939 DEBUG nova.scheduler.client.report [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.864 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.866 182939 DEBUG nova.compute.manager [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.944 182939 DEBUG nova.compute.manager [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.945 182939 DEBUG nova.network.neutron [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:53:28 compute-0 nova_compute[182935]: 2026-01-21 23:53:28.973 182939 INFO nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.011 182939 DEBUG nova.compute.manager [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.265 182939 DEBUG nova.compute.manager [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.266 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.266 182939 INFO nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Creating image(s)
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.267 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Acquiring lock "/var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.267 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "/var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.268 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "/var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.281 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.346 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.347 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.348 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.359 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.407 182939 DEBUG nova.policy [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b5578721aa22415da2ac48e762bd973f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f69f5fcc26c94982aa334384a38b765d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.419 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.420 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.463 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.464 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.465 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.538 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.539 182939 DEBUG nova.virt.disk.api [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Checking if we can resize image /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.540 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.567 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.602 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.603 182939 DEBUG nova.virt.disk.api [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Cannot resize image /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.603 182939 DEBUG nova.objects.instance [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lazy-loading 'migration_context' on Instance uuid 250eda73-38c7-47bf-a88e-2ca54f772298 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.627 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.627 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Ensure instance console log exists: /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.628 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.628 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.628 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:29 compute-0 nova_compute[182935]: 2026-01-21 23:53:29.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:30 compute-0 nova_compute[182935]: 2026-01-21 23:53:30.766 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:31 compute-0 nova_compute[182935]: 2026-01-21 23:53:31.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:53:32 compute-0 nova_compute[182935]: 2026-01-21 23:53:32.175 182939 DEBUG nova.network.neutron [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Successfully created port: 8364c3a7-63d7-4d65-8ca7-7670f8e2c50f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:53:34 compute-0 nova_compute[182935]: 2026-01-21 23:53:34.298 182939 DEBUG nova.network.neutron [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Successfully updated port: 8364c3a7-63d7-4d65-8ca7-7670f8e2c50f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:53:34 compute-0 nova_compute[182935]: 2026-01-21 23:53:34.328 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Acquiring lock "refresh_cache-250eda73-38c7-47bf-a88e-2ca54f772298" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:53:34 compute-0 nova_compute[182935]: 2026-01-21 23:53:34.328 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Acquired lock "refresh_cache-250eda73-38c7-47bf-a88e-2ca54f772298" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:53:34 compute-0 nova_compute[182935]: 2026-01-21 23:53:34.329 182939 DEBUG nova.network.neutron [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:53:34 compute-0 nova_compute[182935]: 2026-01-21 23:53:34.470 182939 DEBUG nova.compute.manager [req-5fa70943-648d-44b5-9372-30cbb1c8233b req-16442b53-075d-4ccb-b886-2948bd2fc047 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Received event network-changed-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:53:34 compute-0 nova_compute[182935]: 2026-01-21 23:53:34.471 182939 DEBUG nova.compute.manager [req-5fa70943-648d-44b5-9372-30cbb1c8233b req-16442b53-075d-4ccb-b886-2948bd2fc047 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Refreshing instance network info cache due to event network-changed-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:53:34 compute-0 nova_compute[182935]: 2026-01-21 23:53:34.471 182939 DEBUG oslo_concurrency.lockutils [req-5fa70943-648d-44b5-9372-30cbb1c8233b req-16442b53-075d-4ccb-b886-2948bd2fc047 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-250eda73-38c7-47bf-a88e-2ca54f772298" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:53:34 compute-0 nova_compute[182935]: 2026-01-21 23:53:34.569 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:34 compute-0 podman[219115]: 2026-01-21 23:53:34.666347211 +0000 UTC m=+0.044090898 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:53:34 compute-0 nova_compute[182935]: 2026-01-21 23:53:34.751 182939 DEBUG nova.network.neutron [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:53:35 compute-0 nova_compute[182935]: 2026-01-21 23:53:35.768 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.203 182939 DEBUG nova.network.neutron [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Updating instance_info_cache with network_info: [{"id": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "address": "fa:16:3e:6f:31:6f", "network": {"id": "3e808973-4ac0-41a2-8ed5-17ec2d64e25b", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1394790983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f69f5fcc26c94982aa334384a38b765d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8364c3a7-63", "ovs_interfaceid": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.241 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Releasing lock "refresh_cache-250eda73-38c7-47bf-a88e-2ca54f772298" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.242 182939 DEBUG nova.compute.manager [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Instance network_info: |[{"id": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "address": "fa:16:3e:6f:31:6f", "network": {"id": "3e808973-4ac0-41a2-8ed5-17ec2d64e25b", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1394790983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f69f5fcc26c94982aa334384a38b765d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8364c3a7-63", "ovs_interfaceid": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.244 182939 DEBUG oslo_concurrency.lockutils [req-5fa70943-648d-44b5-9372-30cbb1c8233b req-16442b53-075d-4ccb-b886-2948bd2fc047 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-250eda73-38c7-47bf-a88e-2ca54f772298" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.245 182939 DEBUG nova.network.neutron [req-5fa70943-648d-44b5-9372-30cbb1c8233b req-16442b53-075d-4ccb-b886-2948bd2fc047 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Refreshing network info cache for port 8364c3a7-63d7-4d65-8ca7-7670f8e2c50f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.252 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Start _get_guest_xml network_info=[{"id": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "address": "fa:16:3e:6f:31:6f", "network": {"id": "3e808973-4ac0-41a2-8ed5-17ec2d64e25b", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1394790983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f69f5fcc26c94982aa334384a38b765d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8364c3a7-63", "ovs_interfaceid": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.261 182939 WARNING nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.271 182939 DEBUG nova.virt.libvirt.host [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.272 182939 DEBUG nova.virt.libvirt.host [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.276 182939 DEBUG nova.virt.libvirt.host [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.277 182939 DEBUG nova.virt.libvirt.host [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.280 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.280 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.281 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.281 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.282 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.282 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.283 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.284 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.284 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.285 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.285 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.286 182939 DEBUG nova.virt.hardware [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.293 182939 DEBUG nova.virt.libvirt.vif [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:53:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1836351631',display_name='tempest-ImagesOneServerTestJSON-server-1836351631',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1836351631',id=50,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f69f5fcc26c94982aa334384a38b765d',ramdisk_id='',reservation_id='r-jfryzrzr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-1772951065',owner_user_name='tempest-ImagesOneServe
rTestJSON-1772951065-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:53:29Z,user_data=None,user_id='b5578721aa22415da2ac48e762bd973f',uuid=250eda73-38c7-47bf-a88e-2ca54f772298,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "address": "fa:16:3e:6f:31:6f", "network": {"id": "3e808973-4ac0-41a2-8ed5-17ec2d64e25b", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1394790983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f69f5fcc26c94982aa334384a38b765d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8364c3a7-63", "ovs_interfaceid": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.294 182939 DEBUG nova.network.os_vif_util [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Converting VIF {"id": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "address": "fa:16:3e:6f:31:6f", "network": {"id": "3e808973-4ac0-41a2-8ed5-17ec2d64e25b", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1394790983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f69f5fcc26c94982aa334384a38b765d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8364c3a7-63", "ovs_interfaceid": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.297 182939 DEBUG nova.network.os_vif_util [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:31:6f,bridge_name='br-int',has_traffic_filtering=True,id=8364c3a7-63d7-4d65-8ca7-7670f8e2c50f,network=Network(3e808973-4ac0-41a2-8ed5-17ec2d64e25b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8364c3a7-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.299 182939 DEBUG nova.objects.instance [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lazy-loading 'pci_devices' on Instance uuid 250eda73-38c7-47bf-a88e-2ca54f772298 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.332 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:53:37 compute-0 nova_compute[182935]:   <uuid>250eda73-38c7-47bf-a88e-2ca54f772298</uuid>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   <name>instance-00000032</name>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <nova:name>tempest-ImagesOneServerTestJSON-server-1836351631</nova:name>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:53:37</nova:creationTime>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:53:37 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:53:37 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:53:37 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:53:37 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:53:37 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:53:37 compute-0 nova_compute[182935]:         <nova:user uuid="b5578721aa22415da2ac48e762bd973f">tempest-ImagesOneServerTestJSON-1772951065-project-member</nova:user>
Jan 21 23:53:37 compute-0 nova_compute[182935]:         <nova:project uuid="f69f5fcc26c94982aa334384a38b765d">tempest-ImagesOneServerTestJSON-1772951065</nova:project>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:53:37 compute-0 nova_compute[182935]:         <nova:port uuid="8364c3a7-63d7-4d65-8ca7-7670f8e2c50f">
Jan 21 23:53:37 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <system>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <entry name="serial">250eda73-38c7-47bf-a88e-2ca54f772298</entry>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <entry name="uuid">250eda73-38c7-47bf-a88e-2ca54f772298</entry>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     </system>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   <os>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   </os>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   <features>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   </features>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk.config"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:6f:31:6f"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <target dev="tap8364c3a7-63"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/console.log" append="off"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <video>
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     </video>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:53:37 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:53:37 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:53:37 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:53:37 compute-0 nova_compute[182935]: </domain>
Jan 21 23:53:37 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.334 182939 DEBUG nova.compute.manager [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Preparing to wait for external event network-vif-plugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.334 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Acquiring lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.334 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.334 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.335 182939 DEBUG nova.virt.libvirt.vif [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:53:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1836351631',display_name='tempest-ImagesOneServerTestJSON-server-1836351631',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1836351631',id=50,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f69f5fcc26c94982aa334384a38b765d',ramdisk_id='',reservation_id='r-jfryzrzr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-1772951065',owner_user_name='tempest-Imag
esOneServerTestJSON-1772951065-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:53:29Z,user_data=None,user_id='b5578721aa22415da2ac48e762bd973f',uuid=250eda73-38c7-47bf-a88e-2ca54f772298,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "address": "fa:16:3e:6f:31:6f", "network": {"id": "3e808973-4ac0-41a2-8ed5-17ec2d64e25b", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1394790983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f69f5fcc26c94982aa334384a38b765d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8364c3a7-63", "ovs_interfaceid": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.335 182939 DEBUG nova.network.os_vif_util [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Converting VIF {"id": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "address": "fa:16:3e:6f:31:6f", "network": {"id": "3e808973-4ac0-41a2-8ed5-17ec2d64e25b", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1394790983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f69f5fcc26c94982aa334384a38b765d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8364c3a7-63", "ovs_interfaceid": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.336 182939 DEBUG nova.network.os_vif_util [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:31:6f,bridge_name='br-int',has_traffic_filtering=True,id=8364c3a7-63d7-4d65-8ca7-7670f8e2c50f,network=Network(3e808973-4ac0-41a2-8ed5-17ec2d64e25b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8364c3a7-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.336 182939 DEBUG os_vif [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:31:6f,bridge_name='br-int',has_traffic_filtering=True,id=8364c3a7-63d7-4d65-8ca7-7670f8e2c50f,network=Network(3e808973-4ac0-41a2-8ed5-17ec2d64e25b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8364c3a7-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.337 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.337 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.337 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.346 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.346 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8364c3a7-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.346 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8364c3a7-63, col_values=(('external_ids', {'iface-id': '8364c3a7-63d7-4d65-8ca7-7670f8e2c50f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6f:31:6f', 'vm-uuid': '250eda73-38c7-47bf-a88e-2ca54f772298'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.348 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:37 compute-0 NetworkManager[55139]: <info>  [1769039617.3490] manager: (tap8364c3a7-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.352 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.357 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.359 182939 INFO os_vif [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:31:6f,bridge_name='br-int',has_traffic_filtering=True,id=8364c3a7-63d7-4d65-8ca7-7670f8e2c50f,network=Network(3e808973-4ac0-41a2-8ed5-17ec2d64e25b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8364c3a7-63')
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.440 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.440 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.441 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] No VIF found with MAC fa:16:3e:6f:31:6f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:53:37 compute-0 nova_compute[182935]: 2026-01-21 23:53:37.441 182939 INFO nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Using config drive
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.309 182939 INFO nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Creating config drive at /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk.config
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.315 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy4oiyxa6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.457 182939 DEBUG oslo_concurrency.processutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy4oiyxa6" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:38 compute-0 kernel: tap8364c3a7-63: entered promiscuous mode
Jan 21 23:53:38 compute-0 NetworkManager[55139]: <info>  [1769039618.5382] manager: (tap8364c3a7-63): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Jan 21 23:53:38 compute-0 ovn_controller[95047]: 2026-01-21T23:53:38Z|00198|binding|INFO|Claiming lport 8364c3a7-63d7-4d65-8ca7-7670f8e2c50f for this chassis.
Jan 21 23:53:38 compute-0 ovn_controller[95047]: 2026-01-21T23:53:38Z|00199|binding|INFO|8364c3a7-63d7-4d65-8ca7-7670f8e2c50f: Claiming fa:16:3e:6f:31:6f 10.100.0.6
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.539 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.547 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.559 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:31:6f 10.100.0.6'], port_security=['fa:16:3e:6f:31:6f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '250eda73-38c7-47bf-a88e-2ca54f772298', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e808973-4ac0-41a2-8ed5-17ec2d64e25b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f69f5fcc26c94982aa334384a38b765d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9b028383-0d8a-459d-ae8e-014f6b9e387a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a690640-25cf-4bda-912b-c9138a74d87d, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=8364c3a7-63d7-4d65-8ca7-7670f8e2c50f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.560 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 8364c3a7-63d7-4d65-8ca7-7670f8e2c50f in datapath 3e808973-4ac0-41a2-8ed5-17ec2d64e25b bound to our chassis
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.561 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3e808973-4ac0-41a2-8ed5-17ec2d64e25b
Jan 21 23:53:38 compute-0 systemd-udevd[219157]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.578 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d434d58e-df8f-4f2a-9611-c2a943eef8b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.580 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3e808973-41 in ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.582 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3e808973-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.582 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[417e7fe8-0d4e-4314-b391-648422f13f53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.583 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[630dee78-578f-4df2-b6a0-c0a9e4566615]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 NetworkManager[55139]: <info>  [1769039618.5992] device (tap8364c3a7-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:53:38 compute-0 NetworkManager[55139]: <info>  [1769039618.6000] device (tap8364c3a7-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.600 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[d56afa40-13ef-4e1f-a7f6-2990584424dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 systemd-machined[154182]: New machine qemu-29-instance-00000032.
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.618 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:38 compute-0 systemd[1]: Started Virtual Machine qemu-29-instance-00000032.
Jan 21 23:53:38 compute-0 ovn_controller[95047]: 2026-01-21T23:53:38Z|00200|binding|INFO|Setting lport 8364c3a7-63d7-4d65-8ca7-7670f8e2c50f ovn-installed in OVS
Jan 21 23:53:38 compute-0 ovn_controller[95047]: 2026-01-21T23:53:38Z|00201|binding|INFO|Setting lport 8364c3a7-63d7-4d65-8ca7-7670f8e2c50f up in Southbound
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.625 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.629 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5a368e4b-c7af-4cb0-88f5-f71c5d77bd1a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.669 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b99103c7-053e-429d-93ad-4b6e9e0b7c33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 NetworkManager[55139]: <info>  [1769039618.6769] manager: (tap3e808973-40): new Veth device (/org/freedesktop/NetworkManager/Devices/92)
Jan 21 23:53:38 compute-0 systemd-udevd[219163]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.676 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[92f4abfd-a802-440d-8d58-4786af341072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.720 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[48c81a9b-cde7-4d83-a795-272fa34d6a94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.723 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a3058e-5fbf-400b-a739-a5073970f620]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 NetworkManager[55139]: <info>  [1769039618.7500] device (tap3e808973-40): carrier: link connected
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.757 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[313f5418-bdcd-487c-981a-c67b7f35e4b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.778 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[81d50aa6-621f-46b1-8de9-2e23e8f7ed6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e808973-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:9b:2c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412148, 'reachable_time': 30845, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219193, 'error': None, 'target': 'ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.797 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e144d5-38a4-4d4f-b8f6-d60a15897f85]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:9b2c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412148, 'tstamp': 412148}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219194, 'error': None, 'target': 'ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.819 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bbac7047-1dde-462c-9b8b-8a90af3b4381]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e808973-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:9b:2c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412148, 'reachable_time': 30845, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219197, 'error': None, 'target': 'ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.869 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d39625-b489-4c3b-ac76-47f5b0994ef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.923 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039618.9230094, 250eda73-38c7-47bf-a88e-2ca54f772298 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.924 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] VM Started (Lifecycle Event)
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.939 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5b3a66a4-0939-429e-86b2-bbddaa0b01b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.941 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e808973-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.941 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.941 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e808973-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.943 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:38 compute-0 NetworkManager[55139]: <info>  [1769039618.9443] manager: (tap3e808973-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Jan 21 23:53:38 compute-0 kernel: tap3e808973-40: entered promiscuous mode
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.945 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.946 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3e808973-40, col_values=(('external_ids', {'iface-id': 'fd165a3b-554f-4c3b-8c01-6a8706a317cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:53:38 compute-0 ovn_controller[95047]: 2026-01-21T23:53:38Z|00202|binding|INFO|Releasing lport fd165a3b-554f-4c3b-8c01-6a8706a317cb from this chassis (sb_readonly=0)
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.948 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.948 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.949 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e808973-4ac0-41a2-8ed5-17ec2d64e25b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e808973-4ac0-41a2-8ed5-17ec2d64e25b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.950 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[58caa85f-d6ce-45bf-963a-820de0a0b533]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.951 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-3e808973-4ac0-41a2-8ed5-17ec2d64e25b
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/3e808973-4ac0-41a2-8ed5-17ec2d64e25b.pid.haproxy
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 3e808973-4ac0-41a2-8ed5-17ec2d64e25b
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:53:38 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:38.952 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b', 'env', 'PROCESS_TAG=haproxy-3e808973-4ac0-41a2-8ed5-17ec2d64e25b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3e808973-4ac0-41a2-8ed5-17ec2d64e25b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.952 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039618.9232097, 250eda73-38c7-47bf-a88e-2ca54f772298 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.953 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] VM Paused (Lifecycle Event)
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.958 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.973 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:53:38 compute-0 nova_compute[182935]: 2026-01-21 23:53:38.976 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:53:39 compute-0 nova_compute[182935]: 2026-01-21 23:53:39.009 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:53:39 compute-0 podman[219233]: 2026-01-21 23:53:39.334953156 +0000 UTC m=+0.033077432 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:53:39 compute-0 podman[219233]: 2026-01-21 23:53:39.503920704 +0000 UTC m=+0.202044950 container create c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 23:53:39 compute-0 systemd[1]: Started libpod-conmon-c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063.scope.
Jan 21 23:53:39 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:53:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3fb84f224caff87fd0ae2682739fdef6268dc3aabad51facf4c40cc374e4ce82/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:53:39 compute-0 podman[219233]: 2026-01-21 23:53:39.600283864 +0000 UTC m=+0.298408110 container init c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:53:39 compute-0 podman[219233]: 2026-01-21 23:53:39.605764597 +0000 UTC m=+0.303888823 container start c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true)
Jan 21 23:53:39 compute-0 neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b[219255]: [NOTICE]   (219271) : New worker (219273) forked
Jan 21 23:53:39 compute-0 neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b[219255]: [NOTICE]   (219271) : Loading success.
Jan 21 23:53:39 compute-0 podman[219244]: 2026-01-21 23:53:39.641059371 +0000 UTC m=+0.088997914 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.293 182939 DEBUG nova.compute.manager [req-90d94444-a81f-4360-9722-d39468cf4e56 req-5b4eb8db-0482-4628-bc9b-e78c2cd802a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Received event network-vif-plugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.294 182939 DEBUG oslo_concurrency.lockutils [req-90d94444-a81f-4360-9722-d39468cf4e56 req-5b4eb8db-0482-4628-bc9b-e78c2cd802a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.295 182939 DEBUG oslo_concurrency.lockutils [req-90d94444-a81f-4360-9722-d39468cf4e56 req-5b4eb8db-0482-4628-bc9b-e78c2cd802a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.296 182939 DEBUG oslo_concurrency.lockutils [req-90d94444-a81f-4360-9722-d39468cf4e56 req-5b4eb8db-0482-4628-bc9b-e78c2cd802a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.296 182939 DEBUG nova.compute.manager [req-90d94444-a81f-4360-9722-d39468cf4e56 req-5b4eb8db-0482-4628-bc9b-e78c2cd802a0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Processing event network-vif-plugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.298 182939 DEBUG nova.compute.manager [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.304 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.305 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039620.3036714, 250eda73-38c7-47bf-a88e-2ca54f772298 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.305 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] VM Resumed (Lifecycle Event)
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.312 182939 INFO nova.virt.libvirt.driver [-] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Instance spawned successfully.
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.313 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.337 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.345 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.349 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.349 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.350 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.350 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.351 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.351 182939 DEBUG nova.virt.libvirt.driver [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.386 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.459 182939 INFO nova.compute.manager [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Took 11.19 seconds to spawn the instance on the hypervisor.
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.459 182939 DEBUG nova.compute.manager [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.565 182939 INFO nova.compute.manager [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Took 12.04 seconds to build instance.
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.598 182939 DEBUG oslo_concurrency.lockutils [None req-de860f5a-4344-4aec-a66e-9134206b1297 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:40 compute-0 nova_compute[182935]: 2026-01-21 23:53:40.770 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:41 compute-0 sshd-session[219283]: Invalid user weblogic from 188.166.69.60 port 44352
Jan 21 23:53:41 compute-0 sshd-session[219283]: Connection closed by invalid user weblogic 188.166.69.60 port 44352 [preauth]
Jan 21 23:53:41 compute-0 nova_compute[182935]: 2026-01-21 23:53:41.936 182939 DEBUG nova.network.neutron [req-5fa70943-648d-44b5-9372-30cbb1c8233b req-16442b53-075d-4ccb-b886-2948bd2fc047 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Updated VIF entry in instance network info cache for port 8364c3a7-63d7-4d65-8ca7-7670f8e2c50f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:53:41 compute-0 nova_compute[182935]: 2026-01-21 23:53:41.937 182939 DEBUG nova.network.neutron [req-5fa70943-648d-44b5-9372-30cbb1c8233b req-16442b53-075d-4ccb-b886-2948bd2fc047 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Updating instance_info_cache with network_info: [{"id": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "address": "fa:16:3e:6f:31:6f", "network": {"id": "3e808973-4ac0-41a2-8ed5-17ec2d64e25b", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1394790983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f69f5fcc26c94982aa334384a38b765d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8364c3a7-63", "ovs_interfaceid": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:53:41 compute-0 nova_compute[182935]: 2026-01-21 23:53:41.965 182939 DEBUG oslo_concurrency.lockutils [req-5fa70943-648d-44b5-9372-30cbb1c8233b req-16442b53-075d-4ccb-b886-2948bd2fc047 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-250eda73-38c7-47bf-a88e-2ca54f772298" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:53:42 compute-0 nova_compute[182935]: 2026-01-21 23:53:42.349 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:42 compute-0 nova_compute[182935]: 2026-01-21 23:53:42.487 182939 DEBUG nova.compute.manager [req-8bb3c6db-406d-4eea-997d-fd548cdedfd8 req-522e2c34-2d90-4510-b40f-38920a698688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Received event network-vif-plugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:53:42 compute-0 nova_compute[182935]: 2026-01-21 23:53:42.488 182939 DEBUG oslo_concurrency.lockutils [req-8bb3c6db-406d-4eea-997d-fd548cdedfd8 req-522e2c34-2d90-4510-b40f-38920a698688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:53:42 compute-0 nova_compute[182935]: 2026-01-21 23:53:42.488 182939 DEBUG oslo_concurrency.lockutils [req-8bb3c6db-406d-4eea-997d-fd548cdedfd8 req-522e2c34-2d90-4510-b40f-38920a698688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:53:42 compute-0 nova_compute[182935]: 2026-01-21 23:53:42.488 182939 DEBUG oslo_concurrency.lockutils [req-8bb3c6db-406d-4eea-997d-fd548cdedfd8 req-522e2c34-2d90-4510-b40f-38920a698688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:53:42 compute-0 nova_compute[182935]: 2026-01-21 23:53:42.489 182939 DEBUG nova.compute.manager [req-8bb3c6db-406d-4eea-997d-fd548cdedfd8 req-522e2c34-2d90-4510-b40f-38920a698688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] No waiting events found dispatching network-vif-plugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:53:42 compute-0 nova_compute[182935]: 2026-01-21 23:53:42.489 182939 WARNING nova.compute.manager [req-8bb3c6db-406d-4eea-997d-fd548cdedfd8 req-522e2c34-2d90-4510-b40f-38920a698688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Received unexpected event network-vif-plugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f for instance with vm_state active and task_state None.
Jan 21 23:53:42 compute-0 nova_compute[182935]: 2026-01-21 23:53:42.965 182939 DEBUG nova.compute.manager [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:53:43 compute-0 nova_compute[182935]: 2026-01-21 23:53:43.087 182939 INFO nova.compute.manager [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] instance snapshotting
Jan 21 23:53:43 compute-0 nova_compute[182935]: 2026-01-21 23:53:43.750 182939 INFO nova.virt.libvirt.driver [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Beginning live snapshot process
Jan 21 23:53:44 compute-0 virtqemud[182477]: invalid argument: disk vda does not have an active block job
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.143 182939 DEBUG oslo_concurrency.processutils [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.200 182939 DEBUG oslo_concurrency.processutils [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.202 182939 DEBUG oslo_concurrency.processutils [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.257 182939 DEBUG oslo_concurrency.processutils [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk --force-share --output=json -f qcow2" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.270 182939 DEBUG oslo_concurrency.processutils [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.327 182939 DEBUG oslo_concurrency.processutils [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.328 182939 DEBUG oslo_concurrency.processutils [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpqrvqsc1d/63fb7b2e1f8045e3b030b2963da9038a.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.365 182939 DEBUG oslo_concurrency.processutils [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpqrvqsc1d/63fb7b2e1f8045e3b030b2963da9038a.delta 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.367 182939 INFO nova.virt.libvirt.driver [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.421 182939 DEBUG nova.virt.libvirt.guest [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.427 182939 INFO nova.virt.libvirt.driver [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.467 182939 DEBUG nova.privsep.utils [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.468 182939 DEBUG oslo_concurrency.processutils [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpqrvqsc1d/63fb7b2e1f8045e3b030b2963da9038a.delta /var/lib/nova/instances/snapshots/tmpqrvqsc1d/63fb7b2e1f8045e3b030b2963da9038a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.622 182939 DEBUG oslo_concurrency.processutils [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpqrvqsc1d/63fb7b2e1f8045e3b030b2963da9038a.delta /var/lib/nova/instances/snapshots/tmpqrvqsc1d/63fb7b2e1f8045e3b030b2963da9038a" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:44 compute-0 nova_compute[182935]: 2026-01-21 23:53:44.624 182939 INFO nova.virt.libvirt.driver [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Snapshot extracted, beginning image upload
Jan 21 23:53:45 compute-0 nova_compute[182935]: 2026-01-21 23:53:45.772 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:47 compute-0 nova_compute[182935]: 2026-01-21 23:53:47.352 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:47 compute-0 podman[219313]: 2026-01-21 23:53:47.690947707 +0000 UTC m=+0.060353691 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_id=openstack_network_exporter, vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 23:53:47 compute-0 podman[219314]: 2026-01-21 23:53:47.6947738 +0000 UTC m=+0.061905720 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 21 23:53:48 compute-0 nova_compute[182935]: 2026-01-21 23:53:48.534 182939 INFO nova.virt.libvirt.driver [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Snapshot image upload complete
Jan 21 23:53:48 compute-0 nova_compute[182935]: 2026-01-21 23:53:48.535 182939 INFO nova.compute.manager [None req-a554b9b8-fdf3-4925-9fec-304cfc8c89b5 b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Took 5.44 seconds to snapshot the instance on the hypervisor.
Jan 21 23:53:50 compute-0 nova_compute[182935]: 2026-01-21 23:53:50.774 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:52 compute-0 nova_compute[182935]: 2026-01-21 23:53:52.356 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:53 compute-0 ovn_controller[95047]: 2026-01-21T23:53:53Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6f:31:6f 10.100.0.6
Jan 21 23:53:53 compute-0 ovn_controller[95047]: 2026-01-21T23:53:53Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6f:31:6f 10.100.0.6
Jan 21 23:53:55 compute-0 nova_compute[182935]: 2026-01-21 23:53:55.777 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:55 compute-0 nova_compute[182935]: 2026-01-21 23:53:55.948 182939 DEBUG nova.compute.manager [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:53:56 compute-0 nova_compute[182935]: 2026-01-21 23:53:56.033 182939 INFO nova.compute.manager [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] instance snapshotting
Jan 21 23:53:56 compute-0 nova_compute[182935]: 2026-01-21 23:53:56.510 182939 INFO nova.virt.libvirt.driver [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Beginning live snapshot process
Jan 21 23:53:56 compute-0 virtqemud[182477]: invalid argument: disk vda does not have an active block job
Jan 21 23:53:56 compute-0 nova_compute[182935]: 2026-01-21 23:53:56.785 182939 DEBUG oslo_concurrency.processutils [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:56 compute-0 nova_compute[182935]: 2026-01-21 23:53:56.856 182939 DEBUG oslo_concurrency.processutils [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk --force-share --output=json -f qcow2" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:56 compute-0 nova_compute[182935]: 2026-01-21 23:53:56.858 182939 DEBUG oslo_concurrency.processutils [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:56 compute-0 nova_compute[182935]: 2026-01-21 23:53:56.918 182939 DEBUG oslo_concurrency.processutils [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298/disk --force-share --output=json -f qcow2" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:56 compute-0 nova_compute[182935]: 2026-01-21 23:53:56.946 182939 DEBUG oslo_concurrency.processutils [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:57 compute-0 nova_compute[182935]: 2026-01-21 23:53:57.008 182939 DEBUG oslo_concurrency.processutils [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:57 compute-0 nova_compute[182935]: 2026-01-21 23:53:57.009 182939 DEBUG oslo_concurrency.processutils [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp2xtjk7yo/533e73dac00c466486aa6a3546a1e0f2.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:57 compute-0 nova_compute[182935]: 2026-01-21 23:53:57.049 182939 DEBUG oslo_concurrency.processutils [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp2xtjk7yo/533e73dac00c466486aa6a3546a1e0f2.delta 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:57 compute-0 nova_compute[182935]: 2026-01-21 23:53:57.050 182939 INFO nova.virt.libvirt.driver [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 21 23:53:57 compute-0 nova_compute[182935]: 2026-01-21 23:53:57.107 182939 DEBUG nova.virt.libvirt.guest [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] COPY block job progress, current cursor: 0 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 23:53:57 compute-0 nova_compute[182935]: 2026-01-21 23:53:57.359 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:57 compute-0 nova_compute[182935]: 2026-01-21 23:53:57.612 182939 DEBUG nova.virt.libvirt.guest [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] COPY block job progress, current cursor: 75235328 final cursor: 75235328 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 23:53:57 compute-0 nova_compute[182935]: 2026-01-21 23:53:57.615 182939 INFO nova.virt.libvirt.driver [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 21 23:53:57 compute-0 nova_compute[182935]: 2026-01-21 23:53:57.651 182939 DEBUG nova.privsep.utils [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:53:57 compute-0 nova_compute[182935]: 2026-01-21 23:53:57.652 182939 DEBUG oslo_concurrency.processutils [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp2xtjk7yo/533e73dac00c466486aa6a3546a1e0f2.delta /var/lib/nova/instances/snapshots/tmp2xtjk7yo/533e73dac00c466486aa6a3546a1e0f2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:53:57 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:57.778 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:53:57 compute-0 nova_compute[182935]: 2026-01-21 23:53:57.779 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:53:57 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:53:57.781 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:53:58 compute-0 nova_compute[182935]: 2026-01-21 23:53:58.016 182939 DEBUG oslo_concurrency.processutils [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp2xtjk7yo/533e73dac00c466486aa6a3546a1e0f2.delta /var/lib/nova/instances/snapshots/tmp2xtjk7yo/533e73dac00c466486aa6a3546a1e0f2" returned: 0 in 0.364s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:53:58 compute-0 nova_compute[182935]: 2026-01-21 23:53:58.022 182939 INFO nova.virt.libvirt.driver [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Snapshot extracted, beginning image upload
Jan 21 23:53:58 compute-0 podman[219398]: 2026-01-21 23:53:58.706320008 +0000 UTC m=+0.057658356 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:53:58 compute-0 podman[219397]: 2026-01-21 23:53:58.720705087 +0000 UTC m=+0.082586849 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 21 23:54:00 compute-0 nova_compute[182935]: 2026-01-21 23:54:00.778 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:02 compute-0 nova_compute[182935]: 2026-01-21 23:54:02.298 182939 INFO nova.virt.libvirt.driver [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Snapshot image upload complete
Jan 21 23:54:02 compute-0 nova_compute[182935]: 2026-01-21 23:54:02.299 182939 INFO nova.compute.manager [None req-6de9808e-ffea-482c-ae43-625e632eb1ee b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Took 6.26 seconds to snapshot the instance on the hypervisor.
Jan 21 23:54:02 compute-0 nova_compute[182935]: 2026-01-21 23:54:02.361 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:03.188 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:03.188 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:03.189 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:03.783 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:05 compute-0 podman[219448]: 2026-01-21 23:54:05.706250117 +0000 UTC m=+0.072580946 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:54:05 compute-0 nova_compute[182935]: 2026-01-21 23:54:05.780 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.364 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.514 182939 DEBUG oslo_concurrency.lockutils [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Acquiring lock "250eda73-38c7-47bf-a88e-2ca54f772298" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.515 182939 DEBUG oslo_concurrency.lockutils [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.515 182939 DEBUG oslo_concurrency.lockutils [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Acquiring lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.515 182939 DEBUG oslo_concurrency.lockutils [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.516 182939 DEBUG oslo_concurrency.lockutils [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.528 182939 INFO nova.compute.manager [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Terminating instance
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.538 182939 DEBUG nova.compute.manager [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:54:07 compute-0 kernel: tap8364c3a7-63 (unregistering): left promiscuous mode
Jan 21 23:54:07 compute-0 NetworkManager[55139]: <info>  [1769039647.5711] device (tap8364c3a7-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:54:07 compute-0 ovn_controller[95047]: 2026-01-21T23:54:07Z|00203|binding|INFO|Releasing lport 8364c3a7-63d7-4d65-8ca7-7670f8e2c50f from this chassis (sb_readonly=0)
Jan 21 23:54:07 compute-0 ovn_controller[95047]: 2026-01-21T23:54:07Z|00204|binding|INFO|Setting lport 8364c3a7-63d7-4d65-8ca7-7670f8e2c50f down in Southbound
Jan 21 23:54:07 compute-0 ovn_controller[95047]: 2026-01-21T23:54:07Z|00205|binding|INFO|Removing iface tap8364c3a7-63 ovn-installed in OVS
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.583 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.585 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.598 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:31:6f 10.100.0.6'], port_security=['fa:16:3e:6f:31:6f 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '250eda73-38c7-47bf-a88e-2ca54f772298', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e808973-4ac0-41a2-8ed5-17ec2d64e25b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f69f5fcc26c94982aa334384a38b765d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b028383-0d8a-459d-ae8e-014f6b9e387a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a690640-25cf-4bda-912b-c9138a74d87d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=8364c3a7-63d7-4d65-8ca7-7670f8e2c50f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.599 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 8364c3a7-63d7-4d65-8ca7-7670f8e2c50f in datapath 3e808973-4ac0-41a2-8ed5-17ec2d64e25b unbound from our chassis
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.601 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3e808973-4ac0-41a2-8ed5-17ec2d64e25b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.602 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.602 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3f81c552-81e3-44e4-a352-c6f10ce3446f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.603 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b namespace which is not needed anymore
Jan 21 23:54:07 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000032.scope: Deactivated successfully.
Jan 21 23:54:07 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000032.scope: Consumed 13.803s CPU time.
Jan 21 23:54:07 compute-0 systemd-machined[154182]: Machine qemu-29-instance-00000032 terminated.
Jan 21 23:54:07 compute-0 neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b[219255]: [NOTICE]   (219271) : haproxy version is 2.8.14-c23fe91
Jan 21 23:54:07 compute-0 neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b[219255]: [NOTICE]   (219271) : path to executable is /usr/sbin/haproxy
Jan 21 23:54:07 compute-0 neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b[219255]: [WARNING]  (219271) : Exiting Master process...
Jan 21 23:54:07 compute-0 neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b[219255]: [ALERT]    (219271) : Current worker (219273) exited with code 143 (Terminated)
Jan 21 23:54:07 compute-0 neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b[219255]: [WARNING]  (219271) : All workers exited. Exiting... (0)
Jan 21 23:54:07 compute-0 systemd[1]: libpod-c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063.scope: Deactivated successfully.
Jan 21 23:54:07 compute-0 podman[219498]: 2026-01-21 23:54:07.766889324 +0000 UTC m=+0.055338919 container died c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:54:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063-userdata-shm.mount: Deactivated successfully.
Jan 21 23:54:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-3fb84f224caff87fd0ae2682739fdef6268dc3aabad51facf4c40cc374e4ce82-merged.mount: Deactivated successfully.
Jan 21 23:54:07 compute-0 podman[219498]: 2026-01-21 23:54:07.819901496 +0000 UTC m=+0.108351091 container cleanup c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.822 182939 INFO nova.virt.libvirt.driver [-] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Instance destroyed successfully.
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.822 182939 DEBUG nova.objects.instance [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lazy-loading 'resources' on Instance uuid 250eda73-38c7-47bf-a88e-2ca54f772298 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:07 compute-0 systemd[1]: libpod-conmon-c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063.scope: Deactivated successfully.
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.841 182939 DEBUG nova.virt.libvirt.vif [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:53:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1836351631',display_name='tempest-ImagesOneServerTestJSON-server-1836351631',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1836351631',id=50,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:53:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f69f5fcc26c94982aa334384a38b765d',ramdisk_id='',reservation_id='r-jfryzrzr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerTestJSON-1772951065',owner_user_name='tempest-ImagesOneServerTestJSON-1772951065-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:54:02Z,user_data=None,user_id='b5578721aa22415da2ac48e762bd973f',uuid=250eda73-38c7-47bf-a88e-2ca54f772298,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "address": "fa:16:3e:6f:31:6f", "network": {"id": "3e808973-4ac0-41a2-8ed5-17ec2d64e25b", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1394790983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f69f5fcc26c94982aa334384a38b765d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8364c3a7-63", "ovs_interfaceid": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.842 182939 DEBUG nova.network.os_vif_util [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Converting VIF {"id": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "address": "fa:16:3e:6f:31:6f", "network": {"id": "3e808973-4ac0-41a2-8ed5-17ec2d64e25b", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1394790983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f69f5fcc26c94982aa334384a38b765d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8364c3a7-63", "ovs_interfaceid": "8364c3a7-63d7-4d65-8ca7-7670f8e2c50f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.843 182939 DEBUG nova.network.os_vif_util [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:31:6f,bridge_name='br-int',has_traffic_filtering=True,id=8364c3a7-63d7-4d65-8ca7-7670f8e2c50f,network=Network(3e808973-4ac0-41a2-8ed5-17ec2d64e25b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8364c3a7-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.843 182939 DEBUG os_vif [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:31:6f,bridge_name='br-int',has_traffic_filtering=True,id=8364c3a7-63d7-4d65-8ca7-7670f8e2c50f,network=Network(3e808973-4ac0-41a2-8ed5-17ec2d64e25b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8364c3a7-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.846 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.846 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8364c3a7-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.848 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.850 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.853 182939 INFO os_vif [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:31:6f,bridge_name='br-int',has_traffic_filtering=True,id=8364c3a7-63d7-4d65-8ca7-7670f8e2c50f,network=Network(3e808973-4ac0-41a2-8ed5-17ec2d64e25b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8364c3a7-63')
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.854 182939 INFO nova.virt.libvirt.driver [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Deleting instance files /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298_del
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.855 182939 INFO nova.virt.libvirt.driver [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Deletion of /var/lib/nova/instances/250eda73-38c7-47bf-a88e-2ca54f772298_del complete
Jan 21 23:54:07 compute-0 podman[219545]: 2026-01-21 23:54:07.904163534 +0000 UTC m=+0.060729399 container remove c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.912 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dfbfd2e3-7cb2-40a9-a890-606150e32452]: (4, ('Wed Jan 21 11:54:07 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b (c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063)\nc7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063\nWed Jan 21 11:54:07 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b (c7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063)\nc7ce22be31b7a548b99adcfb44927ea99c572c5f1df5142d4cd690afb3800063\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.915 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3f792aae-754d-447e-9131-27ba7b6b13f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.916 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e808973-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.920 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-0 kernel: tap3e808973-40: left promiscuous mode
Jan 21 23:54:07 compute-0 nova_compute[182935]: 2026-01-21 23:54:07.937 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.940 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9045e4bf-747c-4ff0-a8ea-60572b3e1e40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.961 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[36ec2c7a-b68b-4a4d-a162-d80f994d814f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.963 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6819950f-58c3-4650-9a82-6a7011eb5166]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.981 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9d2dd882-db15-416f-a606-cd20dae3d962]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412139, 'reachable_time': 39317, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219560, 'error': None, 'target': 'ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.985 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3e808973-4ac0-41a2-8ed5-17ec2d64e25b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:54:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:07.985 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[3de391b0-b5bc-467c-ab2a-889b9d535e27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:07 compute-0 systemd[1]: run-netns-ovnmeta\x2d3e808973\x2d4ac0\x2d41a2\x2d8ed5\x2d17ec2d64e25b.mount: Deactivated successfully.
Jan 21 23:54:08 compute-0 nova_compute[182935]: 2026-01-21 23:54:08.009 182939 INFO nova.compute.manager [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 21 23:54:08 compute-0 nova_compute[182935]: 2026-01-21 23:54:08.010 182939 DEBUG oslo.service.loopingcall [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:54:08 compute-0 nova_compute[182935]: 2026-01-21 23:54:08.010 182939 DEBUG nova.compute.manager [-] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:54:08 compute-0 nova_compute[182935]: 2026-01-21 23:54:08.010 182939 DEBUG nova.network.neutron [-] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:54:09 compute-0 nova_compute[182935]: 2026-01-21 23:54:09.416 182939 DEBUG nova.compute.manager [req-a7b500b1-e00d-42c2-8212-d518b7bbadf1 req-f86e7d6e-5208-49c1-ad93-bc4a282beb65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Received event network-vif-unplugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:09 compute-0 nova_compute[182935]: 2026-01-21 23:54:09.417 182939 DEBUG oslo_concurrency.lockutils [req-a7b500b1-e00d-42c2-8212-d518b7bbadf1 req-f86e7d6e-5208-49c1-ad93-bc4a282beb65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:09 compute-0 nova_compute[182935]: 2026-01-21 23:54:09.418 182939 DEBUG oslo_concurrency.lockutils [req-a7b500b1-e00d-42c2-8212-d518b7bbadf1 req-f86e7d6e-5208-49c1-ad93-bc4a282beb65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:09 compute-0 nova_compute[182935]: 2026-01-21 23:54:09.418 182939 DEBUG oslo_concurrency.lockutils [req-a7b500b1-e00d-42c2-8212-d518b7bbadf1 req-f86e7d6e-5208-49c1-ad93-bc4a282beb65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:09 compute-0 nova_compute[182935]: 2026-01-21 23:54:09.419 182939 DEBUG nova.compute.manager [req-a7b500b1-e00d-42c2-8212-d518b7bbadf1 req-f86e7d6e-5208-49c1-ad93-bc4a282beb65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] No waiting events found dispatching network-vif-unplugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:09 compute-0 nova_compute[182935]: 2026-01-21 23:54:09.419 182939 DEBUG nova.compute.manager [req-a7b500b1-e00d-42c2-8212-d518b7bbadf1 req-f86e7d6e-5208-49c1-ad93-bc4a282beb65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Received event network-vif-unplugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:54:10 compute-0 podman[219561]: 2026-01-21 23:54:10.68525935 +0000 UTC m=+0.053968817 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:54:10 compute-0 nova_compute[182935]: 2026-01-21 23:54:10.781 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:11 compute-0 nova_compute[182935]: 2026-01-21 23:54:11.753 182939 DEBUG nova.compute.manager [req-e461011f-8539-4913-9e8d-930b97e2a876 req-2ffedcce-9ed6-47e0-82a9-ffe8a601ed01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Received event network-vif-plugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:11 compute-0 nova_compute[182935]: 2026-01-21 23:54:11.753 182939 DEBUG oslo_concurrency.lockutils [req-e461011f-8539-4913-9e8d-930b97e2a876 req-2ffedcce-9ed6-47e0-82a9-ffe8a601ed01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:11 compute-0 nova_compute[182935]: 2026-01-21 23:54:11.754 182939 DEBUG oslo_concurrency.lockutils [req-e461011f-8539-4913-9e8d-930b97e2a876 req-2ffedcce-9ed6-47e0-82a9-ffe8a601ed01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:11 compute-0 nova_compute[182935]: 2026-01-21 23:54:11.754 182939 DEBUG oslo_concurrency.lockutils [req-e461011f-8539-4913-9e8d-930b97e2a876 req-2ffedcce-9ed6-47e0-82a9-ffe8a601ed01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:11 compute-0 nova_compute[182935]: 2026-01-21 23:54:11.754 182939 DEBUG nova.compute.manager [req-e461011f-8539-4913-9e8d-930b97e2a876 req-2ffedcce-9ed6-47e0-82a9-ffe8a601ed01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] No waiting events found dispatching network-vif-plugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:11 compute-0 nova_compute[182935]: 2026-01-21 23:54:11.754 182939 WARNING nova.compute.manager [req-e461011f-8539-4913-9e8d-930b97e2a876 req-2ffedcce-9ed6-47e0-82a9-ffe8a601ed01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Received unexpected event network-vif-plugged-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f for instance with vm_state active and task_state deleting.
Jan 21 23:54:12 compute-0 nova_compute[182935]: 2026-01-21 23:54:12.527 182939 DEBUG nova.network.neutron [-] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:12 compute-0 nova_compute[182935]: 2026-01-21 23:54:12.612 182939 INFO nova.compute.manager [-] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Took 4.60 seconds to deallocate network for instance.
Jan 21 23:54:12 compute-0 nova_compute[182935]: 2026-01-21 23:54:12.737 182939 DEBUG oslo_concurrency.lockutils [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:12 compute-0 nova_compute[182935]: 2026-01-21 23:54:12.738 182939 DEBUG oslo_concurrency.lockutils [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:12 compute-0 nova_compute[182935]: 2026-01-21 23:54:12.778 182939 DEBUG nova.scheduler.client.report [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:54:12 compute-0 nova_compute[182935]: 2026-01-21 23:54:12.851 182939 DEBUG nova.scheduler.client.report [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:54:12 compute-0 nova_compute[182935]: 2026-01-21 23:54:12.852 182939 DEBUG nova.compute.provider_tree [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:54:12 compute-0 nova_compute[182935]: 2026-01-21 23:54:12.857 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:12 compute-0 nova_compute[182935]: 2026-01-21 23:54:12.893 182939 DEBUG nova.scheduler.client.report [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:54:13 compute-0 nova_compute[182935]: 2026-01-21 23:54:13.001 182939 DEBUG nova.scheduler.client.report [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:54:13 compute-0 nova_compute[182935]: 2026-01-21 23:54:13.129 182939 DEBUG nova.compute.provider_tree [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:54:13 compute-0 nova_compute[182935]: 2026-01-21 23:54:13.181 182939 DEBUG nova.scheduler.client.report [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:54:13 compute-0 nova_compute[182935]: 2026-01-21 23:54:13.227 182939 DEBUG oslo_concurrency.lockutils [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.489s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:13 compute-0 nova_compute[182935]: 2026-01-21 23:54:13.296 182939 INFO nova.scheduler.client.report [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Deleted allocations for instance 250eda73-38c7-47bf-a88e-2ca54f772298
Jan 21 23:54:13 compute-0 nova_compute[182935]: 2026-01-21 23:54:13.573 182939 DEBUG oslo_concurrency.lockutils [None req-bca5581f-205c-4372-ba23-e57ed7d8828c b5578721aa22415da2ac48e762bd973f f69f5fcc26c94982aa334384a38b765d - - default default] Lock "250eda73-38c7-47bf-a88e-2ca54f772298" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:14 compute-0 nova_compute[182935]: 2026-01-21 23:54:14.016 182939 DEBUG nova.compute.manager [req-3eceec28-8fc8-4f75-8697-3c04e490c798 req-06e10dfd-33aa-4e32-82b8-59a80a4af447 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Received event network-vif-deleted-8364c3a7-63d7-4d65-8ca7-7670f8e2c50f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:15 compute-0 nova_compute[182935]: 2026-01-21 23:54:15.784 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:17 compute-0 nova_compute[182935]: 2026-01-21 23:54:17.859 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:18 compute-0 podman[219582]: 2026-01-21 23:54:18.707088969 +0000 UTC m=+0.072914745 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 23:54:18 compute-0 podman[219581]: 2026-01-21 23:54:18.729223554 +0000 UTC m=+0.096877584 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, config_id=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 23:54:20 compute-0 nova_compute[182935]: 2026-01-21 23:54:20.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:21 compute-0 nova_compute[182935]: 2026-01-21 23:54:21.361 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:21 compute-0 nova_compute[182935]: 2026-01-21 23:54:21.659 182939 DEBUG nova.compute.manager [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 21 23:54:21 compute-0 nova_compute[182935]: 2026-01-21 23:54:21.806 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:21 compute-0 nova_compute[182935]: 2026-01-21 23:54:21.806 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:21 compute-0 nova_compute[182935]: 2026-01-21 23:54:21.868 182939 DEBUG nova.objects.instance [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_requests' on Instance uuid 2c5b484c-19e7-47b1-bf93-fa599ddb6873 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:21 compute-0 nova_compute[182935]: 2026-01-21 23:54:21.885 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:54:21 compute-0 nova_compute[182935]: 2026-01-21 23:54:21.885 182939 INFO nova.compute.claims [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:54:21 compute-0 nova_compute[182935]: 2026-01-21 23:54:21.886 182939 DEBUG nova.objects.instance [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'resources' on Instance uuid 2c5b484c-19e7-47b1-bf93-fa599ddb6873 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:21 compute-0 nova_compute[182935]: 2026-01-21 23:54:21.906 182939 DEBUG nova.objects.instance [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'pci_devices' on Instance uuid 2c5b484c-19e7-47b1-bf93-fa599ddb6873 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:21 compute-0 nova_compute[182935]: 2026-01-21 23:54:21.961 182939 INFO nova.compute.resource_tracker [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating resource usage from migration 5cffe5fd-78f0-49b5-bafd-dbed115bc56f
Jan 21 23:54:21 compute-0 nova_compute[182935]: 2026-01-21 23:54:21.961 182939 DEBUG nova.compute.resource_tracker [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Starting to track incoming migration 5cffe5fd-78f0-49b5-bafd-dbed115bc56f with flavor ff01ccba-ad51-439f-9037-926190d6dc0f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 21 23:54:22 compute-0 nova_compute[182935]: 2026-01-21 23:54:22.047 182939 DEBUG nova.compute.provider_tree [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:54:22 compute-0 nova_compute[182935]: 2026-01-21 23:54:22.077 182939 DEBUG nova.scheduler.client.report [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:54:22 compute-0 nova_compute[182935]: 2026-01-21 23:54:22.099 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:22 compute-0 nova_compute[182935]: 2026-01-21 23:54:22.099 182939 INFO nova.compute.manager [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Migrating
Jan 21 23:54:22 compute-0 sshd-session[219622]: Invalid user weblogic from 188.166.69.60 port 53456
Jan 21 23:54:22 compute-0 sshd-session[219622]: Connection closed by invalid user weblogic 188.166.69.60 port 53456 [preauth]
Jan 21 23:54:22 compute-0 nova_compute[182935]: 2026-01-21 23:54:22.820 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039647.8181753, 250eda73-38c7-47bf-a88e-2ca54f772298 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:22 compute-0 nova_compute[182935]: 2026-01-21 23:54:22.821 182939 INFO nova.compute.manager [-] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] VM Stopped (Lifecycle Event)
Jan 21 23:54:22 compute-0 nova_compute[182935]: 2026-01-21 23:54:22.854 182939 DEBUG nova.compute.manager [None req-b609a23f-0470-461d-b420-3965fd73ca29 - - - - - -] [instance: 250eda73-38c7-47bf-a88e-2ca54f772298] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:22 compute-0 nova_compute[182935]: 2026-01-21 23:54:22.861 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.299 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.301 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.301 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.301 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.301 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.301 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:54:23.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 21 23:54:23 compute-0 nova_compute[182935]: 2026-01-21 23:54:23.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:23 compute-0 nova_compute[182935]: 2026-01-21 23:54:23.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:54:23 compute-0 nova_compute[182935]: 2026-01-21 23:54:23.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:54:23 compute-0 nova_compute[182935]: 2026-01-21 23:54:23.814 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:54:24 compute-0 nova_compute[182935]: 2026-01-21 23:54:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:24 compute-0 nova_compute[182935]: 2026-01-21 23:54:24.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:24 compute-0 nova_compute[182935]: 2026-01-21 23:54:24.822 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:24 compute-0 nova_compute[182935]: 2026-01-21 23:54:24.823 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:24 compute-0 nova_compute[182935]: 2026-01-21 23:54:24.824 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:24 compute-0 nova_compute[182935]: 2026-01-21 23:54:24.824 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.033 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.035 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5715MB free_disk=73.27540969848633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.036 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.036 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.096 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Migration for instance 2c5b484c-19e7-47b1-bf93-fa599ddb6873 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.117 182939 INFO nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating resource usage from migration 5cffe5fd-78f0-49b5-bafd-dbed115bc56f
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.118 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Starting to track incoming migration 5cffe5fd-78f0-49b5-bafd-dbed115bc56f with flavor ff01ccba-ad51-439f-9037-926190d6dc0f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.203 182939 WARNING nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 2c5b484c-19e7-47b1-bf93-fa599ddb6873 has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}.
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.203 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.204 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.265 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.291 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.320 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.321 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:25 compute-0 nova_compute[182935]: 2026-01-21 23:54:25.789 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:26 compute-0 sshd-session[219625]: Accepted publickey for nova from 192.168.122.102 port 50988 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:54:26 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 21 23:54:26 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 21 23:54:26 compute-0 systemd-logind[784]: New session 43 of user nova.
Jan 21 23:54:26 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 21 23:54:26 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 21 23:54:26 compute-0 systemd[219629]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:54:26 compute-0 systemd[219629]: Queued start job for default target Main User Target.
Jan 21 23:54:26 compute-0 nova_compute[182935]: 2026-01-21 23:54:26.322 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:26 compute-0 systemd[219629]: Created slice User Application Slice.
Jan 21 23:54:26 compute-0 systemd[219629]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:54:26 compute-0 systemd[219629]: Started Daily Cleanup of User's Temporary Directories.
Jan 21 23:54:26 compute-0 systemd[219629]: Reached target Paths.
Jan 21 23:54:26 compute-0 systemd[219629]: Reached target Timers.
Jan 21 23:54:26 compute-0 systemd[219629]: Starting D-Bus User Message Bus Socket...
Jan 21 23:54:26 compute-0 systemd[219629]: Starting Create User's Volatile Files and Directories...
Jan 21 23:54:26 compute-0 systemd[219629]: Finished Create User's Volatile Files and Directories.
Jan 21 23:54:26 compute-0 systemd[219629]: Listening on D-Bus User Message Bus Socket.
Jan 21 23:54:26 compute-0 systemd[219629]: Reached target Sockets.
Jan 21 23:54:26 compute-0 systemd[219629]: Reached target Basic System.
Jan 21 23:54:26 compute-0 systemd[219629]: Reached target Main User Target.
Jan 21 23:54:26 compute-0 systemd[219629]: Startup finished in 138ms.
Jan 21 23:54:26 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 21 23:54:26 compute-0 systemd[1]: Started Session 43 of User nova.
Jan 21 23:54:26 compute-0 sshd-session[219625]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:54:26 compute-0 sshd-session[219644]: Received disconnect from 192.168.122.102 port 50988:11: disconnected by user
Jan 21 23:54:26 compute-0 sshd-session[219644]: Disconnected from user nova 192.168.122.102 port 50988
Jan 21 23:54:26 compute-0 sshd-session[219625]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:54:26 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Jan 21 23:54:26 compute-0 systemd-logind[784]: Session 43 logged out. Waiting for processes to exit.
Jan 21 23:54:26 compute-0 systemd-logind[784]: Removed session 43.
Jan 21 23:54:26 compute-0 sshd-session[219646]: Accepted publickey for nova from 192.168.122.102 port 51004 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:54:26 compute-0 systemd-logind[784]: New session 45 of user nova.
Jan 21 23:54:26 compute-0 systemd[1]: Started Session 45 of User nova.
Jan 21 23:54:26 compute-0 sshd-session[219646]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:54:26 compute-0 sshd-session[219649]: Received disconnect from 192.168.122.102 port 51004:11: disconnected by user
Jan 21 23:54:26 compute-0 sshd-session[219649]: Disconnected from user nova 192.168.122.102 port 51004
Jan 21 23:54:26 compute-0 sshd-session[219646]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:54:26 compute-0 systemd[1]: session-45.scope: Deactivated successfully.
Jan 21 23:54:26 compute-0 systemd-logind[784]: Session 45 logged out. Waiting for processes to exit.
Jan 21 23:54:26 compute-0 systemd-logind[784]: Removed session 45.
Jan 21 23:54:26 compute-0 nova_compute[182935]: 2026-01-21 23:54:26.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:26 compute-0 nova_compute[182935]: 2026-01-21 23:54:26.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:26 compute-0 nova_compute[182935]: 2026-01-21 23:54:26.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:54:27 compute-0 nova_compute[182935]: 2026-01-21 23:54:27.865 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:28 compute-0 nova_compute[182935]: 2026-01-21 23:54:28.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:29 compute-0 nova_compute[182935]: 2026-01-21 23:54:29.398 182939 DEBUG nova.compute.manager [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-unplugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:29 compute-0 nova_compute[182935]: 2026-01-21 23:54:29.398 182939 DEBUG oslo_concurrency.lockutils [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:29 compute-0 nova_compute[182935]: 2026-01-21 23:54:29.399 182939 DEBUG oslo_concurrency.lockutils [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:29 compute-0 nova_compute[182935]: 2026-01-21 23:54:29.399 182939 DEBUG oslo_concurrency.lockutils [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:29 compute-0 nova_compute[182935]: 2026-01-21 23:54:29.399 182939 DEBUG nova.compute.manager [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] No waiting events found dispatching network-vif-unplugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:29 compute-0 nova_compute[182935]: 2026-01-21 23:54:29.400 182939 WARNING nova.compute.manager [req-c5f4c898-a3bf-46c1-bf21-2fb574174819 req-cbb7371b-9b72-43a0-9351-f0d2112e7570 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received unexpected event network-vif-unplugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 for instance with vm_state active and task_state resize_migrating.
Jan 21 23:54:29 compute-0 podman[219652]: 2026-01-21 23:54:29.693866546 +0000 UTC m=+0.065559177 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:54:29 compute-0 podman[219651]: 2026-01-21 23:54:29.724736754 +0000 UTC m=+0.096256001 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 23:54:29 compute-0 nova_compute[182935]: 2026-01-21 23:54:29.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:30 compute-0 sshd-session[219698]: Accepted publickey for nova from 192.168.122.102 port 51008 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:54:30 compute-0 systemd-logind[784]: New session 46 of user nova.
Jan 21 23:54:30 compute-0 systemd[1]: Started Session 46 of User nova.
Jan 21 23:54:30 compute-0 sshd-session[219698]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:54:30 compute-0 sshd-session[219701]: Received disconnect from 192.168.122.102 port 51008:11: disconnected by user
Jan 21 23:54:30 compute-0 sshd-session[219701]: Disconnected from user nova 192.168.122.102 port 51008
Jan 21 23:54:30 compute-0 sshd-session[219698]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:54:30 compute-0 systemd[1]: session-46.scope: Deactivated successfully.
Jan 21 23:54:30 compute-0 systemd-logind[784]: Session 46 logged out. Waiting for processes to exit.
Jan 21 23:54:30 compute-0 systemd-logind[784]: Removed session 46.
Jan 21 23:54:30 compute-0 sshd-session[219703]: Accepted publickey for nova from 192.168.122.102 port 51010 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:54:30 compute-0 systemd-logind[784]: New session 47 of user nova.
Jan 21 23:54:30 compute-0 systemd[1]: Started Session 47 of User nova.
Jan 21 23:54:30 compute-0 sshd-session[219703]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:54:30 compute-0 sshd-session[219706]: Received disconnect from 192.168.122.102 port 51010:11: disconnected by user
Jan 21 23:54:30 compute-0 sshd-session[219706]: Disconnected from user nova 192.168.122.102 port 51010
Jan 21 23:54:30 compute-0 sshd-session[219703]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:54:30 compute-0 systemd[1]: session-47.scope: Deactivated successfully.
Jan 21 23:54:30 compute-0 systemd-logind[784]: Session 47 logged out. Waiting for processes to exit.
Jan 21 23:54:30 compute-0 systemd-logind[784]: Removed session 47.
Jan 21 23:54:30 compute-0 nova_compute[182935]: 2026-01-21 23:54:30.791 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:30 compute-0 sshd-session[219708]: Accepted publickey for nova from 192.168.122.102 port 51012 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 21 23:54:30 compute-0 systemd-logind[784]: New session 48 of user nova.
Jan 21 23:54:30 compute-0 systemd[1]: Started Session 48 of User nova.
Jan 21 23:54:30 compute-0 sshd-session[219708]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 21 23:54:30 compute-0 sshd-session[219711]: Received disconnect from 192.168.122.102 port 51012:11: disconnected by user
Jan 21 23:54:30 compute-0 sshd-session[219711]: Disconnected from user nova 192.168.122.102 port 51012
Jan 21 23:54:30 compute-0 sshd-session[219708]: pam_unix(sshd:session): session closed for user nova
Jan 21 23:54:30 compute-0 systemd-logind[784]: Session 48 logged out. Waiting for processes to exit.
Jan 21 23:54:30 compute-0 systemd[1]: session-48.scope: Deactivated successfully.
Jan 21 23:54:30 compute-0 systemd-logind[784]: Removed session 48.
Jan 21 23:54:31 compute-0 nova_compute[182935]: 2026-01-21 23:54:31.568 182939 DEBUG nova.compute.manager [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:31 compute-0 nova_compute[182935]: 2026-01-21 23:54:31.569 182939 DEBUG oslo_concurrency.lockutils [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:31 compute-0 nova_compute[182935]: 2026-01-21 23:54:31.569 182939 DEBUG oslo_concurrency.lockutils [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:31 compute-0 nova_compute[182935]: 2026-01-21 23:54:31.569 182939 DEBUG oslo_concurrency.lockutils [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:31 compute-0 nova_compute[182935]: 2026-01-21 23:54:31.569 182939 DEBUG nova.compute.manager [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] No waiting events found dispatching network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:31 compute-0 nova_compute[182935]: 2026-01-21 23:54:31.570 182939 WARNING nova.compute.manager [req-f1c81cab-c099-4c54-9d37-7921d360c113 req-64b14beb-ee53-4f24-9b92-39a989235ef0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received unexpected event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 for instance with vm_state active and task_state resize_migrated.
Jan 21 23:54:32 compute-0 nova_compute[182935]: 2026-01-21 23:54:32.388 182939 INFO nova.network.neutron [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating port 2ad8a775-c03c-4a1b-919a-278faef8cb47 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 21 23:54:32 compute-0 nova_compute[182935]: 2026-01-21 23:54:32.869 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:33 compute-0 nova_compute[182935]: 2026-01-21 23:54:33.740 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:54:33 compute-0 nova_compute[182935]: 2026-01-21 23:54:33.741 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquired lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:54:33 compute-0 nova_compute[182935]: 2026-01-21 23:54:33.741 182939 DEBUG nova.network.neutron [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:54:33 compute-0 nova_compute[182935]: 2026-01-21 23:54:33.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:54:33 compute-0 nova_compute[182935]: 2026-01-21 23:54:33.951 182939 DEBUG nova.compute.manager [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-changed-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:33 compute-0 nova_compute[182935]: 2026-01-21 23:54:33.951 182939 DEBUG nova.compute.manager [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Refreshing instance network info cache due to event network-changed-2ad8a775-c03c-4a1b-919a-278faef8cb47. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:54:33 compute-0 nova_compute[182935]: 2026-01-21 23:54:33.952 182939 DEBUG oslo_concurrency.lockutils [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:54:35 compute-0 nova_compute[182935]: 2026-01-21 23:54:35.794 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:36 compute-0 podman[219713]: 2026-01-21 23:54:36.68633767 +0000 UTC m=+0.060950656 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 21 23:54:37 compute-0 nova_compute[182935]: 2026-01-21 23:54:37.872 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:39 compute-0 nova_compute[182935]: 2026-01-21 23:54:39.748 182939 DEBUG nova.network.neutron [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating instance_info_cache with network_info: [{"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:39 compute-0 nova_compute[182935]: 2026-01-21 23:54:39.771 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Releasing lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:54:39 compute-0 nova_compute[182935]: 2026-01-21 23:54:39.776 182939 DEBUG oslo_concurrency.lockutils [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:54:39 compute-0 nova_compute[182935]: 2026-01-21 23:54:39.776 182939 DEBUG nova.network.neutron [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Refreshing network info cache for port 2ad8a775-c03c-4a1b-919a-278faef8cb47 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:54:39 compute-0 nova_compute[182935]: 2026-01-21 23:54:39.921 182939 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 21 23:54:39 compute-0 nova_compute[182935]: 2026-01-21 23:54:39.923 182939 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 21 23:54:39 compute-0 nova_compute[182935]: 2026-01-21 23:54:39.923 182939 INFO nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Creating image(s)
Jan 21 23:54:39 compute-0 nova_compute[182935]: 2026-01-21 23:54:39.924 182939 DEBUG nova.objects.instance [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2c5b484c-19e7-47b1-bf93-fa599ddb6873 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:39 compute-0 nova_compute[182935]: 2026-01-21 23:54:39.942 182939 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.002 182939 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.003 182939 DEBUG nova.virt.disk.api [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Checking if we can resize image /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.003 182939 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.062 182939 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.063 182939 DEBUG nova.virt.disk.api [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Cannot resize image /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.085 182939 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.085 182939 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Ensure instance console log exists: /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.086 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.087 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.087 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.093 182939 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Start _get_guest_xml network_info=[{"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:10:e7:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.100 182939 WARNING nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.106 182939 DEBUG nova.virt.libvirt.host [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.107 182939 DEBUG nova.virt.libvirt.host [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.110 182939 DEBUG nova.virt.libvirt.host [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.111 182939 DEBUG nova.virt.libvirt.host [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.114 182939 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.114 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ff01ccba-ad51-439f-9037-926190d6dc0f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.115 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.116 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.116 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.117 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.118 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.118 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.119 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.119 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.124 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.125 182939 DEBUG nova.virt.hardware [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.125 182939 DEBUG nova.objects.instance [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2c5b484c-19e7-47b1-bf93-fa599ddb6873 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.148 182939 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.204 182939 DEBUG oslo_concurrency.processutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.config --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.205 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.205 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.206 182939 DEBUG oslo_concurrency.lockutils [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.207 182939 DEBUG nova.virt.libvirt.vif [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:53:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1222136722',display_name='tempest-ServerDiskConfigTestJSON-server-1222136722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1222136722',id=51,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:54:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-wyf3b09s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:54:31Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=2c5b484c-19e7-47b1-bf93-fa599ddb6873,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:10:e7:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.208 182939 DEBUG nova.network.os_vif_util [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:10:e7:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.209 182939 DEBUG nova.network.os_vif_util [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.211 182939 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:54:40 compute-0 nova_compute[182935]:   <uuid>2c5b484c-19e7-47b1-bf93-fa599ddb6873</uuid>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   <name>instance-00000033</name>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   <memory>196608</memory>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-1222136722</nova:name>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:54:40</nova:creationTime>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <nova:flavor name="m1.micro">
Jan 21 23:54:40 compute-0 nova_compute[182935]:         <nova:memory>192</nova:memory>
Jan 21 23:54:40 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:54:40 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:54:40 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:54:40 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:54:40 compute-0 nova_compute[182935]:         <nova:user uuid="a7fb6bdd938b4fcdb749b0bc4f86f97e">tempest-ServerDiskConfigTestJSON-1417790226-project-member</nova:user>
Jan 21 23:54:40 compute-0 nova_compute[182935]:         <nova:project uuid="c09a5cf201e249f69f57cd4a632d1e2b">tempest-ServerDiskConfigTestJSON-1417790226</nova:project>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:54:40 compute-0 nova_compute[182935]:         <nova:port uuid="2ad8a775-c03c-4a1b-919a-278faef8cb47">
Jan 21 23:54:40 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <system>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <entry name="serial">2c5b484c-19e7-47b1-bf93-fa599ddb6873</entry>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <entry name="uuid">2c5b484c-19e7-47b1-bf93-fa599ddb6873</entry>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     </system>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   <os>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   </os>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   <features>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   </features>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/disk.config"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:10:e7:d9"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <target dev="tap2ad8a775-c0"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873/console.log" append="off"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <video>
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     </video>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:54:40 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:54:40 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:54:40 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:54:40 compute-0 nova_compute[182935]: </domain>
Jan 21 23:54:40 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.213 182939 DEBUG nova.virt.libvirt.vif [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:53:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1222136722',display_name='tempest-ServerDiskConfigTestJSON-server-1222136722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1222136722',id=51,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:54:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-wyf3b09s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:54:31Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=2c5b484c-19e7-47b1-bf93-fa599ddb6873,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:10:e7:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.213 182939 DEBUG nova.network.os_vif_util [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "vif_mac": "fa:16:3e:10:e7:d9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.213 182939 DEBUG nova.network.os_vif_util [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.214 182939 DEBUG os_vif [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.214 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.215 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.215 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.219 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.219 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ad8a775-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.220 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2ad8a775-c0, col_values=(('external_ids', {'iface-id': '2ad8a775-c03c-4a1b-919a-278faef8cb47', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:e7:d9', 'vm-uuid': '2c5b484c-19e7-47b1-bf93-fa599ddb6873'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.222 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 NetworkManager[55139]: <info>  [1769039680.2232] manager: (tap2ad8a775-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.225 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.230 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.231 182939 INFO os_vif [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0')
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.305 182939 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.305 182939 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.306 182939 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] No VIF found with MAC fa:16:3e:10:e7:d9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.306 182939 INFO nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Using config drive
Jan 21 23:54:40 compute-0 kernel: tap2ad8a775-c0: entered promiscuous mode
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.369 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 NetworkManager[55139]: <info>  [1769039680.3700] manager: (tap2ad8a775-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Jan 21 23:54:40 compute-0 ovn_controller[95047]: 2026-01-21T23:54:40Z|00206|binding|INFO|Claiming lport 2ad8a775-c03c-4a1b-919a-278faef8cb47 for this chassis.
Jan 21 23:54:40 compute-0 ovn_controller[95047]: 2026-01-21T23:54:40Z|00207|binding|INFO|2ad8a775-c03c-4a1b-919a-278faef8cb47: Claiming fa:16:3e:10:e7:d9 10.100.0.12
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.373 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.382 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:e7:d9 10.100.0.12'], port_security=['fa:16:3e:10:e7:d9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '6', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=2ad8a775-c03c-4a1b-919a-278faef8cb47) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.383 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 2ad8a775-c03c-4a1b-919a-278faef8cb47 in datapath 7b586c54-3322-410f-9bc9-972a63b8deff bound to our chassis
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.384 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.396 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9d729412-95da-4e1d-9f6d-54b1d439d401]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.397 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b586c54-31 in ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:54:40 compute-0 systemd-udevd[219763]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.404 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b586c54-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.405 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9af49c-0705-4e26-8e18-6ae33b7e7787]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.406 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f467ef2c-8d26-45ba-9c90-0373abe42308]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 systemd-machined[154182]: New machine qemu-30-instance-00000033.
Jan 21 23:54:40 compute-0 NetworkManager[55139]: <info>  [1769039680.4156] device (tap2ad8a775-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:54:40 compute-0 NetworkManager[55139]: <info>  [1769039680.4162] device (tap2ad8a775-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.422 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[2d9e2d38-8658-457c-9ee1-cdb2967091f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.426 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 ovn_controller[95047]: 2026-01-21T23:54:40Z|00208|binding|INFO|Setting lport 2ad8a775-c03c-4a1b-919a-278faef8cb47 ovn-installed in OVS
Jan 21 23:54:40 compute-0 ovn_controller[95047]: 2026-01-21T23:54:40Z|00209|binding|INFO|Setting lport 2ad8a775-c03c-4a1b-919a-278faef8cb47 up in Southbound
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.432 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 systemd[1]: Started Virtual Machine qemu-30-instance-00000033.
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.450 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f13983d5-1411-4034-b760-6f22c4488a4c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.482 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9ad35f21-5e10-4c2b-963b-2b4776e1d3f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 NetworkManager[55139]: <info>  [1769039680.4892] manager: (tap7b586c54-30): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.488 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa6a86d-de20-4724-9996-c2f5f351d677]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.521 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[faa421d3-3de1-4175-9cdb-231c41f57412]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.524 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d95a01-7020-48cb-85cf-44c41d6a6c61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 NetworkManager[55139]: <info>  [1769039680.5455] device (tap7b586c54-30): carrier: link connected
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.549 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[de5f8f2e-f3cd-4379-b066-8cf3a43bb74e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.567 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[804fd93f-0801-4a57-93bd-5d9c0e66f26c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418328, 'reachable_time': 20865, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219796, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.589 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[57f4dd6b-941b-4e16-90b8-7b42ae70defa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedc:a9f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 418328, 'tstamp': 418328}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219797, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.609 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4df53b0f-4e1f-4d95-9192-60bf7ff0d3ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b586c54-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dc:a9:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418328, 'reachable_time': 20865, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219798, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.638 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad68e5e-d8fc-4658-8f35-19817032af16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.703 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6e37ea8e-f90f-49f7-add4-062b04af4d07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.704 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.704 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.705 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b586c54-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.706 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 NetworkManager[55139]: <info>  [1769039680.7073] manager: (tap7b586c54-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 21 23:54:40 compute-0 kernel: tap7b586c54-30: entered promiscuous mode
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.709 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.710 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b586c54-30, col_values=(('external_ids', {'iface-id': '52e5d5d5-be78-49fa-86d7-24ac4adf40c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:40 compute-0 ovn_controller[95047]: 2026-01-21T23:54:40Z|00210|binding|INFO|Releasing lport 52e5d5d5-be78-49fa-86d7-24ac4adf40c1 from this chassis (sb_readonly=0)
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.711 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.723 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.724 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.726 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[39f8a4f8-0aaf-4f0c-8934-a2d6adba9c69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.728 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/7b586c54-3322-410f-9bc9-972a63b8deff.pid.haproxy
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 7b586c54-3322-410f-9bc9-972a63b8deff
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:54:40 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:40.728 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'env', 'PROCESS_TAG=haproxy-7b586c54-3322-410f-9bc9-972a63b8deff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b586c54-3322-410f-9bc9-972a63b8deff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.763 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039680.7632275, 2c5b484c-19e7-47b1-bf93-fa599ddb6873 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.764 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] VM Resumed (Lifecycle Event)
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.768 182939 DEBUG nova.compute.manager [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.773 182939 INFO nova.virt.libvirt.driver [-] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Instance running successfully.
Jan 21 23:54:40 compute-0 virtqemud[182477]: argument unsupported: QEMU guest agent is not configured
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.776 182939 DEBUG nova.virt.libvirt.guest [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.776 182939 DEBUG nova.virt.libvirt.driver [None req-119b87c6-e29b-40e2-ade9-16bc86f4c969 a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.796 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.801 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.804 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.849 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.850 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039680.7673998, 2c5b484c-19e7-47b1-bf93-fa599ddb6873 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.850 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] VM Started (Lifecycle Event)
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.862 182939 DEBUG nova.compute.manager [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.862 182939 DEBUG oslo_concurrency.lockutils [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.863 182939 DEBUG oslo_concurrency.lockutils [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.863 182939 DEBUG oslo_concurrency.lockutils [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.864 182939 DEBUG nova.compute.manager [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] No waiting events found dispatching network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.864 182939 WARNING nova.compute.manager [req-55a8046f-5a09-40e5-a631-da7f0f5567a4 req-ce910971-3a6c-4387-a979-522c849cafd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received unexpected event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 for instance with vm_state active and task_state resize_finish.
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.883 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:40 compute-0 nova_compute[182935]: 2026-01-21 23:54:40.887 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:54:41 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 21 23:54:41 compute-0 systemd[219629]: Activating special unit Exit the Session...
Jan 21 23:54:41 compute-0 systemd[219629]: Stopped target Main User Target.
Jan 21 23:54:41 compute-0 systemd[219629]: Stopped target Basic System.
Jan 21 23:54:41 compute-0 systemd[219629]: Stopped target Paths.
Jan 21 23:54:41 compute-0 systemd[219629]: Stopped target Sockets.
Jan 21 23:54:41 compute-0 systemd[219629]: Stopped target Timers.
Jan 21 23:54:41 compute-0 systemd[219629]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 21 23:54:41 compute-0 systemd[219629]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 21 23:54:41 compute-0 systemd[219629]: Closed D-Bus User Message Bus Socket.
Jan 21 23:54:41 compute-0 systemd[219629]: Stopped Create User's Volatile Files and Directories.
Jan 21 23:54:41 compute-0 systemd[219629]: Removed slice User Application Slice.
Jan 21 23:54:41 compute-0 systemd[219629]: Reached target Shutdown.
Jan 21 23:54:41 compute-0 systemd[219629]: Finished Exit the Session.
Jan 21 23:54:41 compute-0 systemd[219629]: Reached target Exit the Session.
Jan 21 23:54:41 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 21 23:54:41 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 21 23:54:41 compute-0 podman[219837]: 2026-01-21 23:54:41.135970029 +0000 UTC m=+0.053021467 container create 349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 21 23:54:41 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 21 23:54:41 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 21 23:54:41 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 21 23:54:41 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 21 23:54:41 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 21 23:54:41 compute-0 systemd[1]: Started libpod-conmon-349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef.scope.
Jan 21 23:54:41 compute-0 podman[219848]: 2026-01-21 23:54:41.198830871 +0000 UTC m=+0.064868360 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 21 23:54:41 compute-0 podman[219837]: 2026-01-21 23:54:41.109343893 +0000 UTC m=+0.026395381 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:54:41 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:54:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8d9e137e977d4a27abe3a295042a64a7117e7e1f4d38e64535c981e49853cf9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:54:41 compute-0 podman[219837]: 2026-01-21 23:54:41.220470737 +0000 UTC m=+0.137522195 container init 349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:54:41 compute-0 podman[219837]: 2026-01-21 23:54:41.225382745 +0000 UTC m=+0.142434173 container start 349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 21 23:54:41 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[219870]: [NOTICE]   (219876) : New worker (219878) forked
Jan 21 23:54:41 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[219870]: [NOTICE]   (219876) : Loading success.
Jan 21 23:54:42 compute-0 nova_compute[182935]: 2026-01-21 23:54:42.666 182939 DEBUG nova.network.neutron [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updated VIF entry in instance network info cache for port 2ad8a775-c03c-4a1b-919a-278faef8cb47. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:54:42 compute-0 nova_compute[182935]: 2026-01-21 23:54:42.668 182939 DEBUG nova.network.neutron [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating instance_info_cache with network_info: [{"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:42 compute-0 nova_compute[182935]: 2026-01-21 23:54:42.699 182939 DEBUG oslo_concurrency.lockutils [req-a60413fe-018c-48eb-962e-85fca4a59575 req-be40ca7b-d00d-481c-b79f-835d1663f184 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2c5b484c-19e7-47b1-bf93-fa599ddb6873" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:54:42 compute-0 nova_compute[182935]: 2026-01-21 23:54:42.983 182939 DEBUG nova.compute.manager [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:42 compute-0 nova_compute[182935]: 2026-01-21 23:54:42.984 182939 DEBUG oslo_concurrency.lockutils [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:42 compute-0 nova_compute[182935]: 2026-01-21 23:54:42.984 182939 DEBUG oslo_concurrency.lockutils [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:42 compute-0 nova_compute[182935]: 2026-01-21 23:54:42.985 182939 DEBUG oslo_concurrency.lockutils [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:42 compute-0 nova_compute[182935]: 2026-01-21 23:54:42.985 182939 DEBUG nova.compute.manager [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] No waiting events found dispatching network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:42 compute-0 nova_compute[182935]: 2026-01-21 23:54:42.986 182939 WARNING nova.compute.manager [req-0ae2b0da-55e1-4a6e-8646-45025146963d req-3284c023-e4f8-46a4-a8a5-350c14b06f87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received unexpected event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 for instance with vm_state resized and task_state None.
Jan 21 23:54:45 compute-0 nova_compute[182935]: 2026-01-21 23:54:45.226 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:45 compute-0 nova_compute[182935]: 2026-01-21 23:54:45.800 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:47 compute-0 nova_compute[182935]: 2026-01-21 23:54:47.730 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "6be88afb-f958-4578-a2a6-678e18a8fcef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:47 compute-0 nova_compute[182935]: 2026-01-21 23:54:47.731 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.025 182939 DEBUG nova.compute.manager [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.130 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.131 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.141 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.142 182939 INFO nova.compute.claims [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.298 182939 DEBUG nova.compute.provider_tree [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.317 182939 DEBUG nova.scheduler.client.report [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.340 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.341 182939 DEBUG nova.compute.manager [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.393 182939 DEBUG nova.compute.manager [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.394 182939 DEBUG nova.network.neutron [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.413 182939 INFO nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.443 182939 DEBUG nova.compute.manager [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.594 182939 DEBUG nova.compute.manager [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.596 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.597 182939 INFO nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Creating image(s)
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.597 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "/var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.598 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "/var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.599 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "/var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.617 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.676 182939 DEBUG nova.policy [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb46f340c44c473b9286568553cb6374', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8556453a9e6644b4b29f7e2585b6beb3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.682 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.683 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.684 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.700 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.758 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.760 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.796 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.797 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.797 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.852 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.854 182939 DEBUG nova.virt.disk.api [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Checking if we can resize image /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.855 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.910 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.912 182939 DEBUG nova.virt.disk.api [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Cannot resize image /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.913 182939 DEBUG nova.objects.instance [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lazy-loading 'migration_context' on Instance uuid 6be88afb-f958-4578-a2a6-678e18a8fcef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.928 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.929 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Ensure instance console log exists: /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.929 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.930 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:48 compute-0 nova_compute[182935]: 2026-01-21 23:54:48.930 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:49 compute-0 podman[219904]: 2026-01-21 23:54:49.704750943 +0000 UTC m=+0.069872870 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 21 23:54:49 compute-0 podman[219903]: 2026-01-21 23:54:49.704747133 +0000 UTC m=+0.072476622 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, 
io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.058 182939 DEBUG nova.network.neutron [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Successfully created port: 2d3c6c23-7c5a-4528-8281-b94a6216b9eb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.183 182939 DEBUG oslo_concurrency.lockutils [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.184 182939 DEBUG oslo_concurrency.lockutils [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.185 182939 DEBUG oslo_concurrency.lockutils [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.185 182939 DEBUG oslo_concurrency.lockutils [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.185 182939 DEBUG oslo_concurrency.lockutils [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.200 182939 INFO nova.compute.manager [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Terminating instance
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.210 182939 DEBUG nova.compute.manager [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:54:50 compute-0 kernel: tap2ad8a775-c0 (unregistering): left promiscuous mode
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.231 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:50 compute-0 NetworkManager[55139]: <info>  [1769039690.2330] device (tap2ad8a775-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.239 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:50 compute-0 ovn_controller[95047]: 2026-01-21T23:54:50Z|00211|binding|INFO|Releasing lport 2ad8a775-c03c-4a1b-919a-278faef8cb47 from this chassis (sb_readonly=0)
Jan 21 23:54:50 compute-0 ovn_controller[95047]: 2026-01-21T23:54:50Z|00212|binding|INFO|Setting lport 2ad8a775-c03c-4a1b-919a-278faef8cb47 down in Southbound
Jan 21 23:54:50 compute-0 ovn_controller[95047]: 2026-01-21T23:54:50Z|00213|binding|INFO|Removing iface tap2ad8a775-c0 ovn-installed in OVS
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.242 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.251 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:e7:d9 10.100.0.12'], port_security=['fa:16:3e:10:e7:d9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2c5b484c-19e7-47b1-bf93-fa599ddb6873', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b586c54-3322-410f-9bc9-972a63b8deff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09a5cf201e249f69f57cd4a632d1e2b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0a9884ba-4fab-4d1a-a8f1-d417efefef12', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f503744d-afcf-48c4-bcde-b001877de7d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=2ad8a775-c03c-4a1b-919a-278faef8cb47) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.255 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 2ad8a775-c03c-4a1b-919a-278faef8cb47 in datapath 7b586c54-3322-410f-9bc9-972a63b8deff unbound from our chassis
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.257 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.259 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b586c54-3322-410f-9bc9-972a63b8deff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.263 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b6b42f0e-9b5d-4f9f-9452-12143bcdaafc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.265 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff namespace which is not needed anymore
Jan 21 23:54:50 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000033.scope: Deactivated successfully.
Jan 21 23:54:50 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000033.scope: Consumed 9.985s CPU time.
Jan 21 23:54:50 compute-0 systemd-machined[154182]: Machine qemu-30-instance-00000033 terminated.
Jan 21 23:54:50 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[219870]: [NOTICE]   (219876) : haproxy version is 2.8.14-c23fe91
Jan 21 23:54:50 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[219870]: [NOTICE]   (219876) : path to executable is /usr/sbin/haproxy
Jan 21 23:54:50 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[219870]: [WARNING]  (219876) : Exiting Master process...
Jan 21 23:54:50 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[219870]: [WARNING]  (219876) : Exiting Master process...
Jan 21 23:54:50 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[219870]: [ALERT]    (219876) : Current worker (219878) exited with code 143 (Terminated)
Jan 21 23:54:50 compute-0 neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff[219870]: [WARNING]  (219876) : All workers exited. Exiting... (0)
Jan 21 23:54:50 compute-0 systemd[1]: libpod-349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef.scope: Deactivated successfully.
Jan 21 23:54:50 compute-0 podman[219967]: 2026-01-21 23:54:50.420770915 +0000 UTC m=+0.046473821 container died 349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:54:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-b8d9e137e977d4a27abe3a295042a64a7117e7e1f4d38e64535c981e49853cf9-merged.mount: Deactivated successfully.
Jan 21 23:54:50 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef-userdata-shm.mount: Deactivated successfully.
Jan 21 23:54:50 compute-0 podman[219967]: 2026-01-21 23:54:50.457522234 +0000 UTC m=+0.083225140 container cleanup 349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:54:50 compute-0 systemd[1]: libpod-conmon-349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef.scope: Deactivated successfully.
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.482 182939 INFO nova.virt.libvirt.driver [-] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Instance destroyed successfully.
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.483 182939 DEBUG nova.objects.instance [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lazy-loading 'resources' on Instance uuid 2c5b484c-19e7-47b1-bf93-fa599ddb6873 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.499 182939 DEBUG nova.virt.libvirt.vif [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:53:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1222136722',display_name='tempest-ServerDiskConfigTestJSON-server-1222136722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1222136722',id=51,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:54:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c09a5cf201e249f69f57cd4a632d1e2b',ramdisk_id='',reservation_id='r-wyf3b09s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1417790226',owner_user_name='tempest-ServerDiskConfigTestJSON-1417790226-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:54:48Z,user_data=None,user_id='a7fb6bdd938b4fcdb749b0bc4f86f97e',uuid=2c5b484c-19e7-47b1-bf93-fa599ddb6873,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.499 182939 DEBUG nova.network.os_vif_util [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converting VIF {"id": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "address": "fa:16:3e:10:e7:d9", "network": {"id": "7b586c54-3322-410f-9bc9-972a63b8deff", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1542056474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09a5cf201e249f69f57cd4a632d1e2b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ad8a775-c0", "ovs_interfaceid": "2ad8a775-c03c-4a1b-919a-278faef8cb47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.501 182939 DEBUG nova.network.os_vif_util [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.501 182939 DEBUG os_vif [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.502 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.503 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ad8a775-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.504 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.506 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.510 182939 INFO os_vif [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:10:e7:d9,bridge_name='br-int',has_traffic_filtering=True,id=2ad8a775-c03c-4a1b-919a-278faef8cb47,network=Network(7b586c54-3322-410f-9bc9-972a63b8deff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ad8a775-c0')
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.511 182939 INFO nova.virt.libvirt.driver [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Deleting instance files /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873_del
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.517 182939 INFO nova.virt.libvirt.driver [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Deletion of /var/lib/nova/instances/2c5b484c-19e7-47b1-bf93-fa599ddb6873_del complete
Jan 21 23:54:50 compute-0 podman[220013]: 2026-01-21 23:54:50.522948046 +0000 UTC m=+0.041455751 container remove 349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.528 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[891c5f4e-1805-4f07-bc96-d46788995425]: (4, ('Wed Jan 21 11:54:50 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef)\n349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef\nWed Jan 21 11:54:50 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff (349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef)\n349d7d99db9c3f47c22bc1e5e77888d4ad03c3542e9f142ed5f4f073a5fc50ef\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.531 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[269b0bb7-3ccf-4d24-96bb-5ad6d70389bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.532 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b586c54-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.533 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:50 compute-0 kernel: tap7b586c54-30: left promiscuous mode
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.546 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.550 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d13e2dbf-004f-4088-9e33-3859397cee8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.568 182939 DEBUG nova.compute.manager [req-fb3a52f1-4fb6-40ea-8507-409809e130b4 req-e02d8253-1c58-4c55-b2c6-6fbd97f61b48 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-unplugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.568 182939 DEBUG oslo_concurrency.lockutils [req-fb3a52f1-4fb6-40ea-8507-409809e130b4 req-e02d8253-1c58-4c55-b2c6-6fbd97f61b48 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.568 182939 DEBUG oslo_concurrency.lockutils [req-fb3a52f1-4fb6-40ea-8507-409809e130b4 req-e02d8253-1c58-4c55-b2c6-6fbd97f61b48 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.569 182939 DEBUG oslo_concurrency.lockutils [req-fb3a52f1-4fb6-40ea-8507-409809e130b4 req-e02d8253-1c58-4c55-b2c6-6fbd97f61b48 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.569 182939 DEBUG nova.compute.manager [req-fb3a52f1-4fb6-40ea-8507-409809e130b4 req-e02d8253-1c58-4c55-b2c6-6fbd97f61b48 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] No waiting events found dispatching network-vif-unplugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.568 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[359ac885-4bed-4d49-8bdb-58c3c2b3ea99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.569 182939 DEBUG nova.compute.manager [req-fb3a52f1-4fb6-40ea-8507-409809e130b4 req-e02d8253-1c58-4c55-b2c6-6fbd97f61b48 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-unplugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.569 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2d8b1b15-a76a-4aee-9835-bf83579fd078]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.588 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed5a026-667f-4290-9b65-920d781dcc60]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 418321, 'reachable_time': 35076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220028, 'error': None, 'target': 'ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.591 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b586c54-3322-410f-9bc9-972a63b8deff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:54:50 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:50.591 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[3de00371-2c9f-4f1a-9e0b-68d5a79eaa38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:50 compute-0 systemd[1]: run-netns-ovnmeta\x2d7b586c54\x2d3322\x2d410f\x2d9bc9\x2d972a63b8deff.mount: Deactivated successfully.
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.595 182939 INFO nova.compute.manager [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.596 182939 DEBUG oslo.service.loopingcall [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.596 182939 DEBUG nova.compute.manager [-] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.596 182939 DEBUG nova.network.neutron [-] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:54:50 compute-0 nova_compute[182935]: 2026-01-21 23:54:50.801 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.025 182939 DEBUG nova.network.neutron [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Successfully updated port: 2d3c6c23-7c5a-4528-8281-b94a6216b9eb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.047 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.047 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquired lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.048 182939 DEBUG nova.network.neutron [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.145 182939 DEBUG nova.compute.manager [req-610d36ae-f65b-45b1-b459-22790a66160c req-d92ec29b-9da7-4f0f-8834-6e214313bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Received event network-changed-2d3c6c23-7c5a-4528-8281-b94a6216b9eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.145 182939 DEBUG nova.compute.manager [req-610d36ae-f65b-45b1-b459-22790a66160c req-d92ec29b-9da7-4f0f-8834-6e214313bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Refreshing instance network info cache due to event network-changed-2d3c6c23-7c5a-4528-8281-b94a6216b9eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.145 182939 DEBUG oslo_concurrency.lockutils [req-610d36ae-f65b-45b1-b459-22790a66160c req-d92ec29b-9da7-4f0f-8834-6e214313bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.239 182939 DEBUG nova.network.neutron [-] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.255 182939 INFO nova.compute.manager [-] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Took 0.66 seconds to deallocate network for instance.
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.367 182939 DEBUG oslo_concurrency.lockutils [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.368 182939 DEBUG oslo_concurrency.lockutils [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.374 182939 DEBUG oslo_concurrency.lockutils [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.385 182939 DEBUG nova.compute.manager [req-f0e0f6be-b895-4713-82e1-d16f9acc9ba0 req-70f8da62-142f-4131-8165-aa7b68b4e8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-deleted-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.389 182939 DEBUG nova.network.neutron [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.405 182939 INFO nova.scheduler.client.report [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Deleted allocations for instance 2c5b484c-19e7-47b1-bf93-fa599ddb6873
Jan 21 23:54:51 compute-0 nova_compute[182935]: 2026-01-21 23:54:51.498 182939 DEBUG oslo_concurrency.lockutils [None req-cd92e45a-21db-41b4-8710-8f4dc2d820bf a7fb6bdd938b4fcdb749b0bc4f86f97e c09a5cf201e249f69f57cd4a632d1e2b - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.657 182939 DEBUG nova.network.neutron [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Updating instance_info_cache with network_info: [{"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.683 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Releasing lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.683 182939 DEBUG nova.compute.manager [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Instance network_info: |[{"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.685 182939 DEBUG oslo_concurrency.lockutils [req-610d36ae-f65b-45b1-b459-22790a66160c req-d92ec29b-9da7-4f0f-8834-6e214313bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.685 182939 DEBUG nova.network.neutron [req-610d36ae-f65b-45b1-b459-22790a66160c req-d92ec29b-9da7-4f0f-8834-6e214313bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Refreshing network info cache for port 2d3c6c23-7c5a-4528-8281-b94a6216b9eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.690 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Start _get_guest_xml network_info=[{"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.695 182939 DEBUG nova.compute.manager [req-01ce9943-3c38-4573-a6db-6eac2dd8d9ec req-fe20b51c-31b1-4905-a21f-6261a0f15701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.696 182939 DEBUG oslo_concurrency.lockutils [req-01ce9943-3c38-4573-a6db-6eac2dd8d9ec req-fe20b51c-31b1-4905-a21f-6261a0f15701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.696 182939 DEBUG oslo_concurrency.lockutils [req-01ce9943-3c38-4573-a6db-6eac2dd8d9ec req-fe20b51c-31b1-4905-a21f-6261a0f15701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.697 182939 DEBUG oslo_concurrency.lockutils [req-01ce9943-3c38-4573-a6db-6eac2dd8d9ec req-fe20b51c-31b1-4905-a21f-6261a0f15701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2c5b484c-19e7-47b1-bf93-fa599ddb6873-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.697 182939 DEBUG nova.compute.manager [req-01ce9943-3c38-4573-a6db-6eac2dd8d9ec req-fe20b51c-31b1-4905-a21f-6261a0f15701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] No waiting events found dispatching network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.698 182939 WARNING nova.compute.manager [req-01ce9943-3c38-4573-a6db-6eac2dd8d9ec req-fe20b51c-31b1-4905-a21f-6261a0f15701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Received unexpected event network-vif-plugged-2ad8a775-c03c-4a1b-919a-278faef8cb47 for instance with vm_state deleted and task_state None.
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.715 182939 WARNING nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.722 182939 DEBUG nova.virt.libvirt.host [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.723 182939 DEBUG nova.virt.libvirt.host [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.728 182939 DEBUG nova.virt.libvirt.host [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.729 182939 DEBUG nova.virt.libvirt.host [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.731 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.731 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.732 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.733 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.733 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.734 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.734 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.735 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.736 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.736 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.736 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.737 182939 DEBUG nova.virt.hardware [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.741 182939 DEBUG nova.virt.libvirt.vif [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:54:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1186307028',display_name='tempest-SecurityGroupsTestJSON-server-1186307028',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1186307028',id=54,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8556453a9e6644b4b29f7e2585b6beb3',ramdisk_id='',reservation_id='r-21srab01',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-744520065',owner_user_name='tempest-SecurityGroupsTestJ
SON-744520065-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:54:48Z,user_data=None,user_id='fb46f340c44c473b9286568553cb6374',uuid=6be88afb-f958-4578-a2a6-678e18a8fcef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.741 182939 DEBUG nova.network.os_vif_util [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converting VIF {"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.742 182939 DEBUG nova.network.os_vif_util [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:38:62,bridge_name='br-int',has_traffic_filtering=True,id=2d3c6c23-7c5a-4528-8281-b94a6216b9eb,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d3c6c23-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.743 182939 DEBUG nova.objects.instance [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6be88afb-f958-4578-a2a6-678e18a8fcef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.763 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:54:52 compute-0 nova_compute[182935]:   <uuid>6be88afb-f958-4578-a2a6-678e18a8fcef</uuid>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   <name>instance-00000036</name>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <nova:name>tempest-SecurityGroupsTestJSON-server-1186307028</nova:name>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:54:52</nova:creationTime>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:54:52 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:54:52 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:54:52 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:54:52 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:54:52 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:54:52 compute-0 nova_compute[182935]:         <nova:user uuid="fb46f340c44c473b9286568553cb6374">tempest-SecurityGroupsTestJSON-744520065-project-member</nova:user>
Jan 21 23:54:52 compute-0 nova_compute[182935]:         <nova:project uuid="8556453a9e6644b4b29f7e2585b6beb3">tempest-SecurityGroupsTestJSON-744520065</nova:project>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:54:52 compute-0 nova_compute[182935]:         <nova:port uuid="2d3c6c23-7c5a-4528-8281-b94a6216b9eb">
Jan 21 23:54:52 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <system>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <entry name="serial">6be88afb-f958-4578-a2a6-678e18a8fcef</entry>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <entry name="uuid">6be88afb-f958-4578-a2a6-678e18a8fcef</entry>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     </system>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   <os>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   </os>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   <features>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   </features>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk.config"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:3f:38:62"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <target dev="tap2d3c6c23-7c"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/console.log" append="off"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <video>
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     </video>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:54:52 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:54:52 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:54:52 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:54:52 compute-0 nova_compute[182935]: </domain>
Jan 21 23:54:52 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.764 182939 DEBUG nova.compute.manager [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Preparing to wait for external event network-vif-plugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.765 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.765 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.765 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.766 182939 DEBUG nova.virt.libvirt.vif [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:54:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1186307028',display_name='tempest-SecurityGroupsTestJSON-server-1186307028',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1186307028',id=54,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8556453a9e6644b4b29f7e2585b6beb3',ramdisk_id='',reservation_id='r-21srab01',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-744520065',owner_user_name='tempest-SecurityG
roupsTestJSON-744520065-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:54:48Z,user_data=None,user_id='fb46f340c44c473b9286568553cb6374',uuid=6be88afb-f958-4578-a2a6-678e18a8fcef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.766 182939 DEBUG nova.network.os_vif_util [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converting VIF {"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.766 182939 DEBUG nova.network.os_vif_util [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:38:62,bridge_name='br-int',has_traffic_filtering=True,id=2d3c6c23-7c5a-4528-8281-b94a6216b9eb,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d3c6c23-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.767 182939 DEBUG os_vif [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:38:62,bridge_name='br-int',has_traffic_filtering=True,id=2d3c6c23-7c5a-4528-8281-b94a6216b9eb,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d3c6c23-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.767 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.768 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.768 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.772 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.772 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2d3c6c23-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.773 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2d3c6c23-7c, col_values=(('external_ids', {'iface-id': '2d3c6c23-7c5a-4528-8281-b94a6216b9eb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3f:38:62', 'vm-uuid': '6be88afb-f958-4578-a2a6-678e18a8fcef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.774 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.776 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:54:52 compute-0 NetworkManager[55139]: <info>  [1769039692.7760] manager: (tap2d3c6c23-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.783 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.783 182939 INFO os_vif [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:38:62,bridge_name='br-int',has_traffic_filtering=True,id=2d3c6c23-7c5a-4528-8281-b94a6216b9eb,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d3c6c23-7c')
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.860 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.861 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.861 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] No VIF found with MAC fa:16:3e:3f:38:62, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:54:52 compute-0 nova_compute[182935]: 2026-01-21 23:54:52.862 182939 INFO nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Using config drive
Jan 21 23:54:53 compute-0 nova_compute[182935]: 2026-01-21 23:54:53.404 182939 INFO nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Creating config drive at /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk.config
Jan 21 23:54:53 compute-0 nova_compute[182935]: 2026-01-21 23:54:53.412 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpphseeu1t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:54:53 compute-0 nova_compute[182935]: 2026-01-21 23:54:53.544 182939 DEBUG oslo_concurrency.processutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpphseeu1t" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:54:53 compute-0 kernel: tap2d3c6c23-7c: entered promiscuous mode
Jan 21 23:54:53 compute-0 ovn_controller[95047]: 2026-01-21T23:54:53Z|00214|binding|INFO|Claiming lport 2d3c6c23-7c5a-4528-8281-b94a6216b9eb for this chassis.
Jan 21 23:54:53 compute-0 ovn_controller[95047]: 2026-01-21T23:54:53Z|00215|binding|INFO|2d3c6c23-7c5a-4528-8281-b94a6216b9eb: Claiming fa:16:3e:3f:38:62 10.100.0.9
Jan 21 23:54:53 compute-0 NetworkManager[55139]: <info>  [1769039693.6081] manager: (tap2d3c6c23-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Jan 21 23:54:53 compute-0 nova_compute[182935]: 2026-01-21 23:54:53.608 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.623 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:38:62 10.100.0.9'], port_security=['fa:16:3e:3f:38:62 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6be88afb-f958-4578-a2a6-678e18a8fcef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8556453a9e6644b4b29f7e2585b6beb3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '07a1a0ce-5790-4d2e-8869-adf91647e1de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c83a09c2-c943-4d92-aedc-1a1adb93cc19, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=2d3c6c23-7c5a-4528-8281-b94a6216b9eb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.624 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 2d3c6c23-7c5a-4528-8281-b94a6216b9eb in datapath f32b0ae0-64b5-4b08-b029-da33b7e8f96a bound to our chassis
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.625 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f32b0ae0-64b5-4b08-b029-da33b7e8f96a
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.634 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[20633c44-beee-4685-8d48-af154f9f6bc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.635 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf32b0ae0-61 in ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.638 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf32b0ae0-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.638 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8a147393-27ca-4f20-aa60-0ae49ba858db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 systemd-udevd[220048]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.639 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[160ef887-d4f8-40bf-ba5e-e2c70dca4bda]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.650 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[656679ad-069b-4c04-ae27-b0fd63694207]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 NetworkManager[55139]: <info>  [1769039693.6519] device (tap2d3c6c23-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:54:53 compute-0 NetworkManager[55139]: <info>  [1769039693.6544] device (tap2d3c6c23-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:54:53 compute-0 systemd-machined[154182]: New machine qemu-31-instance-00000036.
Jan 21 23:54:53 compute-0 nova_compute[182935]: 2026-01-21 23:54:53.679 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.681 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3f701e-3c6b-4d4b-a76e-8847eb74ae4f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 nova_compute[182935]: 2026-01-21 23:54:53.684 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:53 compute-0 ovn_controller[95047]: 2026-01-21T23:54:53Z|00216|binding|INFO|Setting lport 2d3c6c23-7c5a-4528-8281-b94a6216b9eb ovn-installed in OVS
Jan 21 23:54:53 compute-0 ovn_controller[95047]: 2026-01-21T23:54:53Z|00217|binding|INFO|Setting lport 2d3c6c23-7c5a-4528-8281-b94a6216b9eb up in Southbound
Jan 21 23:54:53 compute-0 nova_compute[182935]: 2026-01-21 23:54:53.687 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:53 compute-0 systemd[1]: Started Virtual Machine qemu-31-instance-00000036.
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.717 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc8d37e-0e11-462b-8494-afe9b784401c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.723 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2c091139-7a94-4032-a2e2-23f54182a79e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 NetworkManager[55139]: <info>  [1769039693.7240] manager: (tapf32b0ae0-60): new Veth device (/org/freedesktop/NetworkManager/Devices/100)
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.759 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c77f45bf-d55f-4008-b7d8-044a03bdf7fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.764 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9e375627-deca-46d8-b05b-c89f5d327bbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 NetworkManager[55139]: <info>  [1769039693.7879] device (tapf32b0ae0-60): carrier: link connected
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.794 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d113a3a6-b98a-45f3-9b49-ec85e43ef7e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.811 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[31102fff-8a32-4023-9dee-b72801218451]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf32b0ae0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:84:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 419652, 'reachable_time': 19778, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220082, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.828 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[20d1b281-9a63-4232-8d64-45b646bda66a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe63:84cb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 419652, 'tstamp': 419652}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220083, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.846 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[970be0a1-a8f4-456e-89dc-2a447cc6e8b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf32b0ae0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:84:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 419652, 'reachable_time': 19778, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220084, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.879 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bc7957fb-e8ba-4818-b8d9-cf2b17b3a88c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.937 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ffa17f46-3e83-4707-8614-029a110300e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.939 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf32b0ae0-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.940 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.940 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf32b0ae0-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:53 compute-0 kernel: tapf32b0ae0-60: entered promiscuous mode
Jan 21 23:54:53 compute-0 nova_compute[182935]: 2026-01-21 23:54:53.942 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:53 compute-0 NetworkManager[55139]: <info>  [1769039693.9429] manager: (tapf32b0ae0-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.950 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf32b0ae0-60, col_values=(('external_ids', {'iface-id': '6cc0b80c-0a82-45e0-b8cc-ede53f6f4c47'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:54:53 compute-0 nova_compute[182935]: 2026-01-21 23:54:53.952 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:53 compute-0 ovn_controller[95047]: 2026-01-21T23:54:53Z|00218|binding|INFO|Releasing lport 6cc0b80c-0a82-45e0-b8cc-ede53f6f4c47 from this chassis (sb_readonly=0)
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.956 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.957 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fe2c2844-187a-41e8-9427-daea55a3bb60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.958 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-f32b0ae0-64b5-4b08-b029-da33b7e8f96a
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.pid.haproxy
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID f32b0ae0-64b5-4b08-b029-da33b7e8f96a
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:54:53 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:54:53.959 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'env', 'PROCESS_TAG=haproxy-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f32b0ae0-64b5-4b08-b029-da33b7e8f96a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:54:53 compute-0 nova_compute[182935]: 2026-01-21 23:54:53.966 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.098 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039694.0976546, 6be88afb-f958-4578-a2a6-678e18a8fcef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.098 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] VM Started (Lifecycle Event)
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.127 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.131 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039694.0977755, 6be88afb-f958-4578-a2a6-678e18a8fcef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.132 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] VM Paused (Lifecycle Event)
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.154 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.158 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.180 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.217 182939 DEBUG nova.compute.manager [req-b09475d0-0a60-4f0e-9e81-abeac8f4e9f8 req-26014433-b4a1-4305-84a5-df874a453b62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Received event network-vif-plugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.218 182939 DEBUG oslo_concurrency.lockutils [req-b09475d0-0a60-4f0e-9e81-abeac8f4e9f8 req-26014433-b4a1-4305-84a5-df874a453b62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.219 182939 DEBUG oslo_concurrency.lockutils [req-b09475d0-0a60-4f0e-9e81-abeac8f4e9f8 req-26014433-b4a1-4305-84a5-df874a453b62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.219 182939 DEBUG oslo_concurrency.lockutils [req-b09475d0-0a60-4f0e-9e81-abeac8f4e9f8 req-26014433-b4a1-4305-84a5-df874a453b62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.219 182939 DEBUG nova.compute.manager [req-b09475d0-0a60-4f0e-9e81-abeac8f4e9f8 req-26014433-b4a1-4305-84a5-df874a453b62 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Processing event network-vif-plugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.220 182939 DEBUG nova.compute.manager [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.226 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039694.2260969, 6be88afb-f958-4578-a2a6-678e18a8fcef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.226 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] VM Resumed (Lifecycle Event)
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.230 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.233 182939 INFO nova.virt.libvirt.driver [-] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Instance spawned successfully.
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.233 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.267 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.272 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.283 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.283 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.284 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.285 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.285 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.286 182939 DEBUG nova.virt.libvirt.driver [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:54:54 compute-0 podman[220121]: 2026-01-21 23:54:54.309085338 +0000 UTC m=+0.050918968 container create 8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 23:54:54 compute-0 systemd[1]: Started libpod-conmon-8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f.scope.
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.353 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:54:54 compute-0 podman[220121]: 2026-01-21 23:54:54.283845175 +0000 UTC m=+0.025678855 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.382 182939 INFO nova.compute.manager [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Took 5.79 seconds to spawn the instance on the hypervisor.
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.383 182939 DEBUG nova.compute.manager [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:54:54 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:54:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5dea9752c85e3325e70d821233a2883807eeaf626296f19a66ff75a40b4be8b5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:54:54 compute-0 podman[220121]: 2026-01-21 23:54:54.402860118 +0000 UTC m=+0.144693768 container init 8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 21 23:54:54 compute-0 podman[220121]: 2026-01-21 23:54:54.411957115 +0000 UTC m=+0.153790745 container start 8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:54:54 compute-0 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[220136]: [NOTICE]   (220140) : New worker (220142) forked
Jan 21 23:54:54 compute-0 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[220136]: [NOTICE]   (220140) : Loading success.
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.479 182939 INFO nova.compute.manager [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Took 6.38 seconds to build instance.
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.498 182939 DEBUG oslo_concurrency.lockutils [None req-e0dd4302-bbc0-4a6e-9234-06e9344e70a3 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.694 182939 DEBUG nova.network.neutron [req-610d36ae-f65b-45b1-b459-22790a66160c req-d92ec29b-9da7-4f0f-8834-6e214313bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Updated VIF entry in instance network info cache for port 2d3c6c23-7c5a-4528-8281-b94a6216b9eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.695 182939 DEBUG nova.network.neutron [req-610d36ae-f65b-45b1-b459-22790a66160c req-d92ec29b-9da7-4f0f-8834-6e214313bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Updating instance_info_cache with network_info: [{"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:54 compute-0 nova_compute[182935]: 2026-01-21 23:54:54.714 182939 DEBUG oslo_concurrency.lockutils [req-610d36ae-f65b-45b1-b459-22790a66160c req-d92ec29b-9da7-4f0f-8834-6e214313bca8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:54:55 compute-0 nova_compute[182935]: 2026-01-21 23:54:55.807 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:56 compute-0 nova_compute[182935]: 2026-01-21 23:54:56.426 182939 DEBUG nova.compute.manager [req-a615351e-b728-4d7c-8c0d-1294096949cb req-4320e489-146c-4d0f-a8d4-f6eb15ad5ad6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Received event network-vif-plugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:56 compute-0 nova_compute[182935]: 2026-01-21 23:54:56.427 182939 DEBUG oslo_concurrency.lockutils [req-a615351e-b728-4d7c-8c0d-1294096949cb req-4320e489-146c-4d0f-a8d4-f6eb15ad5ad6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:54:56 compute-0 nova_compute[182935]: 2026-01-21 23:54:56.428 182939 DEBUG oslo_concurrency.lockutils [req-a615351e-b728-4d7c-8c0d-1294096949cb req-4320e489-146c-4d0f-a8d4-f6eb15ad5ad6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:54:56 compute-0 nova_compute[182935]: 2026-01-21 23:54:56.428 182939 DEBUG oslo_concurrency.lockutils [req-a615351e-b728-4d7c-8c0d-1294096949cb req-4320e489-146c-4d0f-a8d4-f6eb15ad5ad6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:54:56 compute-0 nova_compute[182935]: 2026-01-21 23:54:56.428 182939 DEBUG nova.compute.manager [req-a615351e-b728-4d7c-8c0d-1294096949cb req-4320e489-146c-4d0f-a8d4-f6eb15ad5ad6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] No waiting events found dispatching network-vif-plugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:54:56 compute-0 nova_compute[182935]: 2026-01-21 23:54:56.429 182939 WARNING nova.compute.manager [req-a615351e-b728-4d7c-8c0d-1294096949cb req-4320e489-146c-4d0f-a8d4-f6eb15ad5ad6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Received unexpected event network-vif-plugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb for instance with vm_state active and task_state None.
Jan 21 23:54:57 compute-0 nova_compute[182935]: 2026-01-21 23:54:57.495 182939 DEBUG nova.compute.manager [req-e754b68c-ae3a-461b-8b03-a8900917568c req-b1eebe5f-2497-4b68-9c8b-828bf859cab7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Received event network-changed-2d3c6c23-7c5a-4528-8281-b94a6216b9eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:57 compute-0 nova_compute[182935]: 2026-01-21 23:54:57.496 182939 DEBUG nova.compute.manager [req-e754b68c-ae3a-461b-8b03-a8900917568c req-b1eebe5f-2497-4b68-9c8b-828bf859cab7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Refreshing instance network info cache due to event network-changed-2d3c6c23-7c5a-4528-8281-b94a6216b9eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:54:57 compute-0 nova_compute[182935]: 2026-01-21 23:54:57.496 182939 DEBUG oslo_concurrency.lockutils [req-e754b68c-ae3a-461b-8b03-a8900917568c req-b1eebe5f-2497-4b68-9c8b-828bf859cab7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:54:57 compute-0 nova_compute[182935]: 2026-01-21 23:54:57.497 182939 DEBUG oslo_concurrency.lockutils [req-e754b68c-ae3a-461b-8b03-a8900917568c req-b1eebe5f-2497-4b68-9c8b-828bf859cab7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:54:57 compute-0 nova_compute[182935]: 2026-01-21 23:54:57.497 182939 DEBUG nova.network.neutron [req-e754b68c-ae3a-461b-8b03-a8900917568c req-b1eebe5f-2497-4b68-9c8b-828bf859cab7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Refreshing network info cache for port 2d3c6c23-7c5a-4528-8281-b94a6216b9eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:54:57 compute-0 nova_compute[182935]: 2026-01-21 23:54:57.775 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:54:58 compute-0 nova_compute[182935]: 2026-01-21 23:54:58.635 182939 DEBUG nova.compute.manager [req-d32db256-53ce-4301-a29e-4f407169a4df req-f1a35d31-1802-4607-ba37-4d1d6da3fb33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Received event network-changed-2d3c6c23-7c5a-4528-8281-b94a6216b9eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:54:58 compute-0 nova_compute[182935]: 2026-01-21 23:54:58.637 182939 DEBUG nova.compute.manager [req-d32db256-53ce-4301-a29e-4f407169a4df req-f1a35d31-1802-4607-ba37-4d1d6da3fb33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Refreshing instance network info cache due to event network-changed-2d3c6c23-7c5a-4528-8281-b94a6216b9eb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:54:58 compute-0 nova_compute[182935]: 2026-01-21 23:54:58.637 182939 DEBUG oslo_concurrency.lockutils [req-d32db256-53ce-4301-a29e-4f407169a4df req-f1a35d31-1802-4607-ba37-4d1d6da3fb33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:54:59 compute-0 nova_compute[182935]: 2026-01-21 23:54:59.140 182939 DEBUG nova.network.neutron [req-e754b68c-ae3a-461b-8b03-a8900917568c req-b1eebe5f-2497-4b68-9c8b-828bf859cab7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Updated VIF entry in instance network info cache for port 2d3c6c23-7c5a-4528-8281-b94a6216b9eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:54:59 compute-0 nova_compute[182935]: 2026-01-21 23:54:59.141 182939 DEBUG nova.network.neutron [req-e754b68c-ae3a-461b-8b03-a8900917568c req-b1eebe5f-2497-4b68-9c8b-828bf859cab7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Updating instance_info_cache with network_info: [{"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:54:59 compute-0 nova_compute[182935]: 2026-01-21 23:54:59.188 182939 DEBUG oslo_concurrency.lockutils [req-e754b68c-ae3a-461b-8b03-a8900917568c req-b1eebe5f-2497-4b68-9c8b-828bf859cab7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:54:59 compute-0 nova_compute[182935]: 2026-01-21 23:54:59.190 182939 DEBUG oslo_concurrency.lockutils [req-d32db256-53ce-4301-a29e-4f407169a4df req-f1a35d31-1802-4607-ba37-4d1d6da3fb33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:54:59 compute-0 nova_compute[182935]: 2026-01-21 23:54:59.191 182939 DEBUG nova.network.neutron [req-d32db256-53ce-4301-a29e-4f407169a4df req-f1a35d31-1802-4607-ba37-4d1d6da3fb33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Refreshing network info cache for port 2d3c6c23-7c5a-4528-8281-b94a6216b9eb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:55:00 compute-0 podman[220152]: 2026-01-21 23:55:00.692892774 +0000 UTC m=+0.060606748 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:55:00 compute-0 podman[220151]: 2026-01-21 23:55:00.740047991 +0000 UTC m=+0.104014646 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 21 23:55:00 compute-0 nova_compute[182935]: 2026-01-21 23:55:00.807 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:00.875 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:55:00 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:00.877 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:55:00 compute-0 nova_compute[182935]: 2026-01-21 23:55:00.878 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:02 compute-0 sshd-session[220201]: Invalid user weblogic from 188.166.69.60 port 45614
Jan 21 23:55:02 compute-0 sshd-session[220201]: Connection closed by invalid user weblogic 188.166.69.60 port 45614 [preauth]
Jan 21 23:55:02 compute-0 nova_compute[182935]: 2026-01-21 23:55:02.778 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:03.188 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:03.190 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:03.190 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:03 compute-0 nova_compute[182935]: 2026-01-21 23:55:03.299 182939 DEBUG nova.network.neutron [req-d32db256-53ce-4301-a29e-4f407169a4df req-f1a35d31-1802-4607-ba37-4d1d6da3fb33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Updated VIF entry in instance network info cache for port 2d3c6c23-7c5a-4528-8281-b94a6216b9eb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:55:03 compute-0 nova_compute[182935]: 2026-01-21 23:55:03.300 182939 DEBUG nova.network.neutron [req-d32db256-53ce-4301-a29e-4f407169a4df req-f1a35d31-1802-4607-ba37-4d1d6da3fb33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Updating instance_info_cache with network_info: [{"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:03 compute-0 nova_compute[182935]: 2026-01-21 23:55:03.326 182939 DEBUG oslo_concurrency.lockutils [req-d32db256-53ce-4301-a29e-4f407169a4df req-f1a35d31-1802-4607-ba37-4d1d6da3fb33 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:05 compute-0 nova_compute[182935]: 2026-01-21 23:55:05.481 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039690.4801514, 2c5b484c-19e7-47b1-bf93-fa599ddb6873 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:05 compute-0 nova_compute[182935]: 2026-01-21 23:55:05.482 182939 INFO nova.compute.manager [-] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] VM Stopped (Lifecycle Event)
Jan 21 23:55:05 compute-0 nova_compute[182935]: 2026-01-21 23:55:05.518 182939 DEBUG nova.compute.manager [None req-7a018ad2-ea08-4b3d-a0c9-df0e370f5437 - - - - - -] [instance: 2c5b484c-19e7-47b1-bf93-fa599ddb6873] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:05 compute-0 nova_compute[182935]: 2026-01-21 23:55:05.850 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:07 compute-0 podman[220221]: 2026-01-21 23:55:07.717593159 +0000 UTC m=+0.091035645 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:55:07 compute-0 nova_compute[182935]: 2026-01-21 23:55:07.783 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:08 compute-0 ovn_controller[95047]: 2026-01-21T23:55:08Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3f:38:62 10.100.0.9
Jan 21 23:55:08 compute-0 ovn_controller[95047]: 2026-01-21T23:55:08Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3f:38:62 10.100.0.9
Jan 21 23:55:10 compute-0 nova_compute[182935]: 2026-01-21 23:55:10.853 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:10.878 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:11 compute-0 podman[220246]: 2026-01-21 23:55:11.706035073 +0000 UTC m=+0.070757352 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 23:55:12 compute-0 nova_compute[182935]: 2026-01-21 23:55:12.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:15 compute-0 nova_compute[182935]: 2026-01-21 23:55:15.855 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:17 compute-0 nova_compute[182935]: 2026-01-21 23:55:17.788 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:19 compute-0 ovn_controller[95047]: 2026-01-21T23:55:19Z|00219|binding|INFO|Releasing lport 6cc0b80c-0a82-45e0-b8cc-ede53f6f4c47 from this chassis (sb_readonly=0)
Jan 21 23:55:19 compute-0 nova_compute[182935]: 2026-01-21 23:55:19.770 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:20 compute-0 podman[220267]: 2026-01-21 23:55:20.707771678 +0000 UTC m=+0.069076171 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 21 23:55:20 compute-0 podman[220266]: 2026-01-21 23:55:20.732329764 +0000 UTC m=+0.096449844 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, distribution-scope=public, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 21 23:55:20 compute-0 nova_compute[182935]: 2026-01-21 23:55:20.894 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:21 compute-0 nova_compute[182935]: 2026-01-21 23:55:21.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:22 compute-0 nova_compute[182935]: 2026-01-21 23:55:22.790 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:25 compute-0 nova_compute[182935]: 2026-01-21 23:55:25.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:25 compute-0 nova_compute[182935]: 2026-01-21 23:55:25.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:55:25 compute-0 nova_compute[182935]: 2026-01-21 23:55:25.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:55:25 compute-0 nova_compute[182935]: 2026-01-21 23:55:25.896 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:25 compute-0 nova_compute[182935]: 2026-01-21 23:55:25.990 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:25 compute-0 nova_compute[182935]: 2026-01-21 23:55:25.991 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:25 compute-0 nova_compute[182935]: 2026-01-21 23:55:25.992 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:55:25 compute-0 nova_compute[182935]: 2026-01-21 23:55:25.992 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6be88afb-f958-4578-a2a6-678e18a8fcef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.387 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Updating instance_info_cache with network_info: [{"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.408 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-6be88afb-f958-4578-a2a6-678e18a8fcef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.408 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.409 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.409 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.410 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.410 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.450 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.451 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.451 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.451 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.530 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.593 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.594 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.658 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.794 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.882 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.884 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5556MB free_disk=73.2424201965332GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.885 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.885 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.973 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 6be88afb-f958-4578-a2a6-678e18a8fcef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.973 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:55:27 compute-0 nova_compute[182935]: 2026-01-21 23:55:27.974 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:55:28 compute-0 nova_compute[182935]: 2026-01-21 23:55:28.035 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:55:28 compute-0 nova_compute[182935]: 2026-01-21 23:55:28.065 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:55:28 compute-0 nova_compute[182935]: 2026-01-21 23:55:28.110 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:55:28 compute-0 nova_compute[182935]: 2026-01-21 23:55:28.111 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:28 compute-0 nova_compute[182935]: 2026-01-21 23:55:28.495 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:28 compute-0 nova_compute[182935]: 2026-01-21 23:55:28.496 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:55:28 compute-0 nova_compute[182935]: 2026-01-21 23:55:28.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:30 compute-0 nova_compute[182935]: 2026-01-21 23:55:30.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:30 compute-0 nova_compute[182935]: 2026-01-21 23:55:30.899 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:31 compute-0 podman[220312]: 2026-01-21 23:55:31.727725026 +0000 UTC m=+0.085330538 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:55:31 compute-0 podman[220311]: 2026-01-21 23:55:31.794580613 +0000 UTC m=+0.157292577 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 21 23:55:32 compute-0 nova_compute[182935]: 2026-01-21 23:55:32.798 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:35 compute-0 nova_compute[182935]: 2026-01-21 23:55:35.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:55:35 compute-0 nova_compute[182935]: 2026-01-21 23:55:35.901 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:37 compute-0 nova_compute[182935]: 2026-01-21 23:55:37.810 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:38 compute-0 podman[220360]: 2026-01-21 23:55:38.687747173 +0000 UTC m=+0.060117946 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.226 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "70e91a38-1e04-4d71-93e1-4b946f228d7e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.227 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.265 182939 DEBUG nova.compute.manager [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.373 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.374 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.380 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.380 182939 INFO nova.compute.claims [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.509 182939 DEBUG nova.compute.provider_tree [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.530 182939 DEBUG nova.scheduler.client.report [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.554 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.554 182939 DEBUG nova.compute.manager [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.868 182939 DEBUG nova.compute.manager [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.869 182939 DEBUG nova.network.neutron [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.966 182939 INFO nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:55:39 compute-0 nova_compute[182935]: 2026-01-21 23:55:39.985 182939 DEBUG nova.compute.manager [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.306 182939 DEBUG nova.compute.manager [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.307 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.308 182939 INFO nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Creating image(s)
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.309 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "/var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.309 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "/var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.309 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "/var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.321 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.382 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.383 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.384 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.399 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.460 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.461 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.497 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.498 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.498 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.566 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.568 182939 DEBUG nova.virt.disk.api [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Checking if we can resize image /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.569 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.643 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.645 182939 DEBUG nova.virt.disk.api [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Cannot resize image /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.645 182939 DEBUG nova.objects.instance [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'migration_context' on Instance uuid 70e91a38-1e04-4d71-93e1-4b946f228d7e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.741 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.742 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Ensure instance console log exists: /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.742 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.742 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.743 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:40 compute-0 nova_compute[182935]: 2026-01-21 23:55:40.903 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:41 compute-0 nova_compute[182935]: 2026-01-21 23:55:41.391 182939 DEBUG nova.policy [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7e79b904cb8a49f990b05eb0ed72fdf4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70b1c9f8be0042aa8de9841a26729700', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:55:41 compute-0 sshd-session[220401]: Invalid user weblogic from 188.166.69.60 port 58654
Jan 21 23:55:41 compute-0 sshd-session[220401]: Connection closed by invalid user weblogic 188.166.69.60 port 58654 [preauth]
Jan 21 23:55:42 compute-0 podman[220403]: 2026-01-21 23:55:42.674528567 +0000 UTC m=+0.052580168 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:55:42 compute-0 nova_compute[182935]: 2026-01-21 23:55:42.852 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:44 compute-0 nova_compute[182935]: 2026-01-21 23:55:44.315 182939 DEBUG nova.network.neutron [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Successfully created port: 9222eb38-8c2b-4811-ba8b-69dee7a49f2e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:55:45 compute-0 nova_compute[182935]: 2026-01-21 23:55:45.908 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:47 compute-0 nova_compute[182935]: 2026-01-21 23:55:47.247 182939 DEBUG nova.network.neutron [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Successfully updated port: 9222eb38-8c2b-4811-ba8b-69dee7a49f2e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:55:47 compute-0 nova_compute[182935]: 2026-01-21 23:55:47.268 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "refresh_cache-70e91a38-1e04-4d71-93e1-4b946f228d7e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:47 compute-0 nova_compute[182935]: 2026-01-21 23:55:47.269 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquired lock "refresh_cache-70e91a38-1e04-4d71-93e1-4b946f228d7e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:47 compute-0 nova_compute[182935]: 2026-01-21 23:55:47.269 182939 DEBUG nova.network.neutron [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:55:47 compute-0 nova_compute[182935]: 2026-01-21 23:55:47.677 182939 DEBUG nova.network.neutron [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:55:47 compute-0 nova_compute[182935]: 2026-01-21 23:55:47.908 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:48 compute-0 nova_compute[182935]: 2026-01-21 23:55:48.613 182939 DEBUG nova.compute.manager [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Received event network-changed-9222eb38-8c2b-4811-ba8b-69dee7a49f2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:48 compute-0 nova_compute[182935]: 2026-01-21 23:55:48.614 182939 DEBUG nova.compute.manager [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Refreshing instance network info cache due to event network-changed-9222eb38-8c2b-4811-ba8b-69dee7a49f2e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:55:48 compute-0 nova_compute[182935]: 2026-01-21 23:55:48.614 182939 DEBUG oslo_concurrency.lockutils [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-70e91a38-1e04-4d71-93e1-4b946f228d7e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.114 182939 DEBUG nova.network.neutron [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Updating instance_info_cache with network_info: [{"id": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "address": "fa:16:3e:f0:34:e9", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9222eb38-8c", "ovs_interfaceid": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.134 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Releasing lock "refresh_cache-70e91a38-1e04-4d71-93e1-4b946f228d7e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.134 182939 DEBUG nova.compute.manager [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Instance network_info: |[{"id": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "address": "fa:16:3e:f0:34:e9", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9222eb38-8c", "ovs_interfaceid": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.135 182939 DEBUG oslo_concurrency.lockutils [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-70e91a38-1e04-4d71-93e1-4b946f228d7e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.135 182939 DEBUG nova.network.neutron [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Refreshing network info cache for port 9222eb38-8c2b-4811-ba8b-69dee7a49f2e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.140 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Start _get_guest_xml network_info=[{"id": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "address": "fa:16:3e:f0:34:e9", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9222eb38-8c", "ovs_interfaceid": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.146 182939 WARNING nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.156 182939 DEBUG nova.virt.libvirt.host [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.157 182939 DEBUG nova.virt.libvirt.host [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.161 182939 DEBUG nova.virt.libvirt.host [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.162 182939 DEBUG nova.virt.libvirt.host [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.163 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.164 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ff01ccba-ad51-439f-9037-926190d6dc0f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.164 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.164 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.165 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.165 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.165 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.165 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.165 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.166 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.166 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.166 182939 DEBUG nova.virt.hardware [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.170 182939 DEBUG nova.virt.libvirt.vif [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-767265430',display_name='tempest-ListServerFiltersTestJSON-instance-767265430',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-767265430',id=62,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-jkx275db',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_name='tempest-ListServerFiltersTestJSON-1547380946-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:40Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=70e91a38-1e04-4d71-93e1-4b946f228d7e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "address": "fa:16:3e:f0:34:e9", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9222eb38-8c", "ovs_interfaceid": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.171 182939 DEBUG nova.network.os_vif_util [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "address": "fa:16:3e:f0:34:e9", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9222eb38-8c", "ovs_interfaceid": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.172 182939 DEBUG nova.network.os_vif_util [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:34:e9,bridge_name='br-int',has_traffic_filtering=True,id=9222eb38-8c2b-4811-ba8b-69dee7a49f2e,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9222eb38-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.173 182939 DEBUG nova.objects.instance [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'pci_devices' on Instance uuid 70e91a38-1e04-4d71-93e1-4b946f228d7e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.188 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:55:50 compute-0 nova_compute[182935]:   <uuid>70e91a38-1e04-4d71-93e1-4b946f228d7e</uuid>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   <name>instance-0000003e</name>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   <memory>196608</memory>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-767265430</nova:name>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:55:50</nova:creationTime>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <nova:flavor name="m1.micro">
Jan 21 23:55:50 compute-0 nova_compute[182935]:         <nova:memory>192</nova:memory>
Jan 21 23:55:50 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:55:50 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:55:50 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:55:50 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:55:50 compute-0 nova_compute[182935]:         <nova:user uuid="7e79b904cb8a49f990b05eb0ed72fdf4">tempest-ListServerFiltersTestJSON-1547380946-project-member</nova:user>
Jan 21 23:55:50 compute-0 nova_compute[182935]:         <nova:project uuid="70b1c9f8be0042aa8de9841a26729700">tempest-ListServerFiltersTestJSON-1547380946</nova:project>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:55:50 compute-0 nova_compute[182935]:         <nova:port uuid="9222eb38-8c2b-4811-ba8b-69dee7a49f2e">
Jan 21 23:55:50 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <system>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <entry name="serial">70e91a38-1e04-4d71-93e1-4b946f228d7e</entry>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <entry name="uuid">70e91a38-1e04-4d71-93e1-4b946f228d7e</entry>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     </system>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   <os>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   </os>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   <features>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   </features>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk.config"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:f0:34:e9"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <target dev="tap9222eb38-8c"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/console.log" append="off"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <video>
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     </video>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:55:50 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:55:50 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:55:50 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:55:50 compute-0 nova_compute[182935]: </domain>
Jan 21 23:55:50 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.189 182939 DEBUG nova.compute.manager [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Preparing to wait for external event network-vif-plugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.190 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.190 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.190 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.191 182939 DEBUG nova.virt.libvirt.vif [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:55:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-767265430',display_name='tempest-ListServerFiltersTestJSON-instance-767265430',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-767265430',id=62,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-jkx275db',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_name='tempest-ListServerFiltersTestJSON-1547380946-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:55:40Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=70e91a38-1e04-4d71-93e1-4b946f228d7e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "address": "fa:16:3e:f0:34:e9", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9222eb38-8c", "ovs_interfaceid": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.191 182939 DEBUG nova.network.os_vif_util [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "address": "fa:16:3e:f0:34:e9", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9222eb38-8c", "ovs_interfaceid": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.192 182939 DEBUG nova.network.os_vif_util [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:34:e9,bridge_name='br-int',has_traffic_filtering=True,id=9222eb38-8c2b-4811-ba8b-69dee7a49f2e,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9222eb38-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.192 182939 DEBUG os_vif [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:34:e9,bridge_name='br-int',has_traffic_filtering=True,id=9222eb38-8c2b-4811-ba8b-69dee7a49f2e,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9222eb38-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.193 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.193 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.193 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.197 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.198 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9222eb38-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.198 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9222eb38-8c, col_values=(('external_ids', {'iface-id': '9222eb38-8c2b-4811-ba8b-69dee7a49f2e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f0:34:e9', 'vm-uuid': '70e91a38-1e04-4d71-93e1-4b946f228d7e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.200 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:50 compute-0 NetworkManager[55139]: <info>  [1769039750.2010] manager: (tap9222eb38-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.203 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.208 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.210 182939 INFO os_vif [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:34:e9,bridge_name='br-int',has_traffic_filtering=True,id=9222eb38-8c2b-4811-ba8b-69dee7a49f2e,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9222eb38-8c')
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.261 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.262 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.262 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] No VIF found with MAC fa:16:3e:f0:34:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.263 182939 INFO nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Using config drive
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.850 182939 INFO nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Creating config drive at /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk.config
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.855 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsusc5zr9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.910 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:50 compute-0 nova_compute[182935]: 2026-01-21 23:55:50.983 182939 DEBUG oslo_concurrency.processutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsusc5zr9" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:55:51 compute-0 kernel: tap9222eb38-8c: entered promiscuous mode
Jan 21 23:55:51 compute-0 NetworkManager[55139]: <info>  [1769039751.0633] manager: (tap9222eb38-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/103)
Jan 21 23:55:51 compute-0 ovn_controller[95047]: 2026-01-21T23:55:51Z|00220|binding|INFO|Claiming lport 9222eb38-8c2b-4811-ba8b-69dee7a49f2e for this chassis.
Jan 21 23:55:51 compute-0 ovn_controller[95047]: 2026-01-21T23:55:51Z|00221|binding|INFO|9222eb38-8c2b-4811-ba8b-69dee7a49f2e: Claiming fa:16:3e:f0:34:e9 10.100.0.3
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.064 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.077 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:34:e9 10.100.0.3'], port_security=['fa:16:3e:f0:34:e9 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '70e91a38-1e04-4d71-93e1-4b946f228d7e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70b1c9f8be0042aa8de9841a26729700', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5943869c-ade1-4cd3-81a5-29e65236fb49', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88d3d39a-f56f-4f3b-95e9-79768ac7b596, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=9222eb38-8c2b-4811-ba8b-69dee7a49f2e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.079 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 9222eb38-8c2b-4811-ba8b-69dee7a49f2e in datapath a78bfb22-a192-4dbe-a117-9f8a59130e27 bound to our chassis
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.080 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a78bfb22-a192-4dbe-a117-9f8a59130e27
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.096 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a36800c1-ee02-4f01-bb36-94538d7610a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.097 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa78bfb22-a1 in ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.100 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa78bfb22-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.100 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b32f072b-2dfa-4587-befa-efb6e8a056b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 systemd-machined[154182]: New machine qemu-32-instance-0000003e.
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.101 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6c2042d0-54ba-4717-b1ed-e07e5378254b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 systemd-udevd[220465]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.113 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[7705b1e2-cc70-40c8-be4e-6bedc53fbb08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 ovn_controller[95047]: 2026-01-21T23:55:51Z|00222|binding|INFO|Setting lport 9222eb38-8c2b-4811-ba8b-69dee7a49f2e ovn-installed in OVS
Jan 21 23:55:51 compute-0 ovn_controller[95047]: 2026-01-21T23:55:51Z|00223|binding|INFO|Setting lport 9222eb38-8c2b-4811-ba8b-69dee7a49f2e up in Southbound
Jan 21 23:55:51 compute-0 systemd[1]: Started Virtual Machine qemu-32-instance-0000003e.
Jan 21 23:55:51 compute-0 NetworkManager[55139]: <info>  [1769039751.1536] device (tap9222eb38-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.154 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:51 compute-0 NetworkManager[55139]: <info>  [1769039751.1550] device (tap9222eb38-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.155 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e0e5f576-6ef8-4f39-90b1-c0af50cadc2e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 podman[220436]: 2026-01-21 23:55:51.189764141 +0000 UTC m=+0.130202240 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git)
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.192 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[de9eacd5-0c5c-4a0b-b0c1-73ab92671e39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 NetworkManager[55139]: <info>  [1769039751.2025] manager: (tapa78bfb22-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/104)
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.202 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0f4f82d9-6e8c-405d-bee0-232b3e6de235]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 podman[220437]: 2026-01-21 23:55:51.208336665 +0000 UTC m=+0.148914599 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.233 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[fb897a2d-3a31-4d75-90ef-4d503c4de511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.237 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[6d761f52-5d25-402d-8601-8f46b855d216]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 NetworkManager[55139]: <info>  [1769039751.2593] device (tapa78bfb22-a0): carrier: link connected
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.262 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c22c82f2-f42c-4da9-9235-25f744354fbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.279 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[184b5736-e23d-47cd-885d-113bc274dc40]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa78bfb22-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:41:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425399, 'reachable_time': 28345, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220514, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.294 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9aece655-0949-4954-a87b-624b3720ba10]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:4194'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425399, 'tstamp': 425399}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220515, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.310 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5a928ad5-63a8-460f-9ad3-0c5f326fcd27]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa78bfb22-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:41:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425399, 'reachable_time': 28345, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220516, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.339 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b7f5cd-58c8-4ccd-bdec-ee55a840c796]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.400 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[542be4e1-9744-4ba3-82f3-64a2c490e51d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.406 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa78bfb22-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.407 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.408 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa78bfb22-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:51 compute-0 kernel: tapa78bfb22-a0: entered promiscuous mode
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.409 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:51 compute-0 NetworkManager[55139]: <info>  [1769039751.4105] manager: (tapa78bfb22-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.413 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa78bfb22-a0, col_values=(('external_ids', {'iface-id': 'bb8c3f45-55b8-4c8e-8a31-26c5ecb4fb32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:51 compute-0 ovn_controller[95047]: 2026-01-21T23:55:51Z|00224|binding|INFO|Releasing lport bb8c3f45-55b8-4c8e-8a31-26c5ecb4fb32 from this chassis (sb_readonly=0)
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.416 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.417 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0a927ec3-910e-42e4-951b-8fac30a0dee4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.418 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-a78bfb22-a192-4dbe-a117-9f8a59130e27
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/a78bfb22-a192-4dbe-a117-9f8a59130e27.pid.haproxy
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID a78bfb22-a192-4dbe-a117-9f8a59130e27
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.419 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'env', 'PROCESS_TAG=haproxy-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a78bfb22-a192-4dbe-a117-9f8a59130e27.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.428 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.497 182939 DEBUG oslo_concurrency.lockutils [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "6be88afb-f958-4578-a2a6-678e18a8fcef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.498 182939 DEBUG oslo_concurrency.lockutils [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.498 182939 DEBUG oslo_concurrency.lockutils [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.499 182939 DEBUG oslo_concurrency.lockutils [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.499 182939 DEBUG oslo_concurrency.lockutils [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.512 182939 INFO nova.compute.manager [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Terminating instance
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.523 182939 DEBUG nova.compute.manager [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:55:51 compute-0 kernel: tap2d3c6c23-7c (unregistering): left promiscuous mode
Jan 21 23:55:51 compute-0 NetworkManager[55139]: <info>  [1769039751.5509] device (tap2d3c6c23-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:55:51 compute-0 ovn_controller[95047]: 2026-01-21T23:55:51Z|00225|binding|INFO|Releasing lport 2d3c6c23-7c5a-4528-8281-b94a6216b9eb from this chassis (sb_readonly=0)
Jan 21 23:55:51 compute-0 ovn_controller[95047]: 2026-01-21T23:55:51Z|00226|binding|INFO|Setting lport 2d3c6c23-7c5a-4528-8281-b94a6216b9eb down in Southbound
Jan 21 23:55:51 compute-0 ovn_controller[95047]: 2026-01-21T23:55:51Z|00227|binding|INFO|Removing iface tap2d3c6c23-7c ovn-installed in OVS
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.562 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:51 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:51.567 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:38:62 10.100.0.9'], port_security=['fa:16:3e:3f:38:62 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6be88afb-f958-4578-a2a6-678e18a8fcef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8556453a9e6644b4b29f7e2585b6beb3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '07a1a0ce-5790-4d2e-8869-adf91647e1de e4f1fc62-7115-48e0-8218-11b8bcfa72f9 ecacf12b-2952-40ca-ba6d-ba5cb7486300', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c83a09c2-c943-4d92-aedc-1a1adb93cc19, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=2d3c6c23-7c5a-4528-8281-b94a6216b9eb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.578 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:51 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000036.scope: Deactivated successfully.
Jan 21 23:55:51 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000036.scope: Consumed 15.112s CPU time.
Jan 21 23:55:51 compute-0 systemd-machined[154182]: Machine qemu-31-instance-00000036 terminated.
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.786 182939 INFO nova.virt.libvirt.driver [-] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Instance destroyed successfully.
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.787 182939 DEBUG nova.objects.instance [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lazy-loading 'resources' on Instance uuid 6be88afb-f958-4578-a2a6-678e18a8fcef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.804 182939 DEBUG nova.virt.libvirt.vif [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:54:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1186307028',display_name='tempest-SecurityGroupsTestJSON-server-1186307028',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1186307028',id=54,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:54:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8556453a9e6644b4b29f7e2585b6beb3',ramdisk_id='',reservation_id='r-21srab01',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-744520065',owner_user_name='tempest-SecurityGroupsTestJSON-744520065-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:54:54Z,user_data=None,user_id='fb46f340c44c473b9286568553cb6374',uuid=6be88afb-f958-4578-a2a6-678e18a8fcef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.805 182939 DEBUG nova.network.os_vif_util [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converting VIF {"id": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "address": "fa:16:3e:3f:38:62", "network": {"id": "f32b0ae0-64b5-4b08-b029-da33b7e8f96a", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1426753002-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8556453a9e6644b4b29f7e2585b6beb3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d3c6c23-7c", "ovs_interfaceid": "2d3c6c23-7c5a-4528-8281-b94a6216b9eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.806 182939 DEBUG nova.network.os_vif_util [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3f:38:62,bridge_name='br-int',has_traffic_filtering=True,id=2d3c6c23-7c5a-4528-8281-b94a6216b9eb,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d3c6c23-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.807 182939 DEBUG os_vif [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:38:62,bridge_name='br-int',has_traffic_filtering=True,id=2d3c6c23-7c5a-4528-8281-b94a6216b9eb,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d3c6c23-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.808 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.809 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d3c6c23-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.811 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.813 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.816 182939 INFO os_vif [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:38:62,bridge_name='br-int',has_traffic_filtering=True,id=2d3c6c23-7c5a-4528-8281-b94a6216b9eb,network=Network(f32b0ae0-64b5-4b08-b029-da33b7e8f96a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2d3c6c23-7c')
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.817 182939 INFO nova.virt.libvirt.driver [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Deleting instance files /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef_del
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.817 182939 INFO nova.virt.libvirt.driver [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Deletion of /var/lib/nova/instances/6be88afb-f958-4578-a2a6-678e18a8fcef_del complete
Jan 21 23:55:51 compute-0 podman[220557]: 2026-01-21 23:55:51.834985682 +0000 UTC m=+0.069957222 container create 99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.859 182939 DEBUG nova.compute.manager [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Received event network-vif-plugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.860 182939 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.861 182939 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.861 182939 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.861 182939 DEBUG nova.compute.manager [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Processing event network-vif-plugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.862 182939 DEBUG nova.compute.manager [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Received event network-vif-plugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.862 182939 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.862 182939 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.863 182939 DEBUG oslo_concurrency.lockutils [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.863 182939 DEBUG nova.compute.manager [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] No waiting events found dispatching network-vif-plugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.863 182939 WARNING nova.compute.manager [req-f926d3c4-175b-4796-9558-8f8d0ffc7647 req-954cf371-b4de-4fe8-88c5-a8ccd7f7dc9d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Received unexpected event network-vif-plugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e for instance with vm_state building and task_state spawning.
Jan 21 23:55:51 compute-0 systemd[1]: Started libpod-conmon-99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89.scope.
Jan 21 23:55:51 compute-0 podman[220557]: 2026-01-21 23:55:51.807334672 +0000 UTC m=+0.042306212 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.900 182939 INFO nova.compute.manager [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.901 182939 DEBUG oslo.service.loopingcall [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.901 182939 DEBUG nova.compute.manager [-] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:55:51 compute-0 nova_compute[182935]: 2026-01-21 23:55:51.901 182939 DEBUG nova.network.neutron [-] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:55:51 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:55:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f6990b2f8a9261035cdfabe49db0733a624b3b330de3dc65889aa411f14d581/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:55:51 compute-0 podman[220557]: 2026-01-21 23:55:51.923418885 +0000 UTC m=+0.158390445 container init 99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 21 23:55:51 compute-0 podman[220557]: 2026-01-21 23:55:51.928860254 +0000 UTC m=+0.163831794 container start 99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:55:51 compute-0 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[220583]: [NOTICE]   (220587) : New worker (220589) forked
Jan 21 23:55:51 compute-0 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[220583]: [NOTICE]   (220587) : Loading success.
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.003 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 2d3c6c23-7c5a-4528-8281-b94a6216b9eb in datapath f32b0ae0-64b5-4b08-b029-da33b7e8f96a unbound from our chassis
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.004 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f32b0ae0-64b5-4b08-b029-da33b7e8f96a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.005 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[194178ea-d789-46f8-ac68-656ab01cb15e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.006 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a namespace which is not needed anymore
Jan 21 23:55:52 compute-0 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[220136]: [NOTICE]   (220140) : haproxy version is 2.8.14-c23fe91
Jan 21 23:55:52 compute-0 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[220136]: [NOTICE]   (220140) : path to executable is /usr/sbin/haproxy
Jan 21 23:55:52 compute-0 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[220136]: [WARNING]  (220140) : Exiting Master process...
Jan 21 23:55:52 compute-0 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[220136]: [WARNING]  (220140) : Exiting Master process...
Jan 21 23:55:52 compute-0 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[220136]: [ALERT]    (220140) : Current worker (220142) exited with code 143 (Terminated)
Jan 21 23:55:52 compute-0 neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a[220136]: [WARNING]  (220140) : All workers exited. Exiting... (0)
Jan 21 23:55:52 compute-0 systemd[1]: libpod-8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f.scope: Deactivated successfully.
Jan 21 23:55:52 compute-0 podman[220615]: 2026-01-21 23:55:52.141266598 +0000 UTC m=+0.045697293 container died 8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:55:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f-userdata-shm.mount: Deactivated successfully.
Jan 21 23:55:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-5dea9752c85e3325e70d821233a2883807eeaf626296f19a66ff75a40b4be8b5-merged.mount: Deactivated successfully.
Jan 21 23:55:52 compute-0 podman[220615]: 2026-01-21 23:55:52.181637382 +0000 UTC m=+0.086068047 container cleanup 8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 21 23:55:52 compute-0 systemd[1]: libpod-conmon-8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f.scope: Deactivated successfully.
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.229 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039752.2289135, 70e91a38-1e04-4d71-93e1-4b946f228d7e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.230 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] VM Started (Lifecycle Event)
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.233 182939 DEBUG nova.compute.manager [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.236 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.239 182939 INFO nova.virt.libvirt.driver [-] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Instance spawned successfully.
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.240 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:55:52 compute-0 podman[220653]: 2026-01-21 23:55:52.245450637 +0000 UTC m=+0.041802760 container remove 8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.250 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[59ed34a6-7bca-4c3c-9d6f-5b1545cef32a]: (4, ('Wed Jan 21 11:55:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a (8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f)\n8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f\nWed Jan 21 11:55:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a (8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f)\n8e96be9216e8d96aba920287e5a3db293396291b0e0ac21d391a43d082db3b1f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.254 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[338d2d12-2992-47f6-a807-4a84542da995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.255 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf32b0ae0-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.256 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.257 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:52 compute-0 kernel: tapf32b0ae0-60: left promiscuous mode
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.263 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.266 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.267 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.267 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.267 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.268 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.268 182939 DEBUG nova.virt.libvirt.driver [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.271 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.273 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c48b8e-b4e3-45bc-bd1b-b20171111d03]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.289 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ead4ca-c29d-411d-8932-6899935bd385]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.291 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ff3b90-8472-4eba-b347-ecd1111e8579]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.301 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.301 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039752.2299886, 70e91a38-1e04-4d71-93e1-4b946f228d7e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.301 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] VM Paused (Lifecycle Event)
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.309 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e7c2bb-7439-4fed-8287-782c3b92839c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 419644, 'reachable_time': 29644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220668, 'error': None, 'target': 'ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.312 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f32b0ae0-64b5-4b08-b029-da33b7e8f96a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:55:52 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:55:52.312 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[8614904d-06d3-463d-9ca1-ce85db75eaa2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:55:52 compute-0 systemd[1]: run-netns-ovnmeta\x2df32b0ae0\x2d64b5\x2d4b08\x2db029\x2dda33b7e8f96a.mount: Deactivated successfully.
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.328 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.332 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039752.235797, 70e91a38-1e04-4d71-93e1-4b946f228d7e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.332 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] VM Resumed (Lifecycle Event)
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.361 182939 INFO nova.compute.manager [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Took 12.05 seconds to spawn the instance on the hypervisor.
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.361 182939 DEBUG nova.compute.manager [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.371 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.373 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.414 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.463 182939 INFO nova.compute.manager [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Took 13.13 seconds to build instance.
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.482 182939 DEBUG oslo_concurrency.lockutils [None req-b5f964a4-5b9f-4cdf-98e9-5c2d0c1b7588 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.621 182939 DEBUG nova.network.neutron [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Updated VIF entry in instance network info cache for port 9222eb38-8c2b-4811-ba8b-69dee7a49f2e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.622 182939 DEBUG nova.network.neutron [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Updating instance_info_cache with network_info: [{"id": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "address": "fa:16:3e:f0:34:e9", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9222eb38-8c", "ovs_interfaceid": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.638 182939 DEBUG oslo_concurrency.lockutils [req-906235ec-2ed2-4035-a23f-287a0540d63b req-6c0b8232-8f00-439c-8788-fc595258c2bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-70e91a38-1e04-4d71-93e1-4b946f228d7e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.839 182939 DEBUG nova.compute.manager [req-144b68e5-26a4-4974-a992-1c7c898a1b84 req-a80c7f5a-60d9-4771-adfa-e31386950d75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Received event network-vif-unplugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.840 182939 DEBUG oslo_concurrency.lockutils [req-144b68e5-26a4-4974-a992-1c7c898a1b84 req-a80c7f5a-60d9-4771-adfa-e31386950d75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.842 182939 DEBUG oslo_concurrency.lockutils [req-144b68e5-26a4-4974-a992-1c7c898a1b84 req-a80c7f5a-60d9-4771-adfa-e31386950d75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.842 182939 DEBUG oslo_concurrency.lockutils [req-144b68e5-26a4-4974-a992-1c7c898a1b84 req-a80c7f5a-60d9-4771-adfa-e31386950d75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.842 182939 DEBUG nova.compute.manager [req-144b68e5-26a4-4974-a992-1c7c898a1b84 req-a80c7f5a-60d9-4771-adfa-e31386950d75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] No waiting events found dispatching network-vif-unplugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:52 compute-0 nova_compute[182935]: 2026-01-21 23:55:52.843 182939 DEBUG nova.compute.manager [req-144b68e5-26a4-4974-a992-1c7c898a1b84 req-a80c7f5a-60d9-4771-adfa-e31386950d75 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Received event network-vif-unplugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:55:53 compute-0 nova_compute[182935]: 2026-01-21 23:55:53.682 182939 DEBUG nova.network.neutron [-] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:55:53 compute-0 nova_compute[182935]: 2026-01-21 23:55:53.699 182939 INFO nova.compute.manager [-] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Took 1.80 seconds to deallocate network for instance.
Jan 21 23:55:53 compute-0 nova_compute[182935]: 2026-01-21 23:55:53.771 182939 DEBUG oslo_concurrency.lockutils [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:53 compute-0 nova_compute[182935]: 2026-01-21 23:55:53.772 182939 DEBUG oslo_concurrency.lockutils [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:53 compute-0 nova_compute[182935]: 2026-01-21 23:55:53.843 182939 DEBUG nova.compute.provider_tree [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:55:53 compute-0 nova_compute[182935]: 2026-01-21 23:55:53.910 182939 DEBUG nova.scheduler.client.report [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:55:53 compute-0 nova_compute[182935]: 2026-01-21 23:55:53.935 182939 DEBUG oslo_concurrency.lockutils [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:53 compute-0 nova_compute[182935]: 2026-01-21 23:55:53.978 182939 INFO nova.scheduler.client.report [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Deleted allocations for instance 6be88afb-f958-4578-a2a6-678e18a8fcef
Jan 21 23:55:54 compute-0 nova_compute[182935]: 2026-01-21 23:55:54.010 182939 DEBUG nova.compute.manager [req-f99b078b-c543-4d47-8db7-767cb67ab6b6 req-9ec824d7-abe7-462e-828c-6a6caecc130a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Received event network-vif-deleted-2d3c6c23-7c5a-4528-8281-b94a6216b9eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:54 compute-0 nova_compute[182935]: 2026-01-21 23:55:54.053 182939 DEBUG oslo_concurrency.lockutils [None req-0da9afbe-de36-4f69-b5fa-409efad6d949 fb46f340c44c473b9286568553cb6374 8556453a9e6644b4b29f7e2585b6beb3 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.555s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:54 compute-0 nova_compute[182935]: 2026-01-21 23:55:54.928 182939 DEBUG nova.compute.manager [req-bf3f3c63-f0cd-49ea-8833-7311a329e442 req-1b372867-f85e-4a5e-add5-b00fcbb49c94 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Received event network-vif-plugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:55:54 compute-0 nova_compute[182935]: 2026-01-21 23:55:54.929 182939 DEBUG oslo_concurrency.lockutils [req-bf3f3c63-f0cd-49ea-8833-7311a329e442 req-1b372867-f85e-4a5e-add5-b00fcbb49c94 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:55:54 compute-0 nova_compute[182935]: 2026-01-21 23:55:54.929 182939 DEBUG oslo_concurrency.lockutils [req-bf3f3c63-f0cd-49ea-8833-7311a329e442 req-1b372867-f85e-4a5e-add5-b00fcbb49c94 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:55:54 compute-0 nova_compute[182935]: 2026-01-21 23:55:54.930 182939 DEBUG oslo_concurrency.lockutils [req-bf3f3c63-f0cd-49ea-8833-7311a329e442 req-1b372867-f85e-4a5e-add5-b00fcbb49c94 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6be88afb-f958-4578-a2a6-678e18a8fcef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:55:54 compute-0 nova_compute[182935]: 2026-01-21 23:55:54.930 182939 DEBUG nova.compute.manager [req-bf3f3c63-f0cd-49ea-8833-7311a329e442 req-1b372867-f85e-4a5e-add5-b00fcbb49c94 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] No waiting events found dispatching network-vif-plugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:55:54 compute-0 nova_compute[182935]: 2026-01-21 23:55:54.930 182939 WARNING nova.compute.manager [req-bf3f3c63-f0cd-49ea-8833-7311a329e442 req-1b372867-f85e-4a5e-add5-b00fcbb49c94 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Received unexpected event network-vif-plugged-2d3c6c23-7c5a-4528-8281-b94a6216b9eb for instance with vm_state deleted and task_state None.
Jan 21 23:55:55 compute-0 nova_compute[182935]: 2026-01-21 23:55:55.913 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:55:56 compute-0 nova_compute[182935]: 2026-01-21 23:55:56.811 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:00 compute-0 nova_compute[182935]: 2026-01-21 23:56:00.939 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:01 compute-0 ovn_controller[95047]: 2026-01-21T23:56:01Z|00228|binding|INFO|Releasing lport bb8c3f45-55b8-4c8e-8a31-26c5ecb4fb32 from this chassis (sb_readonly=0)
Jan 21 23:56:01 compute-0 nova_compute[182935]: 2026-01-21 23:56:01.091 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:01 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:01.374 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:01 compute-0 nova_compute[182935]: 2026-01-21 23:56:01.374 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:01 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:01.375 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:56:01 compute-0 nova_compute[182935]: 2026-01-21 23:56:01.814 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:02 compute-0 podman[220670]: 2026-01-21 23:56:02.696677082 +0000 UTC m=+0.058654892 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:56:02 compute-0 podman[220669]: 2026-01-21 23:56:02.734925655 +0000 UTC m=+0.100015979 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 21 23:56:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:03.190 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:03.190 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:03.191 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:05 compute-0 ovn_controller[95047]: 2026-01-21T23:56:05Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f0:34:e9 10.100.0.3
Jan 21 23:56:05 compute-0 ovn_controller[95047]: 2026-01-21T23:56:05Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f0:34:e9 10.100.0.3
Jan 21 23:56:05 compute-0 nova_compute[182935]: 2026-01-21 23:56:05.942 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:06 compute-0 nova_compute[182935]: 2026-01-21 23:56:06.785 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039751.783358, 6be88afb-f958-4578-a2a6-678e18a8fcef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:06 compute-0 nova_compute[182935]: 2026-01-21 23:56:06.785 182939 INFO nova.compute.manager [-] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] VM Stopped (Lifecycle Event)
Jan 21 23:56:06 compute-0 nova_compute[182935]: 2026-01-21 23:56:06.815 182939 DEBUG nova.compute.manager [None req-abcf79fc-4803-4bfa-bf70-cea1a3221412 - - - - - -] [instance: 6be88afb-f958-4578-a2a6-678e18a8fcef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:06 compute-0 nova_compute[182935]: 2026-01-21 23:56:06.816 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:07 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:07.377 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:09 compute-0 podman[220733]: 2026-01-21 23:56:09.754043376 +0000 UTC m=+0.113849690 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:56:10 compute-0 nova_compute[182935]: 2026-01-21 23:56:10.984 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:11 compute-0 nova_compute[182935]: 2026-01-21 23:56:11.818 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:13 compute-0 podman[220758]: 2026-01-21 23:56:13.689028004 +0000 UTC m=+0.061169082 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.023 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "c83d6412-f55f-4cbb-92ef-d461894cfdce" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.023 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.077 182939 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.257 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.258 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.270 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.270 182939 INFO nova.compute.claims [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.472 182939 DEBUG nova.compute.provider_tree [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.487 182939 DEBUG nova.scheduler.client.report [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.514 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.515 182939 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.607 182939 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.607 182939 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.639 182939 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.684 182939 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.852 182939 DEBUG nova.policy [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.862 182939 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.863 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.863 182939 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Creating image(s)
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.864 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "/var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.864 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "/var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.865 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "/var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.879 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.974 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.975 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.976 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:14 compute-0 nova_compute[182935]: 2026-01-21 23:56:14.987 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.042 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.044 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.092 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.093 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.094 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.149 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.151 182939 DEBUG nova.virt.disk.api [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Checking if we can resize image /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.151 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.207 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.208 182939 DEBUG nova.virt.disk.api [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Cannot resize image /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.208 182939 DEBUG nova.objects.instance [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lazy-loading 'migration_context' on Instance uuid c83d6412-f55f-4cbb-92ef-d461894cfdce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.225 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.226 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Ensure instance console log exists: /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.226 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.226 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.227 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.865 182939 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Successfully created port: efba6c4f-8bb6-4730-b537-68853c9b33aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:56:15 compute-0 nova_compute[182935]: 2026-01-21 23:56:15.986 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:16 compute-0 nova_compute[182935]: 2026-01-21 23:56:16.820 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:17 compute-0 nova_compute[182935]: 2026-01-21 23:56:17.493 182939 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Successfully updated port: efba6c4f-8bb6-4730-b537-68853c9b33aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:56:17 compute-0 nova_compute[182935]: 2026-01-21 23:56:17.527 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "refresh_cache-c83d6412-f55f-4cbb-92ef-d461894cfdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:56:17 compute-0 nova_compute[182935]: 2026-01-21 23:56:17.528 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquired lock "refresh_cache-c83d6412-f55f-4cbb-92ef-d461894cfdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:56:17 compute-0 nova_compute[182935]: 2026-01-21 23:56:17.528 182939 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:56:17 compute-0 nova_compute[182935]: 2026-01-21 23:56:17.663 182939 DEBUG nova.compute.manager [req-51fd5771-c61e-49b1-adb8-8988ef96579a req-f61cf08a-e597-44b0-bcae-381a6e750fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Received event network-changed-efba6c4f-8bb6-4730-b537-68853c9b33aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:17 compute-0 nova_compute[182935]: 2026-01-21 23:56:17.664 182939 DEBUG nova.compute.manager [req-51fd5771-c61e-49b1-adb8-8988ef96579a req-f61cf08a-e597-44b0-bcae-381a6e750fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Refreshing instance network info cache due to event network-changed-efba6c4f-8bb6-4730-b537-68853c9b33aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:56:17 compute-0 nova_compute[182935]: 2026-01-21 23:56:17.664 182939 DEBUG oslo_concurrency.lockutils [req-51fd5771-c61e-49b1-adb8-8988ef96579a req-f61cf08a-e597-44b0-bcae-381a6e750fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c83d6412-f55f-4cbb-92ef-d461894cfdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:56:18 compute-0 nova_compute[182935]: 2026-01-21 23:56:18.181 182939 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.662 182939 DEBUG nova.network.neutron [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Updating instance_info_cache with network_info: [{"id": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "address": "fa:16:3e:25:f0:96", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefba6c4f-8b", "ovs_interfaceid": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.682 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Releasing lock "refresh_cache-c83d6412-f55f-4cbb-92ef-d461894cfdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.683 182939 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Instance network_info: |[{"id": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "address": "fa:16:3e:25:f0:96", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefba6c4f-8b", "ovs_interfaceid": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.684 182939 DEBUG oslo_concurrency.lockutils [req-51fd5771-c61e-49b1-adb8-8988ef96579a req-f61cf08a-e597-44b0-bcae-381a6e750fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c83d6412-f55f-4cbb-92ef-d461894cfdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.684 182939 DEBUG nova.network.neutron [req-51fd5771-c61e-49b1-adb8-8988ef96579a req-f61cf08a-e597-44b0-bcae-381a6e750fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Refreshing network info cache for port efba6c4f-8bb6-4730-b537-68853c9b33aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.689 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Start _get_guest_xml network_info=[{"id": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "address": "fa:16:3e:25:f0:96", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefba6c4f-8b", "ovs_interfaceid": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.698 182939 WARNING nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.705 182939 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.707 182939 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.710 182939 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.710 182939 DEBUG nova.virt.libvirt.host [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.712 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.713 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.713 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.713 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.713 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.714 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.714 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.714 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.714 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.714 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.715 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.715 182939 DEBUG nova.virt.hardware [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.718 182939 DEBUG nova.virt.libvirt.vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1677728672',display_name='tempest-ListServersNegativeTestJSON-server-1677728672-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1677728672-2',id=66,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='414437860afc460b9e86d674975e9d1f',ramdisk_id='',reservation_id='r-dznclk2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1787990789',owner_user_name='tempest-ListServersNegativeTestJSON-1787990789-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:14Z,user_data=None,user_id='9a4a4a5f3c9f4c5091261592272bcb81',uuid=c83d6412-f55f-4cbb-92ef-d461894cfdce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "address": "fa:16:3e:25:f0:96", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefba6c4f-8b", "ovs_interfaceid": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.719 182939 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converting VIF {"id": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "address": "fa:16:3e:25:f0:96", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefba6c4f-8b", "ovs_interfaceid": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.719 182939 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=efba6c4f-8bb6-4730-b537-68853c9b33aa,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefba6c4f-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.720 182939 DEBUG nova.objects.instance [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lazy-loading 'pci_devices' on Instance uuid c83d6412-f55f-4cbb-92ef-d461894cfdce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.734 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:56:19 compute-0 nova_compute[182935]:   <uuid>c83d6412-f55f-4cbb-92ef-d461894cfdce</uuid>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   <name>instance-00000042</name>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <nova:name>tempest-ListServersNegativeTestJSON-server-1677728672-2</nova:name>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:56:19</nova:creationTime>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:56:19 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:56:19 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:56:19 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:56:19 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:56:19 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:56:19 compute-0 nova_compute[182935]:         <nova:user uuid="9a4a4a5f3c9f4c5091261592272bcb81">tempest-ListServersNegativeTestJSON-1787990789-project-member</nova:user>
Jan 21 23:56:19 compute-0 nova_compute[182935]:         <nova:project uuid="414437860afc460b9e86d674975e9d1f">tempest-ListServersNegativeTestJSON-1787990789</nova:project>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:56:19 compute-0 nova_compute[182935]:         <nova:port uuid="efba6c4f-8bb6-4730-b537-68853c9b33aa">
Jan 21 23:56:19 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <system>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <entry name="serial">c83d6412-f55f-4cbb-92ef-d461894cfdce</entry>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <entry name="uuid">c83d6412-f55f-4cbb-92ef-d461894cfdce</entry>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     </system>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   <os>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   </os>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   <features>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   </features>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.config"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:25:f0:96"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <target dev="tapefba6c4f-8b"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/console.log" append="off"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <video>
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     </video>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:56:19 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:56:19 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:56:19 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:56:19 compute-0 nova_compute[182935]: </domain>
Jan 21 23:56:19 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.736 182939 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Preparing to wait for external event network-vif-plugged-efba6c4f-8bb6-4730-b537-68853c9b33aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.736 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.736 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.736 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.737 182939 DEBUG nova.virt.libvirt.vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:56:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1677728672',display_name='tempest-ListServersNegativeTestJSON-server-1677728672-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1677728672-2',id=66,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='414437860afc460b9e86d674975e9d1f',ramdisk_id='',reservation_id='r-dznclk2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1787990789',owner_user_name='tempest-ListServersNegativeTestJSON-1787990789-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:56:14Z,user_data=None,user_id='9a4a4a5f3c9f4c5091261592272bcb81',uuid=c83d6412-f55f-4cbb-92ef-d461894cfdce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "address": "fa:16:3e:25:f0:96", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefba6c4f-8b", "ovs_interfaceid": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.737 182939 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converting VIF {"id": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "address": "fa:16:3e:25:f0:96", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefba6c4f-8b", "ovs_interfaceid": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.738 182939 DEBUG nova.network.os_vif_util [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=efba6c4f-8bb6-4730-b537-68853c9b33aa,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefba6c4f-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.738 182939 DEBUG os_vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=efba6c4f-8bb6-4730-b537-68853c9b33aa,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefba6c4f-8b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.739 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.739 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.740 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.742 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.742 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapefba6c4f-8b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.743 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapefba6c4f-8b, col_values=(('external_ids', {'iface-id': 'efba6c4f-8bb6-4730-b537-68853c9b33aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:25:f0:96', 'vm-uuid': 'c83d6412-f55f-4cbb-92ef-d461894cfdce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.745 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:19 compute-0 NetworkManager[55139]: <info>  [1769039779.7464] manager: (tapefba6c4f-8b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.747 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.757 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.758 182939 INFO os_vif [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=efba6c4f-8bb6-4730-b537-68853c9b33aa,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefba6c4f-8b')
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.821 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.821 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.822 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] No VIF found with MAC fa:16:3e:25:f0:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:56:19 compute-0 nova_compute[182935]: 2026-01-21 23:56:19.822 182939 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Using config drive
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.346 182939 INFO nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Creating config drive at /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.config
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.353 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9gnpcp_9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.486 182939 DEBUG oslo_concurrency.processutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9gnpcp_9" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:20 compute-0 kernel: tapefba6c4f-8b: entered promiscuous mode
Jan 21 23:56:20 compute-0 NetworkManager[55139]: <info>  [1769039780.5606] manager: (tapefba6c4f-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/107)
Jan 21 23:56:20 compute-0 ovn_controller[95047]: 2026-01-21T23:56:20Z|00229|binding|INFO|Claiming lport efba6c4f-8bb6-4730-b537-68853c9b33aa for this chassis.
Jan 21 23:56:20 compute-0 ovn_controller[95047]: 2026-01-21T23:56:20Z|00230|binding|INFO|efba6c4f-8bb6-4730-b537-68853c9b33aa: Claiming fa:16:3e:25:f0:96 10.100.0.7
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.563 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.569 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.583 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:f0:96 10.100.0.7'], port_security=['fa:16:3e:25:f0:96 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-835f4434-3fa6-458b-b79c-b27830f531cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '414437860afc460b9e86d674975e9d1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '30db0ce4-28a9-4add-b257-f90dc081c48d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2a61a9c-1832-4a5f-89c7-e09ac8a1046e, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=efba6c4f-8bb6-4730-b537-68853c9b33aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.584 104408 INFO neutron.agent.ovn.metadata.agent [-] Port efba6c4f-8bb6-4730-b537-68853c9b33aa in datapath 835f4434-3fa6-458b-b79c-b27830f531cf bound to our chassis
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.585 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 835f4434-3fa6-458b-b79c-b27830f531cf
Jan 21 23:56:20 compute-0 systemd-udevd[220811]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:56:20 compute-0 systemd-machined[154182]: New machine qemu-33-instance-00000042.
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.602 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[387f7782-3bcb-42dc-8486-b217ac66f1af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.604 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap835f4434-31 in ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:56:20 compute-0 NetworkManager[55139]: <info>  [1769039780.6077] device (tapefba6c4f-8b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.607 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap835f4434-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.607 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f73d2ff3-b8a6-4904-a591-fa25868da808]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 NetworkManager[55139]: <info>  [1769039780.6086] device (tapefba6c4f-8b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.608 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[83ed8657-289a-46fa-95a9-ff6d6931ac11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.621 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.622 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[7a40c251-47b3-4dd9-bc14-800d7474a03a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_controller[95047]: 2026-01-21T23:56:20Z|00231|binding|INFO|Setting lport efba6c4f-8bb6-4730-b537-68853c9b33aa ovn-installed in OVS
Jan 21 23:56:20 compute-0 ovn_controller[95047]: 2026-01-21T23:56:20Z|00232|binding|INFO|Setting lport efba6c4f-8bb6-4730-b537-68853c9b33aa up in Southbound
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.627 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:20 compute-0 systemd[1]: Started Virtual Machine qemu-33-instance-00000042.
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.650 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[af7301f6-570d-4ca8-93f2-a282e5347db2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.681 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe40c1c-d9ea-4c32-829a-7646b97d34e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 NetworkManager[55139]: <info>  [1769039780.6902] manager: (tap835f4434-30): new Veth device (/org/freedesktop/NetworkManager/Devices/108)
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.689 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[606bba1b-34b8-43d0-89cd-9cdfb3872ef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.724 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[501ecaee-5a90-424b-bd8b-56801d718e2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.727 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d8509334-868f-4543-9735-be01edcb9259]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 NetworkManager[55139]: <info>  [1769039780.7492] device (tap835f4434-30): carrier: link connected
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.755 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[253116e8-6b44-45b0-ae7e-7ddc34afc23c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.775 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[09d75a6d-0bab-480b-bc04-f5e5b29f3e49]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap835f4434-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:51:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428348, 'reachable_time': 38839, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220845, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.794 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6452f7-bde6-4ee8-b88e-554b9beeb402]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3d:5107'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428348, 'tstamp': 428348}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220846, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.813 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[62bc6904-72c2-478b-8add-9c08acd3791a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap835f4434-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:51:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428348, 'reachable_time': 38839, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220847, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.845 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[51804108-86af-47db-81d3-873a71f016aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.898 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d82730b8-53d1-4542-8f15-144bcbd09a76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.900 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap835f4434-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.900 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.901 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap835f4434-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.903 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:20 compute-0 kernel: tap835f4434-30: entered promiscuous mode
Jan 21 23:56:20 compute-0 NetworkManager[55139]: <info>  [1769039780.9040] manager: (tap835f4434-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.906 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap835f4434-30, col_values=(('external_ids', {'iface-id': '8bc16eeb-6666-4300-9ce8-0a810442a173'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.907 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:20 compute-0 ovn_controller[95047]: 2026-01-21T23:56:20Z|00233|binding|INFO|Releasing lport 8bc16eeb-6666-4300-9ce8-0a810442a173 from this chassis (sb_readonly=0)
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.909 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.909 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/835f4434-3fa6-458b-b79c-b27830f531cf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/835f4434-3fa6-458b-b79c-b27830f531cf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.911 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9188030b-620a-48ff-bdb2-e55e685854de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.912 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-835f4434-3fa6-458b-b79c-b27830f531cf
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/835f4434-3fa6-458b-b79c-b27830f531cf.pid.haproxy
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 835f4434-3fa6-458b-b79c-b27830f531cf
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:56:20 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:20.913 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'env', 'PROCESS_TAG=haproxy-835f4434-3fa6-458b-b79c-b27830f531cf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/835f4434-3fa6-458b-b79c-b27830f531cf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.919 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.942 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039780.9409096, c83d6412-f55f-4cbb-92ef-d461894cfdce => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.942 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] VM Started (Lifecycle Event)
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.975 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.980 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039780.9412172, c83d6412-f55f-4cbb-92ef-d461894cfdce => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.980 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] VM Paused (Lifecycle Event)
Jan 21 23:56:20 compute-0 nova_compute[182935]: 2026-01-21 23:56:20.988 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.004 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.008 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.063 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.085 182939 DEBUG oslo_concurrency.lockutils [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "70e91a38-1e04-4d71-93e1-4b946f228d7e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.086 182939 DEBUG oslo_concurrency.lockutils [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.087 182939 DEBUG oslo_concurrency.lockutils [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.087 182939 DEBUG oslo_concurrency.lockutils [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.088 182939 DEBUG oslo_concurrency.lockutils [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.104 182939 INFO nova.compute.manager [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Terminating instance
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.117 182939 DEBUG nova.compute.manager [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:56:21 compute-0 kernel: tap9222eb38-8c (unregistering): left promiscuous mode
Jan 21 23:56:21 compute-0 NetworkManager[55139]: <info>  [1769039781.1449] device (tap9222eb38-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:56:21 compute-0 ovn_controller[95047]: 2026-01-21T23:56:21Z|00234|binding|INFO|Releasing lport 9222eb38-8c2b-4811-ba8b-69dee7a49f2e from this chassis (sb_readonly=0)
Jan 21 23:56:21 compute-0 ovn_controller[95047]: 2026-01-21T23:56:21Z|00235|binding|INFO|Setting lport 9222eb38-8c2b-4811-ba8b-69dee7a49f2e down in Southbound
Jan 21 23:56:21 compute-0 ovn_controller[95047]: 2026-01-21T23:56:21Z|00236|binding|INFO|Removing iface tap9222eb38-8c ovn-installed in OVS
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.155 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.170 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.173 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:34:e9 10.100.0.3'], port_security=['fa:16:3e:f0:34:e9 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '70e91a38-1e04-4d71-93e1-4b946f228d7e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70b1c9f8be0042aa8de9841a26729700', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5943869c-ade1-4cd3-81a5-29e65236fb49', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88d3d39a-f56f-4f3b-95e9-79768ac7b596, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=9222eb38-8c2b-4811-ba8b-69dee7a49f2e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:21 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Jan 21 23:56:21 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000003e.scope: Consumed 14.820s CPU time.
Jan 21 23:56:21 compute-0 systemd-machined[154182]: Machine qemu-32-instance-0000003e terminated.
Jan 21 23:56:21 compute-0 podman[220889]: 2026-01-21 23:56:21.30534991 +0000 UTC m=+0.063131850 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:56:21 compute-0 podman[220888]: 2026-01-21 23:56:21.31040682 +0000 UTC m=+0.063813375 container create 88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 21 23:56:21 compute-0 podman[220882]: 2026-01-21 23:56:21.319519628 +0000 UTC m=+0.079415198 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.334 182939 DEBUG nova.compute.manager [req-76efb168-6a09-476f-958a-85d4a2e2f7c2 req-335587fc-7af4-4bf9-96c4-e2df8608f01d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Received event network-vif-plugged-efba6c4f-8bb6-4730-b537-68853c9b33aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.335 182939 DEBUG oslo_concurrency.lockutils [req-76efb168-6a09-476f-958a-85d4a2e2f7c2 req-335587fc-7af4-4bf9-96c4-e2df8608f01d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.336 182939 DEBUG oslo_concurrency.lockutils [req-76efb168-6a09-476f-958a-85d4a2e2f7c2 req-335587fc-7af4-4bf9-96c4-e2df8608f01d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.336 182939 DEBUG oslo_concurrency.lockutils [req-76efb168-6a09-476f-958a-85d4a2e2f7c2 req-335587fc-7af4-4bf9-96c4-e2df8608f01d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.336 182939 DEBUG nova.compute.manager [req-76efb168-6a09-476f-958a-85d4a2e2f7c2 req-335587fc-7af4-4bf9-96c4-e2df8608f01d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Processing event network-vif-plugged-efba6c4f-8bb6-4730-b537-68853c9b33aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.337 182939 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.343 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039781.3434289, c83d6412-f55f-4cbb-92ef-d461894cfdce => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.344 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] VM Resumed (Lifecycle Event)
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.346 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:56:21 compute-0 systemd[1]: Started libpod-conmon-88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2.scope.
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.360 182939 INFO nova.virt.libvirt.driver [-] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Instance spawned successfully.
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.361 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:56:21 compute-0 podman[220888]: 2026-01-21 23:56:21.280084275 +0000 UTC m=+0.033490850 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:56:21 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.377 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/de7b40bbb129e24fb8fa7c9ada3b5fee3b1f2ad50cbc8e7e70cf7b056c2d7b89/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.384 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.387 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.387 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.387 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.388 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.388 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.388 182939 DEBUG nova.virt.libvirt.driver [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.392 182939 INFO nova.virt.libvirt.driver [-] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Instance destroyed successfully.
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.394 182939 DEBUG nova.objects.instance [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lazy-loading 'resources' on Instance uuid 70e91a38-1e04-4d71-93e1-4b946f228d7e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:21 compute-0 podman[220888]: 2026-01-21 23:56:21.398858322 +0000 UTC m=+0.152264877 container init 88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 21 23:56:21 compute-0 podman[220888]: 2026-01-21 23:56:21.404305773 +0000 UTC m=+0.157712338 container start 88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:56:21 compute-0 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220948]: [NOTICE]   (220961) : New worker (220963) forked
Jan 21 23:56:21 compute-0 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220948]: [NOTICE]   (220961) : Loading success.
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.426 182939 DEBUG nova.virt.libvirt.vif [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:55:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-767265430',display_name='tempest-ListServerFiltersTestJSON-instance-767265430',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-767265430',id=62,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:55:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70b1c9f8be0042aa8de9841a26729700',ramdisk_id='',reservation_id='r-jkx275db',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1547380946',owner_user_name='tempest-ListServerFiltersTestJSON-1547380946-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:55:52Z,user_data=None,user_id='7e79b904cb8a49f990b05eb0ed72fdf4',uuid=70e91a38-1e04-4d71-93e1-4b946f228d7e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "address": "fa:16:3e:f0:34:e9", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9222eb38-8c", "ovs_interfaceid": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.427 182939 DEBUG nova.network.os_vif_util [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converting VIF {"id": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "address": "fa:16:3e:f0:34:e9", "network": {"id": "a78bfb22-a192-4dbe-a117-9f8a59130e27", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-9709523-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70b1c9f8be0042aa8de9841a26729700", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9222eb38-8c", "ovs_interfaceid": "9222eb38-8c2b-4811-ba8b-69dee7a49f2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.428 182939 DEBUG nova.network.os_vif_util [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:34:e9,bridge_name='br-int',has_traffic_filtering=True,id=9222eb38-8c2b-4811-ba8b-69dee7a49f2e,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9222eb38-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.428 182939 DEBUG os_vif [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:34:e9,bridge_name='br-int',has_traffic_filtering=True,id=9222eb38-8c2b-4811-ba8b-69dee7a49f2e,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9222eb38-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.430 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.431 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9222eb38-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.435 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.436 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.437 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.439 182939 INFO os_vif [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:34:e9,bridge_name='br-int',has_traffic_filtering=True,id=9222eb38-8c2b-4811-ba8b-69dee7a49f2e,network=Network(a78bfb22-a192-4dbe-a117-9f8a59130e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9222eb38-8c')
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.440 182939 INFO nova.virt.libvirt.driver [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Deleting instance files /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e_del
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.441 182939 INFO nova.virt.libvirt.driver [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Deletion of /var/lib/nova/instances/70e91a38-1e04-4d71-93e1-4b946f228d7e_del complete
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.475 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 9222eb38-8c2b-4811-ba8b-69dee7a49f2e in datapath a78bfb22-a192-4dbe-a117-9f8a59130e27 unbound from our chassis
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.476 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a78bfb22-a192-4dbe-a117-9f8a59130e27, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.477 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[227880a6-d0d6-42ca-b3ee-470ccfb81045]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.478 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 namespace which is not needed anymore
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.511 182939 INFO nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Took 6.65 seconds to spawn the instance on the hypervisor.
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.511 182939 DEBUG nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.577 182939 INFO nova.compute.manager [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Took 0.46 seconds to destroy the instance on the hypervisor.
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.577 182939 DEBUG oslo.service.loopingcall [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.577 182939 DEBUG nova.compute.manager [-] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.578 182939 DEBUG nova.network.neutron [-] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.581 182939 DEBUG nova.network.neutron [req-51fd5771-c61e-49b1-adb8-8988ef96579a req-f61cf08a-e597-44b0-bcae-381a6e750fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Updated VIF entry in instance network info cache for port efba6c4f-8bb6-4730-b537-68853c9b33aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.582 182939 DEBUG nova.network.neutron [req-51fd5771-c61e-49b1-adb8-8988ef96579a req-f61cf08a-e597-44b0-bcae-381a6e750fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Updating instance_info_cache with network_info: [{"id": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "address": "fa:16:3e:25:f0:96", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefba6c4f-8b", "ovs_interfaceid": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:21 compute-0 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[220583]: [NOTICE]   (220587) : haproxy version is 2.8.14-c23fe91
Jan 21 23:56:21 compute-0 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[220583]: [NOTICE]   (220587) : path to executable is /usr/sbin/haproxy
Jan 21 23:56:21 compute-0 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[220583]: [WARNING]  (220587) : Exiting Master process...
Jan 21 23:56:21 compute-0 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[220583]: [WARNING]  (220587) : Exiting Master process...
Jan 21 23:56:21 compute-0 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[220583]: [ALERT]    (220587) : Current worker (220589) exited with code 143 (Terminated)
Jan 21 23:56:21 compute-0 neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27[220583]: [WARNING]  (220587) : All workers exited. Exiting... (0)
Jan 21 23:56:21 compute-0 systemd[1]: libpod-99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89.scope: Deactivated successfully.
Jan 21 23:56:21 compute-0 podman[220988]: 2026-01-21 23:56:21.600419757 +0000 UTC m=+0.045039567 container died 99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.607 182939 DEBUG oslo_concurrency.lockutils [req-51fd5771-c61e-49b1-adb8-8988ef96579a req-f61cf08a-e597-44b0-bcae-381a6e750fe2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c83d6412-f55f-4cbb-92ef-d461894cfdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:56:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89-userdata-shm.mount: Deactivated successfully.
Jan 21 23:56:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-0f6990b2f8a9261035cdfabe49db0733a624b3b330de3dc65889aa411f14d581-merged.mount: Deactivated successfully.
Jan 21 23:56:21 compute-0 podman[220988]: 2026-01-21 23:56:21.634249074 +0000 UTC m=+0.078868874 container cleanup 99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:56:21 compute-0 systemd[1]: libpod-conmon-99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89.scope: Deactivated successfully.
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.648 182939 INFO nova.compute.manager [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Took 7.43 seconds to build instance.
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.673 182939 DEBUG oslo_concurrency.lockutils [None req-38ece97f-025d-4e62-9204-d5dc562a12a1 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.650s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:21 compute-0 podman[221016]: 2026-01-21 23:56:21.703357376 +0000 UTC m=+0.047430294 container remove 99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.712 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb2af9c-cded-4a54-9fe3-aee253a52252]: (4, ('Wed Jan 21 11:56:21 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 (99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89)\n99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89\nWed Jan 21 11:56:21 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 (99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89)\n99215c0adf3f1bcd7a75a175dd3f87c228f878283fe6059197bd31f2cb109f89\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.714 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a521d124-3a51-4a4e-ac03-1b4e84005ed6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.717 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa78bfb22-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.718 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-0 kernel: tapa78bfb22-a0: left promiscuous mode
Jan 21 23:56:21 compute-0 nova_compute[182935]: 2026-01-21 23:56:21.734 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.737 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[481dc454-bb4b-4587-bc4d-46c10664f874]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.753 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f194bfbe-4647-44b2-b19e-73e99e5e2b6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.755 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f2d5420c-d72b-4ec9-bdb1-f342710c628a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.773 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1d3ea7b1-cfda-4480-bc9e-b39d4f70af51]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425392, 'reachable_time': 27420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221030, 'error': None, 'target': 'ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:21 compute-0 systemd[1]: run-netns-ovnmeta\x2da78bfb22\x2da192\x2d4dbe\x2da117\x2d9f8a59130e27.mount: Deactivated successfully.
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.775 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a78bfb22-a192-4dbe-a117-9f8a59130e27 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:56:21 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:21.776 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[258ebce5-6c6d-4854-b098-b19864947acc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:23 compute-0 sshd-session[221031]: Invalid user weblogic from 188.166.69.60 port 56122
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.140 182939 DEBUG nova.network.neutron [-] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.168 182939 INFO nova.compute.manager [-] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Took 1.59 seconds to deallocate network for instance.
Jan 21 23:56:23 compute-0 sshd-session[221031]: Connection closed by invalid user weblogic 188.166.69.60 port 56122 [preauth]
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.305 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000042', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '414437860afc460b9e86d674975e9d1f', 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'hostId': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.311 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c83d6412-f55f-4cbb-92ef-d461894cfdce / tapefba6c4f-8b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.311 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a7c1268-3e1e-4e82-9e2f-0ded40ffb337', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000042-c83d6412-f55f-4cbb-92ef-d461894cfdce-tapefba6c4f-8b', 'timestamp': '2026-01-21T23:56:23.307104', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'tapefba6c4f-8b', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:f0:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefba6c4f-8b'}, 'message_id': 'ca4896de-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.101436946, 'message_signature': '73c14c8ee944eccb0a8f33a6608e4f810ec51eaacd1e4b440732eba834a2a546'}]}, 'timestamp': '2026-01-21 23:56:23.313067', '_unique_id': 'bf22f371a36c47b68bb7ed5f5218c511'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.317 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.319 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d74c183-ff0c-4bce-bf3b-20bad8bb39a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000042-c83d6412-f55f-4cbb-92ef-d461894cfdce-tapefba6c4f-8b', 'timestamp': '2026-01-21T23:56:23.319303', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'tapefba6c4f-8b', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:f0:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefba6c4f-8b'}, 'message_id': 'ca49b0f0-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.101436946, 'message_signature': '2edb61616a3663d1f96fc566826c902c94d05d7d77c62c4456080c04d7e13923'}]}, 'timestamp': '2026-01-21 23:56:23.319851', '_unique_id': '4908ec5f0df64eb1b8bf3f025eb8431e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.320 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.322 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f05dbe4-c7a5-434f-9cfc-22a3f803d105', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000042-c83d6412-f55f-4cbb-92ef-d461894cfdce-tapefba6c4f-8b', 'timestamp': '2026-01-21T23:56:23.322297', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'tapefba6c4f-8b', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:f0:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefba6c4f-8b'}, 'message_id': 'ca4a2670-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.101436946, 'message_signature': 'ac66a7b5a8bc8b96f12e9e523672149e0aef0576da5831c2adb308406e8284d8'}]}, 'timestamp': '2026-01-21 23:56:23.322855', '_unique_id': '57406ff9834d4226bddf4411d041c0b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.323 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.343 182939 DEBUG nova.compute.manager [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Received event network-vif-unplugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.346 182939 DEBUG oslo_concurrency.lockutils [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.350 182939 DEBUG oslo_concurrency.lockutils [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.352 182939 DEBUG oslo_concurrency.lockutils [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.353 182939 DEBUG nova.compute.manager [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] No waiting events found dispatching network-vif-unplugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.355 182939 WARNING nova.compute.manager [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Received unexpected event network-vif-unplugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e for instance with vm_state deleted and task_state None.
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.356 182939 DEBUG nova.compute.manager [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Received event network-vif-plugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.360 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.361 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.359 182939 DEBUG oslo_concurrency.lockutils [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bd3f810-d9cf-4ccb-8628-66cfe2147d0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-vda', 'timestamp': '2026-01-21T23:56:23.325298', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca4ff87a-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': '5b730fdf1c01984ef37bd73b141cacf03adca7a8e6e0ef74d70719b7aa19d33c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-sda', 'timestamp': '2026-01-21T23:56:23.325298', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca500f40-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': '133c1459e2412c3d33fa9c9d2f77026a640d44b415e4daeefb519167b26cb004'}]}, 'timestamp': '2026-01-21 23:56:23.361539', '_unique_id': '50c5511684324c5aa13c32cdf52c47c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.363 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.364 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.364 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.364 182939 DEBUG oslo_concurrency.lockutils [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '671596cd-37ed-4b86-b6fc-86b3403562ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000042-c83d6412-f55f-4cbb-92ef-d461894cfdce-tapefba6c4f-8b', 'timestamp': '2026-01-21T23:56:23.364875', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'tapefba6c4f-8b', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:f0:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefba6c4f-8b'}, 'message_id': 'ca50a536-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.101436946, 'message_signature': 'ceb52f9215b656b1f8bafcaaeb65652c725b39a48b15a106cdb86fcbd6906ff4'}]}, 'timestamp': '2026-01-21 23:56:23.365398', '_unique_id': 'fabe50b66099448a8e5656e3a044dfe0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.366 182939 DEBUG oslo_concurrency.lockutils [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.366 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.367 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.367 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.368 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.368 182939 DEBUG nova.compute.manager [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] No waiting events found dispatching network-vif-plugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.370 182939 WARNING nova.compute.manager [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Received unexpected event network-vif-plugged-9222eb38-8c2b-4811-ba8b-69dee7a49f2e for instance with vm_state deleted and task_state None.
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed8d0abf-0350-4292-a94f-725f7bdda22c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-vda', 'timestamp': '2026-01-21T23:56:23.367785', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca51173c-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': '188bcb5fe3dd46d07ae22a5ef6081c3e277599ef416915a634b36d85e5d943f2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-sda', 'timestamp': '2026-01-21T23:56:23.367785', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca512d62-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': '4b338146b80c7b2eb1a2931cfdd24a94f04bae0baa27d5b4d1690d0da0b21a90'}]}, 'timestamp': '2026-01-21 23:56:23.368883', '_unique_id': '28f2f48002cc437481218580f618f9f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.371 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.371 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.371 182939 DEBUG nova.compute.manager [req-09b0d55d-fc8c-4039-99ef-3c0ba4a6bdef req-5e089d0c-68ba-4188-b6dc-4347a1cae637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Received event network-vif-deleted-9222eb38-8c2b-4811-ba8b-69dee7a49f2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2d3d3bf-ff69-4ad5-945b-79e8ce7fe5a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000042-c83d6412-f55f-4cbb-92ef-d461894cfdce-tapefba6c4f-8b', 'timestamp': '2026-01-21T23:56:23.371293', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'tapefba6c4f-8b', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:f0:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefba6c4f-8b'}, 'message_id': 'ca519f0e-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.101436946, 'message_signature': '5d31a9eb2528d1ed7c09f400647962a74db828f4de0f9b9bbb5d5386f699f6b0'}]}, 'timestamp': '2026-01-21 23:56:23.371883', '_unique_id': 'a79739fb8adb4e9fbe9afeff8d492013'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.373 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.374 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.393 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/cpu volume: 1860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd21748f-ed0d-434f-bb0b-377bb4f827e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1860000000, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'timestamp': '2026-01-21T23:56:23.374465', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ca551076-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.187787849, 'message_signature': '1f7f2ce780d47ced658706f7913da08991a45587ce395bcf0d3d3e6891488c6f'}]}, 'timestamp': '2026-01-21 23:56:23.394403', '_unique_id': '8f6efb1161e44a27ac41f21583d68d04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.395 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.397 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.397 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.397 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-2>]
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.398 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.398 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '831f3fc8-95a7-4ec6-b772-da2eb79094df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000042-c83d6412-f55f-4cbb-92ef-d461894cfdce-tapefba6c4f-8b', 'timestamp': '2026-01-21T23:56:23.398165', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'tapefba6c4f-8b', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:f0:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefba6c4f-8b'}, 'message_id': 'ca55b8e6-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.101436946, 'message_signature': 'ba3c4f331855a8132ea51c8c37397c395aa1da4ca0c13cc65fc9f7f11432bfe9'}]}, 'timestamp': '2026-01-21 23:56:23.398681', '_unique_id': '75f0721432e542b1a39a4f6bd47ff33d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.400 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.414 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.416 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e11ba94-4f37-4065-bc7e-7e3e39a571d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-vda', 'timestamp': '2026-01-21T23:56:23.401119', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca585c90-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.195608566, 'message_signature': '892c47058d2c72868290b7d31371fb5a053c6f02c7faa8e7f9ae47e9655bcbd7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-sda', 'timestamp': '2026-01-21T23:56:23.401119', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca58784c-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.195608566, 'message_signature': '901029d8442d9a826f38ddc2e6b1a792345587250cb9afad7953acc8a3623cd8'}]}, 'timestamp': '2026-01-21 23:56:23.416775', '_unique_id': '709da1875e9a4f90ad3724273c4b7e0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.418 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.419 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.420 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.420 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad340c40-25fa-4437-bfe2-410187bca48e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-vda', 'timestamp': '2026-01-21T23:56:23.420175', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca5919c8-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.195608566, 'message_signature': 'cdd9520bb4e20c25978e98940df124cee30b2a737be52ec10728760af7ac343f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 
'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-sda', 'timestamp': '2026-01-21T23:56:23.420175', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca59339a-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.195608566, 'message_signature': '343a76f5d097af2c56e1d45785440157fc091365c392c18290b92811b4e32025'}]}, 'timestamp': '2026-01-21 23:56:23.421520', '_unique_id': '8cb4235a88f84d63b20b6aa509a501dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.424 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.424 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.425 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '474c7bf8-f9f5-4d8e-96e9-55b1fe2c4762', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-vda', 'timestamp': '2026-01-21T23:56:23.424906', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca59d142-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.195608566, 'message_signature': '82f53b24710c6d390bd17f124d600c40842d4e0c503d22cdad56499e8ea757bb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 
'c83d6412-f55f-4cbb-92ef-d461894cfdce-sda', 'timestamp': '2026-01-21T23:56:23.424906', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca59ef88-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.195608566, 'message_signature': '308f1578bc856270a740bb2b2c91b588bcbd4271773236aed82e145c73d8e097'}]}, 'timestamp': '2026-01-21 23:56:23.426329', '_unique_id': '23d5d24cfdc04a008756a82cf733923b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.427 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.429 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.429 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.430 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c86f95fd-5fdf-46f5-8b28-14a04c92793a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-vda', 'timestamp': '2026-01-21T23:56:23.429780', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca5a917c-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': 'f66a16c4d7384316c9cdcac7b18890546a6b46a0285f78f978ea35a74c4e6113'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-sda', 'timestamp': '2026-01-21T23:56:23.429780', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca5aa8e2-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': '4b5f352eeb89c01eca658b2bcc18d5cb109fc34235eff14762225b474581382d'}]}, 'timestamp': '2026-01-21 23:56:23.431111', '_unique_id': 'bb155540e3b74ae6bc804acefee908e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.432 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.433 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.433 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.434 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance c83d6412-f55f-4cbb-92ef-d461894cfdce: ceilometer.compute.pollsters.NoVolumeException
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.434 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.434 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8253096-7a5f-412c-a448-d607da495029', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000042-c83d6412-f55f-4cbb-92ef-d461894cfdce-tapefba6c4f-8b', 'timestamp': '2026-01-21T23:56:23.434414', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'tapefba6c4f-8b', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:f0:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefba6c4f-8b'}, 'message_id': 'ca5b3fa0-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.101436946, 'message_signature': '71c683cab55534a49aa1f90686cceb44d32d2e45eaa7288f928a8aa2b0852b87'}]}, 'timestamp': '2026-01-21 23:56:23.434882', '_unique_id': '0afdb0403d574750b4f4e5a1b7620339'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.435 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.436 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.436 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f4c91b5-bd8c-4067-a366-38eb198c0400', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000042-c83d6412-f55f-4cbb-92ef-d461894cfdce-tapefba6c4f-8b', 'timestamp': '2026-01-21T23:56:23.436955', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'tapefba6c4f-8b', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:f0:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefba6c4f-8b'}, 'message_id': 'ca5ba2b0-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.101436946, 'message_signature': 'a3083722e1caf9d46072a6ecec52c3d824a6aea838571443235680bb4509a6a1'}]}, 'timestamp': '2026-01-21 23:56:23.437391', '_unique_id': '21e1c8250d2643f3bb24c1e8d7c06e4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.438 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.439 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.439 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.439 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '583f3d9e-e857-4229-aabb-80ed8096e978', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-vda', 'timestamp': '2026-01-21T23:56:23.439398', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca5c017e-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': '601d2eb1147aac08733354895785b7283a66f17ef7440a022f00256a114f3c1b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-sda', 'timestamp': '2026-01-21T23:56:23.439398', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca5c1380-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': '6ffcec624a6639991dfdadc8d81abc295e529e08b248acc4a811c70722c73f8b'}]}, 'timestamp': '2026-01-21 23:56:23.440277', '_unique_id': 'a467d2c8a8b24a5699630413ac559330'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.441 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.442 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.442 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.442 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-2>]
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.443 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.443 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.443 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-2>]
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.443 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.443 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.443 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServersNegativeTestJSON-server-1677728672-2>]
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.444 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.444 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.read.latency volume: 104097496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.444 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.read.latency volume: 347688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '664c04d6-9074-44d4-95f5-81ec3c3cf36c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 104097496, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-vda', 'timestamp': '2026-01-21T23:56:23.444312', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca5cc2ee-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': 'e3ac313aa7c86f4a44f526999e3ff65cc097dd6e8c4d860d8dd22041df8325d4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 347688, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-sda', 'timestamp': '2026-01-21T23:56:23.444312', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca5cd342-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': 'ade4da8fed93832895deea4bdccef999135c3da7f789620ad889cad760ea7f09'}]}, 'timestamp': '2026-01-21 23:56:23.445166', '_unique_id': 'b4e826a333834b2db633655fb4f3fea7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.446 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.447 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.447 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53de29c2-9a9f-4c14-802f-6e2e6396ab77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000042-c83d6412-f55f-4cbb-92ef-d461894cfdce-tapefba6c4f-8b', 'timestamp': '2026-01-21T23:56:23.447232', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'tapefba6c4f-8b', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:f0:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefba6c4f-8b'}, 'message_id': 'ca5d340e-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.101436946, 'message_signature': '7f2f23578b269a41945098380e99f554963ebec64c514434570f09f7718007e2'}]}, 'timestamp': '2026-01-21 23:56:23.447662', '_unique_id': 'ce0bc407950b49fe80660f0dfa968be3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.448 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.449 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.449 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.449 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc69debc-0a67-4446-ab95-b2311ac5f228', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-vda', 'timestamp': '2026-01-21T23:56:23.449439', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ca5d8832-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': 'c5481df929b206dd960a7f9b64c3c81b56078d6875b8efc5b3e02e9fea36082d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce-sda', 'timestamp': '2026-01-21T23:56:23.449439', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'instance-00000042', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ca5d94ee-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.119592359, 'message_signature': 'd3a3357c8428e1e7ee96ef815ec8d1f5bc1b89ef0beb6f1eb747e87078ab6d8f'}]}, 'timestamp': '2026-01-21 23:56:23.450067', '_unique_id': 'f8899ab069d54e5d95c61ce6c99f2991'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.450 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.451 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.451 12 DEBUG ceilometer.compute.pollsters [-] c83d6412-f55f-4cbb-92ef-d461894cfdce/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c41d671e-098d-4c63-8a3b-2b3309a1a6c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a4a4a5f3c9f4c5091261592272bcb81', 'user_name': None, 'project_id': '414437860afc460b9e86d674975e9d1f', 'project_name': None, 'resource_id': 'instance-00000042-c83d6412-f55f-4cbb-92ef-d461894cfdce-tapefba6c4f-8b', 'timestamp': '2026-01-21T23:56:23.451599', 'resource_metadata': {'display_name': 'tempest-ListServersNegativeTestJSON-server-1677728672-2', 'name': 'tapefba6c4f-8b', 'instance_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'instance_type': 'm1.nano', 'host': '8a767d268d106d35f50a33d3248905e3f9928c1dbbe9759dbaf8ae72', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:25:f0:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefba6c4f-8b'}, 'message_id': 'ca5ddc9c-f724-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4286.101436946, 'message_signature': 'e17699155862f0a2e7c6e55894b6a90467d00eef156ccafea8ca86033d2337c9'}]}, 'timestamp': '2026-01-21 23:56:23.451975', '_unique_id': '506ac3f75c524f10bcf6d2dfe63e33a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:56:23.452 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.583 182939 DEBUG oslo_concurrency.lockutils [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.584 182939 DEBUG oslo_concurrency.lockutils [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.659 182939 DEBUG nova.compute.manager [req-db8b4f3b-051e-44b1-8c83-7edfe8f168bc req-d8b9d609-aa88-42ce-ad5a-79c59a8b5321 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Received event network-vif-plugged-efba6c4f-8bb6-4730-b537-68853c9b33aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.659 182939 DEBUG oslo_concurrency.lockutils [req-db8b4f3b-051e-44b1-8c83-7edfe8f168bc req-d8b9d609-aa88-42ce-ad5a-79c59a8b5321 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.660 182939 DEBUG oslo_concurrency.lockutils [req-db8b4f3b-051e-44b1-8c83-7edfe8f168bc req-d8b9d609-aa88-42ce-ad5a-79c59a8b5321 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.660 182939 DEBUG oslo_concurrency.lockutils [req-db8b4f3b-051e-44b1-8c83-7edfe8f168bc req-d8b9d609-aa88-42ce-ad5a-79c59a8b5321 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.660 182939 DEBUG nova.compute.manager [req-db8b4f3b-051e-44b1-8c83-7edfe8f168bc req-d8b9d609-aa88-42ce-ad5a-79c59a8b5321 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] No waiting events found dispatching network-vif-plugged-efba6c4f-8bb6-4730-b537-68853c9b33aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.661 182939 WARNING nova.compute.manager [req-db8b4f3b-051e-44b1-8c83-7edfe8f168bc req-d8b9d609-aa88-42ce-ad5a-79c59a8b5321 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Received unexpected event network-vif-plugged-efba6c4f-8bb6-4730-b537-68853c9b33aa for instance with vm_state active and task_state None.
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.673 182939 DEBUG nova.compute.provider_tree [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.704 182939 DEBUG nova.scheduler.client.report [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.733 182939 DEBUG oslo_concurrency.lockutils [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.762 182939 INFO nova.scheduler.client.report [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Deleted allocations for instance 70e91a38-1e04-4d71-93e1-4b946f228d7e
Jan 21 23:56:23 compute-0 nova_compute[182935]: 2026-01-21 23:56:23.878 182939 DEBUG oslo_concurrency.lockutils [None req-b0a94847-5388-48fc-8201-1a9b35e4b663 7e79b904cb8a49f990b05eb0ed72fdf4 70b1c9f8be0042aa8de9841a26729700 - - default default] Lock "70e91a38-1e04-4d71-93e1-4b946f228d7e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.791s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:25 compute-0 nova_compute[182935]: 2026-01-21 23:56:25.990 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:26 compute-0 nova_compute[182935]: 2026-01-21 23:56:26.433 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:26 compute-0 nova_compute[182935]: 2026-01-21 23:56:26.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:26 compute-0 nova_compute[182935]: 2026-01-21 23:56:26.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:56:26 compute-0 nova_compute[182935]: 2026-01-21 23:56:26.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:56:26 compute-0 nova_compute[182935]: 2026-01-21 23:56:26.965 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-c83d6412-f55f-4cbb-92ef-d461894cfdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:56:26 compute-0 nova_compute[182935]: 2026-01-21 23:56:26.965 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-c83d6412-f55f-4cbb-92ef-d461894cfdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:56:26 compute-0 nova_compute[182935]: 2026-01-21 23:56:26.966 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:56:26 compute-0 nova_compute[182935]: 2026-01-21 23:56:26.966 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c83d6412-f55f-4cbb-92ef-d461894cfdce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.665 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Updating instance_info_cache with network_info: [{"id": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "address": "fa:16:3e:25:f0:96", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefba6c4f-8b", "ovs_interfaceid": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.686 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-c83d6412-f55f-4cbb-92ef-d461894cfdce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.686 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.686 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.687 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.687 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.687 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.687 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.688 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.713 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.714 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.714 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.714 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.790 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.854 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.855 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:56:29 compute-0 nova_compute[182935]: 2026-01-21 23:56:29.910 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.067 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.068 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5565MB free_disk=73.27067184448242GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.069 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.069 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.142 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance c83d6412-f55f-4cbb-92ef-d461894cfdce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.143 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.143 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.204 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.224 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.250 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.250 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:30 compute-0 nova_compute[182935]: 2026-01-21 23:56:30.992 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:31 compute-0 ovn_controller[95047]: 2026-01-21T23:56:31Z|00237|binding|INFO|Releasing lport 8bc16eeb-6666-4300-9ce8-0a810442a173 from this chassis (sb_readonly=0)
Jan 21 23:56:31 compute-0 nova_compute[182935]: 2026-01-21 23:56:31.435 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:31 compute-0 nova_compute[182935]: 2026-01-21 23:56:31.474 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:32 compute-0 nova_compute[182935]: 2026-01-21 23:56:32.357 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:32 compute-0 nova_compute[182935]: 2026-01-21 23:56:32.360 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:33 compute-0 podman[221059]: 2026-01-21 23:56:33.692078195 +0000 UTC m=+0.053638682 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:56:33 compute-0 podman[221058]: 2026-01-21 23:56:33.730931313 +0000 UTC m=+0.093944075 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.154 182939 DEBUG oslo_concurrency.lockutils [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "c83d6412-f55f-4cbb-92ef-d461894cfdce" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.155 182939 DEBUG oslo_concurrency.lockutils [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.155 182939 DEBUG oslo_concurrency.lockutils [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.155 182939 DEBUG oslo_concurrency.lockutils [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.156 182939 DEBUG oslo_concurrency.lockutils [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.169 182939 INFO nova.compute.manager [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Terminating instance
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.180 182939 DEBUG nova.compute.manager [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:56:34 compute-0 kernel: tapefba6c4f-8b (unregistering): left promiscuous mode
Jan 21 23:56:34 compute-0 NetworkManager[55139]: <info>  [1769039794.1977] device (tapefba6c4f-8b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.206 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-0 ovn_controller[95047]: 2026-01-21T23:56:34Z|00238|binding|INFO|Releasing lport efba6c4f-8bb6-4730-b537-68853c9b33aa from this chassis (sb_readonly=0)
Jan 21 23:56:34 compute-0 ovn_controller[95047]: 2026-01-21T23:56:34Z|00239|binding|INFO|Setting lport efba6c4f-8bb6-4730-b537-68853c9b33aa down in Southbound
Jan 21 23:56:34 compute-0 ovn_controller[95047]: 2026-01-21T23:56:34Z|00240|binding|INFO|Removing iface tapefba6c4f-8b ovn-installed in OVS
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.220 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:f0:96 10.100.0.7'], port_security=['fa:16:3e:25:f0:96 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'c83d6412-f55f-4cbb-92ef-d461894cfdce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-835f4434-3fa6-458b-b79c-b27830f531cf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '414437860afc460b9e86d674975e9d1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '30db0ce4-28a9-4add-b257-f90dc081c48d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2a61a9c-1832-4a5f-89c7-e09ac8a1046e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=efba6c4f-8bb6-4730-b537-68853c9b33aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.222 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.223 104408 INFO neutron.agent.ovn.metadata.agent [-] Port efba6c4f-8bb6-4730-b537-68853c9b33aa in datapath 835f4434-3fa6-458b-b79c-b27830f531cf unbound from our chassis
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.224 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 835f4434-3fa6-458b-b79c-b27830f531cf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.226 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0825b2-c069-4846-a34f-1bd726d18b32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.227 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf namespace which is not needed anymore
Jan 21 23:56:34 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000042.scope: Deactivated successfully.
Jan 21 23:56:34 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000042.scope: Consumed 12.743s CPU time.
Jan 21 23:56:34 compute-0 systemd-machined[154182]: Machine qemu-33-instance-00000042 terminated.
Jan 21 23:56:34 compute-0 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220948]: [NOTICE]   (220961) : haproxy version is 2.8.14-c23fe91
Jan 21 23:56:34 compute-0 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220948]: [NOTICE]   (220961) : path to executable is /usr/sbin/haproxy
Jan 21 23:56:34 compute-0 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220948]: [WARNING]  (220961) : Exiting Master process...
Jan 21 23:56:34 compute-0 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220948]: [ALERT]    (220961) : Current worker (220963) exited with code 143 (Terminated)
Jan 21 23:56:34 compute-0 neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf[220948]: [WARNING]  (220961) : All workers exited. Exiting... (0)
Jan 21 23:56:34 compute-0 systemd[1]: libpod-88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2.scope: Deactivated successfully.
Jan 21 23:56:34 compute-0 podman[221127]: 2026-01-21 23:56:34.361697309 +0000 UTC m=+0.046668486 container died 88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:56:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2-userdata-shm.mount: Deactivated successfully.
Jan 21 23:56:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-de7b40bbb129e24fb8fa7c9ada3b5fee3b1f2ad50cbc8e7e70cf7b056c2d7b89-merged.mount: Deactivated successfully.
Jan 21 23:56:34 compute-0 podman[221127]: 2026-01-21 23:56:34.398848246 +0000 UTC m=+0.083819423 container cleanup 88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 23:56:34 compute-0 systemd[1]: libpod-conmon-88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2.scope: Deactivated successfully.
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.458 182939 INFO nova.virt.libvirt.driver [-] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Instance destroyed successfully.
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.460 182939 DEBUG nova.objects.instance [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lazy-loading 'resources' on Instance uuid c83d6412-f55f-4cbb-92ef-d461894cfdce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:56:34 compute-0 podman[221163]: 2026-01-21 23:56:34.482607936 +0000 UTC m=+0.059597864 container remove 88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.488 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[236431c2-99b8-43ab-bec1-07c0b5064f89]: (4, ('Wed Jan 21 11:56:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf (88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2)\n88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2\nWed Jan 21 11:56:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf (88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2)\n88462ad4ef59ac5b9cb5ad7f43373361705c147bbb43ce39272ee6f552e255e2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.491 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[52383831-e57b-4e0c-8e77-984c15c42297]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.491 182939 DEBUG nova.virt.libvirt.vif [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:56:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1677728672',display_name='tempest-ListServersNegativeTestJSON-server-1677728672-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1677728672-2',id=66,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-21T23:56:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='414437860afc460b9e86d674975e9d1f',ramdisk_id='',reservation_id='r-dznclk2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1787990789',owner_user_name='tempest-ListServersNegativeTestJSON-1787990789-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:56:21Z,user_data=None,user_id='9a4a4a5f3c9f4c5091261592272bcb81',uuid=c83d6412-f55f-4cbb-92ef-d461894cfdce,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "address": "fa:16:3e:25:f0:96", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefba6c4f-8b", "ovs_interfaceid": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.492 182939 DEBUG nova.network.os_vif_util [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converting VIF {"id": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "address": "fa:16:3e:25:f0:96", "network": {"id": "835f4434-3fa6-458b-b79c-b27830f531cf", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1274650069-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "414437860afc460b9e86d674975e9d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefba6c4f-8b", "ovs_interfaceid": "efba6c4f-8bb6-4730-b537-68853c9b33aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.493 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap835f4434-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.493 182939 DEBUG nova.network.os_vif_util [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:25:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=efba6c4f-8bb6-4730-b537-68853c9b33aa,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefba6c4f-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.494 182939 DEBUG os_vif [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:25:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=efba6c4f-8bb6-4730-b537-68853c9b33aa,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefba6c4f-8b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:56:34 compute-0 kernel: tap835f4434-30: left promiscuous mode
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.498 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.499 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefba6c4f-8b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.500 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.502 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.512 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.516 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ba78a2df-158b-4d54-8878-fff3c4d1672d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.517 182939 INFO os_vif [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:25:f0:96,bridge_name='br-int',has_traffic_filtering=True,id=efba6c4f-8bb6-4730-b537-68853c9b33aa,network=Network(835f4434-3fa6-458b-b79c-b27830f531cf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefba6c4f-8b')
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.518 182939 INFO nova.virt.libvirt.driver [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Deleting instance files /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce_del
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.519 182939 INFO nova.virt.libvirt.driver [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Deletion of /var/lib/nova/instances/c83d6412-f55f-4cbb-92ef-d461894cfdce_del complete
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.529 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[43a6d078-e856-41f9-b207-7e23d120844a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.531 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[15a30290-0962-4ffb-8bfa-8e8630b93942]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.552 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ab05ce2c-c7ea-477c-8215-d2e92c791d0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428341, 'reachable_time': 22593, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221191, 'error': None, 'target': 'ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.555 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-835f4434-3fa6-458b-b79c-b27830f531cf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:56:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:56:34.555 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab5e16b-182e-4941-a7a6-d02d9982860b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:56:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d835f4434\x2d3fa6\x2d458b\x2db79c\x2db27830f531cf.mount: Deactivated successfully.
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.624 182939 INFO nova.compute.manager [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Took 0.44 seconds to destroy the instance on the hypervisor.
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.625 182939 DEBUG oslo.service.loopingcall [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.626 182939 DEBUG nova.compute.manager [-] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.626 182939 DEBUG nova.network.neutron [-] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.687 182939 DEBUG nova.compute.manager [req-a03e2193-5458-4065-856a-dd192ea45894 req-745f260a-f99a-40e9-942f-9208fccbbc10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Received event network-vif-unplugged-efba6c4f-8bb6-4730-b537-68853c9b33aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.688 182939 DEBUG oslo_concurrency.lockutils [req-a03e2193-5458-4065-856a-dd192ea45894 req-745f260a-f99a-40e9-942f-9208fccbbc10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.689 182939 DEBUG oslo_concurrency.lockutils [req-a03e2193-5458-4065-856a-dd192ea45894 req-745f260a-f99a-40e9-942f-9208fccbbc10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.689 182939 DEBUG oslo_concurrency.lockutils [req-a03e2193-5458-4065-856a-dd192ea45894 req-745f260a-f99a-40e9-942f-9208fccbbc10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.690 182939 DEBUG nova.compute.manager [req-a03e2193-5458-4065-856a-dd192ea45894 req-745f260a-f99a-40e9-942f-9208fccbbc10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] No waiting events found dispatching network-vif-unplugged-efba6c4f-8bb6-4730-b537-68853c9b33aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:34 compute-0 nova_compute[182935]: 2026-01-21 23:56:34.690 182939 DEBUG nova.compute.manager [req-a03e2193-5458-4065-856a-dd192ea45894 req-745f260a-f99a-40e9-942f-9208fccbbc10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Received event network-vif-unplugged-efba6c4f-8bb6-4730-b537-68853c9b33aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:56:35 compute-0 nova_compute[182935]: 2026-01-21 23:56:35.582 182939 DEBUG nova.network.neutron [-] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:56:35 compute-0 nova_compute[182935]: 2026-01-21 23:56:35.601 182939 INFO nova.compute.manager [-] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Took 0.97 seconds to deallocate network for instance.
Jan 21 23:56:35 compute-0 nova_compute[182935]: 2026-01-21 23:56:35.687 182939 DEBUG oslo_concurrency.lockutils [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:35 compute-0 nova_compute[182935]: 2026-01-21 23:56:35.688 182939 DEBUG oslo_concurrency.lockutils [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:35 compute-0 nova_compute[182935]: 2026-01-21 23:56:35.795 182939 DEBUG nova.compute.provider_tree [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:56:35 compute-0 nova_compute[182935]: 2026-01-21 23:56:35.818 182939 DEBUG nova.scheduler.client.report [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:56:35 compute-0 nova_compute[182935]: 2026-01-21 23:56:35.853 182939 DEBUG oslo_concurrency.lockutils [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:35 compute-0 nova_compute[182935]: 2026-01-21 23:56:35.880 182939 INFO nova.scheduler.client.report [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Deleted allocations for instance c83d6412-f55f-4cbb-92ef-d461894cfdce
Jan 21 23:56:35 compute-0 nova_compute[182935]: 2026-01-21 23:56:35.991 182939 DEBUG oslo_concurrency.lockutils [None req-8e36e3fe-7139-47bd-b812-caad390de33e 9a4a4a5f3c9f4c5091261592272bcb81 414437860afc460b9e86d674975e9d1f - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.836s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:35 compute-0 nova_compute[182935]: 2026-01-21 23:56:35.994 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:36 compute-0 nova_compute[182935]: 2026-01-21 23:56:36.437 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039781.3835945, 70e91a38-1e04-4d71-93e1-4b946f228d7e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:36 compute-0 nova_compute[182935]: 2026-01-21 23:56:36.438 182939 INFO nova.compute.manager [-] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] VM Stopped (Lifecycle Event)
Jan 21 23:56:36 compute-0 nova_compute[182935]: 2026-01-21 23:56:36.463 182939 DEBUG nova.compute.manager [None req-d7728c33-1e13-469a-bf55-d12da460b284 - - - - - -] [instance: 70e91a38-1e04-4d71-93e1-4b946f228d7e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:36 compute-0 nova_compute[182935]: 2026-01-21 23:56:36.810 182939 DEBUG nova.compute.manager [req-d4fb14e7-1ef6-47f2-a869-d15b3e844bab req-a15ca9df-6c20-4e0d-8852-389b5e5b1f77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Received event network-vif-plugged-efba6c4f-8bb6-4730-b537-68853c9b33aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:36 compute-0 nova_compute[182935]: 2026-01-21 23:56:36.811 182939 DEBUG oslo_concurrency.lockutils [req-d4fb14e7-1ef6-47f2-a869-d15b3e844bab req-a15ca9df-6c20-4e0d-8852-389b5e5b1f77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:56:36 compute-0 nova_compute[182935]: 2026-01-21 23:56:36.811 182939 DEBUG oslo_concurrency.lockutils [req-d4fb14e7-1ef6-47f2-a869-d15b3e844bab req-a15ca9df-6c20-4e0d-8852-389b5e5b1f77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:56:36 compute-0 nova_compute[182935]: 2026-01-21 23:56:36.811 182939 DEBUG oslo_concurrency.lockutils [req-d4fb14e7-1ef6-47f2-a869-d15b3e844bab req-a15ca9df-6c20-4e0d-8852-389b5e5b1f77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c83d6412-f55f-4cbb-92ef-d461894cfdce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:56:36 compute-0 nova_compute[182935]: 2026-01-21 23:56:36.811 182939 DEBUG nova.compute.manager [req-d4fb14e7-1ef6-47f2-a869-d15b3e844bab req-a15ca9df-6c20-4e0d-8852-389b5e5b1f77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] No waiting events found dispatching network-vif-plugged-efba6c4f-8bb6-4730-b537-68853c9b33aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:56:36 compute-0 nova_compute[182935]: 2026-01-21 23:56:36.811 182939 WARNING nova.compute.manager [req-d4fb14e7-1ef6-47f2-a869-d15b3e844bab req-a15ca9df-6c20-4e0d-8852-389b5e5b1f77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Received unexpected event network-vif-plugged-efba6c4f-8bb6-4730-b537-68853c9b33aa for instance with vm_state deleted and task_state None.
Jan 21 23:56:36 compute-0 nova_compute[182935]: 2026-01-21 23:56:36.812 182939 DEBUG nova.compute.manager [req-d4fb14e7-1ef6-47f2-a869-d15b3e844bab req-a15ca9df-6c20-4e0d-8852-389b5e5b1f77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Received event network-vif-deleted-efba6c4f-8bb6-4730-b537-68853c9b33aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:56:37 compute-0 nova_compute[182935]: 2026-01-21 23:56:37.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:56:39 compute-0 nova_compute[182935]: 2026-01-21 23:56:39.501 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:40 compute-0 nova_compute[182935]: 2026-01-21 23:56:40.474 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:40 compute-0 podman[221193]: 2026-01-21 23:56:40.692684103 +0000 UTC m=+0.065466025 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:56:40 compute-0 nova_compute[182935]: 2026-01-21 23:56:40.998 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:44 compute-0 nova_compute[182935]: 2026-01-21 23:56:44.503 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:44 compute-0 podman[221217]: 2026-01-21 23:56:44.781119884 +0000 UTC m=+0.084861287 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 21 23:56:46 compute-0 nova_compute[182935]: 2026-01-21 23:56:46.000 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:49 compute-0 nova_compute[182935]: 2026-01-21 23:56:49.456 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039794.4544349, c83d6412-f55f-4cbb-92ef-d461894cfdce => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:56:49 compute-0 nova_compute[182935]: 2026-01-21 23:56:49.458 182939 INFO nova.compute.manager [-] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] VM Stopped (Lifecycle Event)
Jan 21 23:56:49 compute-0 nova_compute[182935]: 2026-01-21 23:56:49.492 182939 DEBUG nova.compute.manager [None req-a0bec85b-d25e-4dcf-8535-ee5875c02754 - - - - - -] [instance: c83d6412-f55f-4cbb-92ef-d461894cfdce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:56:49 compute-0 nova_compute[182935]: 2026-01-21 23:56:49.504 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:51 compute-0 nova_compute[182935]: 2026-01-21 23:56:51.002 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:51 compute-0 podman[221236]: 2026-01-21 23:56:51.707169181 +0000 UTC m=+0.074539691 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Jan 21 23:56:51 compute-0 podman[221237]: 2026-01-21 23:56:51.718105623 +0000 UTC m=+0.075842033 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 21 23:56:54 compute-0 nova_compute[182935]: 2026-01-21 23:56:54.506 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:56 compute-0 nova_compute[182935]: 2026-01-21 23:56:56.003 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:56:59 compute-0 nova_compute[182935]: 2026-01-21 23:56:59.508 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:01 compute-0 nova_compute[182935]: 2026-01-21 23:57:01.005 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:03.191 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:03.192 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:03.192 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:03 compute-0 nova_compute[182935]: 2026-01-21 23:57:03.867 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:03.867 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:57:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:03.869 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:57:04 compute-0 nova_compute[182935]: 2026-01-21 23:57:04.510 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:04 compute-0 podman[221280]: 2026-01-21 23:57:04.707394799 +0000 UTC m=+0.075365761 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:57:04 compute-0 podman[221279]: 2026-01-21 23:57:04.72464026 +0000 UTC m=+0.094070727 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 21 23:57:04 compute-0 sshd-session[221277]: Invalid user weblogic from 188.166.69.60 port 45654
Jan 21 23:57:04 compute-0 sshd-session[221277]: Connection closed by invalid user weblogic 188.166.69.60 port 45654 [preauth]
Jan 21 23:57:06 compute-0 nova_compute[182935]: 2026-01-21 23:57:06.007 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:09 compute-0 nova_compute[182935]: 2026-01-21 23:57:09.513 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:09 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:09.871 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:11 compute-0 nova_compute[182935]: 2026-01-21 23:57:11.011 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:11 compute-0 podman[221330]: 2026-01-21 23:57:11.695724723 +0000 UTC m=+0.069593763 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 21 23:57:14 compute-0 nova_compute[182935]: 2026-01-21 23:57:14.514 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:15 compute-0 podman[221353]: 2026-01-21 23:57:15.688728096 +0000 UTC m=+0.056005319 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:57:15 compute-0 nova_compute[182935]: 2026-01-21 23:57:15.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:16 compute-0 nova_compute[182935]: 2026-01-21 23:57:16.012 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:19 compute-0 nova_compute[182935]: 2026-01-21 23:57:19.516 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:21 compute-0 nova_compute[182935]: 2026-01-21 23:57:21.055 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:22 compute-0 podman[221372]: 2026-01-21 23:57:22.688280459 +0000 UTC m=+0.062077153 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 21 23:57:22 compute-0 podman[221373]: 2026-01-21 23:57:22.692637763 +0000 UTC m=+0.061257604 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:57:24 compute-0 nova_compute[182935]: 2026-01-21 23:57:24.517 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:24 compute-0 nova_compute[182935]: 2026-01-21 23:57:24.690 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:24 compute-0 nova_compute[182935]: 2026-01-21 23:57:24.691 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:24 compute-0 nova_compute[182935]: 2026-01-21 23:57:24.727 182939 DEBUG nova.compute.manager [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:57:24 compute-0 nova_compute[182935]: 2026-01-21 23:57:24.862 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:24 compute-0 nova_compute[182935]: 2026-01-21 23:57:24.863 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:24 compute-0 nova_compute[182935]: 2026-01-21 23:57:24.870 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:57:24 compute-0 nova_compute[182935]: 2026-01-21 23:57:24.870 182939 INFO nova.compute.claims [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.049 182939 DEBUG nova.compute.provider_tree [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.071 182939 DEBUG nova.scheduler.client.report [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.095 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.096 182939 DEBUG nova.compute.manager [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.206 182939 DEBUG nova.compute.manager [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.207 182939 DEBUG nova.network.neutron [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.242 182939 INFO nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.273 182939 DEBUG nova.compute.manager [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.416 182939 DEBUG nova.compute.manager [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.418 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.418 182939 INFO nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Creating image(s)
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.419 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.420 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.421 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.442 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.522 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.523 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.524 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.546 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.575 182939 DEBUG nova.policy [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.609 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.611 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.667 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.669 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.670 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.731 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.732 182939 DEBUG nova.virt.disk.api [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Checking if we can resize image /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.733 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.798 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.800 182939 DEBUG nova.virt.disk.api [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Cannot resize image /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.800 182939 DEBUG nova.objects.instance [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'migration_context' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.825 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.826 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Ensure instance console log exists: /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.826 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.827 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:25 compute-0 nova_compute[182935]: 2026-01-21 23:57:25.827 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:26 compute-0 nova_compute[182935]: 2026-01-21 23:57:26.057 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:26 compute-0 nova_compute[182935]: 2026-01-21 23:57:26.432 182939 DEBUG nova.network.neutron [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Successfully created port: bce17837-9218-4b02-868d-09dba821ce49 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:57:26 compute-0 nova_compute[182935]: 2026-01-21 23:57:26.807 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:26 compute-0 nova_compute[182935]: 2026-01-21 23:57:26.831 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:26 compute-0 nova_compute[182935]: 2026-01-21 23:57:26.832 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:57:26 compute-0 nova_compute[182935]: 2026-01-21 23:57:26.832 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:57:26 compute-0 nova_compute[182935]: 2026-01-21 23:57:26.846 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 21 23:57:26 compute-0 nova_compute[182935]: 2026-01-21 23:57:26.846 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 21 23:57:27 compute-0 nova_compute[182935]: 2026-01-21 23:57:27.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:27 compute-0 nova_compute[182935]: 2026-01-21 23:57:27.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:27 compute-0 nova_compute[182935]: 2026-01-21 23:57:27.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:27 compute-0 nova_compute[182935]: 2026-01-21 23:57:27.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:57:28 compute-0 nova_compute[182935]: 2026-01-21 23:57:28.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:28 compute-0 nova_compute[182935]: 2026-01-21 23:57:28.829 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:28 compute-0 nova_compute[182935]: 2026-01-21 23:57:28.830 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:28 compute-0 nova_compute[182935]: 2026-01-21 23:57:28.830 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:28 compute-0 nova_compute[182935]: 2026-01-21 23:57:28.831 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.051 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.052 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5735MB free_disk=73.27122116088867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.052 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.053 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.119 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance ada4a724-2307-431d-8c29-075bfd90b43e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.119 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.119 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.184 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.206 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.231 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.232 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.232 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.233 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 21 23:57:29 compute-0 nova_compute[182935]: 2026-01-21 23:57:29.520 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:30 compute-0 nova_compute[182935]: 2026-01-21 23:57:30.243 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:30 compute-0 nova_compute[182935]: 2026-01-21 23:57:30.919 182939 DEBUG nova.network.neutron [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Successfully updated port: bce17837-9218-4b02-868d-09dba821ce49 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:57:30 compute-0 nova_compute[182935]: 2026-01-21 23:57:30.934 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:57:30 compute-0 nova_compute[182935]: 2026-01-21 23:57:30.935 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquired lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:57:30 compute-0 nova_compute[182935]: 2026-01-21 23:57:30.935 182939 DEBUG nova.network.neutron [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:57:31 compute-0 nova_compute[182935]: 2026-01-21 23:57:31.059 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:31 compute-0 nova_compute[182935]: 2026-01-21 23:57:31.237 182939 DEBUG nova.network.neutron [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:57:31 compute-0 nova_compute[182935]: 2026-01-21 23:57:31.598 182939 DEBUG nova.compute.manager [req-c4ac6cd0-b76f-4c2d-b948-9d53a99dedab req-3039c183-508c-4c31-997c-ea06a6a10eb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-changed-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:57:31 compute-0 nova_compute[182935]: 2026-01-21 23:57:31.599 182939 DEBUG nova.compute.manager [req-c4ac6cd0-b76f-4c2d-b948-9d53a99dedab req-3039c183-508c-4c31-997c-ea06a6a10eb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Refreshing instance network info cache due to event network-changed-bce17837-9218-4b02-868d-09dba821ce49. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:57:31 compute-0 nova_compute[182935]: 2026-01-21 23:57:31.600 182939 DEBUG oslo_concurrency.lockutils [req-c4ac6cd0-b76f-4c2d-b948-9d53a99dedab req-3039c183-508c-4c31-997c-ea06a6a10eb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:57:31 compute-0 nova_compute[182935]: 2026-01-21 23:57:31.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:32 compute-0 nova_compute[182935]: 2026-01-21 23:57:32.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.474 182939 DEBUG nova.network.neutron [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Updating instance_info_cache with network_info: [{"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.521 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Releasing lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.522 182939 DEBUG nova.compute.manager [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Instance network_info: |[{"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.522 182939 DEBUG oslo_concurrency.lockutils [req-c4ac6cd0-b76f-4c2d-b948-9d53a99dedab req-3039c183-508c-4c31-997c-ea06a6a10eb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.522 182939 DEBUG nova.network.neutron [req-c4ac6cd0-b76f-4c2d-b948-9d53a99dedab req-3039c183-508c-4c31-997c-ea06a6a10eb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Refreshing network info cache for port bce17837-9218-4b02-868d-09dba821ce49 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.525 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Start _get_guest_xml network_info=[{"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.530 182939 WARNING nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.536 182939 DEBUG nova.virt.libvirt.host [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.537 182939 DEBUG nova.virt.libvirt.host [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.548 182939 DEBUG nova.virt.libvirt.host [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.549 182939 DEBUG nova.virt.libvirt.host [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.552 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.552 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.553 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.553 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.553 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.554 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.554 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.554 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.554 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.555 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.555 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.555 182939 DEBUG nova.virt.hardware [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.559 182939 DEBUG nova.virt.libvirt.vif [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:57:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1788045668',display_name='tempest-ServerStableDeviceRescueTest-server-1788045668',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1788045668',id=72,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-piy4urn6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_user_name='tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:57:25Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=ada4a724-2307-431d-8c29-075bfd90b43e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.559 182939 DEBUG nova.network.os_vif_util [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.560 182939 DEBUG nova.network.os_vif_util [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:09:d8,bridge_name='br-int',has_traffic_filtering=True,id=bce17837-9218-4b02-868d-09dba821ce49,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbce17837-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.561 182939 DEBUG nova.objects.instance [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'pci_devices' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.601 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:57:33 compute-0 nova_compute[182935]:   <uuid>ada4a724-2307-431d-8c29-075bfd90b43e</uuid>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   <name>instance-00000048</name>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-1788045668</nova:name>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:57:33</nova:creationTime>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:57:33 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:57:33 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:57:33 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:57:33 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:57:33 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:57:33 compute-0 nova_compute[182935]:         <nova:user uuid="55710edfd4b24e368807c8b5087ec91c">tempest-ServerStableDeviceRescueTest-1256721315-project-member</nova:user>
Jan 21 23:57:33 compute-0 nova_compute[182935]:         <nova:project uuid="011e84f966444a668bd6c0f5674f551f">tempest-ServerStableDeviceRescueTest-1256721315</nova:project>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:57:33 compute-0 nova_compute[182935]:         <nova:port uuid="bce17837-9218-4b02-868d-09dba821ce49">
Jan 21 23:57:33 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <system>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <entry name="serial">ada4a724-2307-431d-8c29-075bfd90b43e</entry>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <entry name="uuid">ada4a724-2307-431d-8c29-075bfd90b43e</entry>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     </system>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   <os>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   </os>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   <features>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   </features>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.config"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:7e:09:d8"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <target dev="tapbce17837-92"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/console.log" append="off"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <video>
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     </video>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:57:33 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:57:33 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:57:33 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:57:33 compute-0 nova_compute[182935]: </domain>
Jan 21 23:57:33 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.603 182939 DEBUG nova.compute.manager [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Preparing to wait for external event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.603 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.604 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.604 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.604 182939 DEBUG nova.virt.libvirt.vif [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:57:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1788045668',display_name='tempest-ServerStableDeviceRescueTest-server-1788045668',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1788045668',id=72,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-piy4urn6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_user_name='tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:57:25Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=ada4a724-2307-431d-8c29-075bfd90b43e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.605 182939 DEBUG nova.network.os_vif_util [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.605 182939 DEBUG nova.network.os_vif_util [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:09:d8,bridge_name='br-int',has_traffic_filtering=True,id=bce17837-9218-4b02-868d-09dba821ce49,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbce17837-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.606 182939 DEBUG os_vif [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:09:d8,bridge_name='br-int',has_traffic_filtering=True,id=bce17837-9218-4b02-868d-09dba821ce49,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbce17837-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.606 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.607 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.607 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.612 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.612 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbce17837-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.613 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbce17837-92, col_values=(('external_ids', {'iface-id': 'bce17837-9218-4b02-868d-09dba821ce49', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7e:09:d8', 'vm-uuid': 'ada4a724-2307-431d-8c29-075bfd90b43e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.616 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:33 compute-0 NetworkManager[55139]: <info>  [1769039853.6180] manager: (tapbce17837-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.619 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.628 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.629 182939 INFO os_vif [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:09:d8,bridge_name='br-int',has_traffic_filtering=True,id=bce17837-9218-4b02-868d-09dba821ce49,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbce17837-92')
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.694 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.695 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.695 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No VIF found with MAC fa:16:3e:7e:09:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:57:33 compute-0 nova_compute[182935]: 2026-01-21 23:57:33.696 182939 INFO nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Using config drive
Jan 21 23:57:34 compute-0 nova_compute[182935]: 2026-01-21 23:57:34.351 182939 INFO nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Creating config drive at /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.config
Jan 21 23:57:34 compute-0 nova_compute[182935]: 2026-01-21 23:57:34.361 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphag9o0ki execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:34 compute-0 nova_compute[182935]: 2026-01-21 23:57:34.494 182939 DEBUG oslo_concurrency.processutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphag9o0ki" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:34 compute-0 kernel: tapbce17837-92: entered promiscuous mode
Jan 21 23:57:34 compute-0 NetworkManager[55139]: <info>  [1769039854.5757] manager: (tapbce17837-92): new Tun device (/org/freedesktop/NetworkManager/Devices/111)
Jan 21 23:57:34 compute-0 ovn_controller[95047]: 2026-01-21T23:57:34Z|00241|binding|INFO|Claiming lport bce17837-9218-4b02-868d-09dba821ce49 for this chassis.
Jan 21 23:57:34 compute-0 nova_compute[182935]: 2026-01-21 23:57:34.576 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:34 compute-0 ovn_controller[95047]: 2026-01-21T23:57:34Z|00242|binding|INFO|bce17837-9218-4b02-868d-09dba821ce49: Claiming fa:16:3e:7e:09:d8 10.100.0.11
Jan 21 23:57:34 compute-0 nova_compute[182935]: 2026-01-21 23:57:34.579 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:34 compute-0 nova_compute[182935]: 2026-01-21 23:57:34.585 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.600 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:09:d8 10.100.0.11'], port_security=['fa:16:3e:7e:09:d8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bce17837-9218-4b02-868d-09dba821ce49) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.601 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bce17837-9218-4b02-868d-09dba821ce49 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a bound to our chassis
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.603 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:57:34 compute-0 systemd-udevd[221447]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:57:34 compute-0 NetworkManager[55139]: <info>  [1769039854.6299] device (tapbce17837-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:57:34 compute-0 NetworkManager[55139]: <info>  [1769039854.6307] device (tapbce17837-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:57:34 compute-0 systemd-machined[154182]: New machine qemu-34-instance-00000048.
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.630 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bf8cda7f-0e4c-4c6c-a8a3-9aed29c92cfe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.631 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58cd83db-d1 in ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.635 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58cd83db-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.636 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[278f43c7-5992-4a00-9f17-686272c76b6d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.637 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9227aa-c292-445a-9acd-0dfe65a1980a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 nova_compute[182935]: 2026-01-21 23:57:34.655 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:34 compute-0 ovn_controller[95047]: 2026-01-21T23:57:34Z|00243|binding|INFO|Setting lport bce17837-9218-4b02-868d-09dba821ce49 ovn-installed in OVS
Jan 21 23:57:34 compute-0 ovn_controller[95047]: 2026-01-21T23:57:34Z|00244|binding|INFO|Setting lport bce17837-9218-4b02-868d-09dba821ce49 up in Southbound
Jan 21 23:57:34 compute-0 nova_compute[182935]: 2026-01-21 23:57:34.658 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.659 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[61e1d9b7-77de-479a-96f0-aaaabe9d99f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 systemd[1]: Started Virtual Machine qemu-34-instance-00000048.
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.693 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ae12ad77-36c0-481c-a822-b13c7ddfad8f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.732 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[06c593af-e076-41a4-b4eb-af7b6e26dbc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.740 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[33a11bfe-daec-4c05-82d3-698b2baa2e25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 NetworkManager[55139]: <info>  [1769039854.7420] manager: (tap58cd83db-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/112)
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.781 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f062fe-7778-4fac-aab5-c9fecf7f00a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.785 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[aaccc0eb-29ff-4802-abef-b017d0807b8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 NetworkManager[55139]: <info>  [1769039854.8150] device (tap58cd83db-d0): carrier: link connected
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.818 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[66073f1f-a4bb-40b8-bf03-deda7bfd8e6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.840 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6979b534-bb1d-4e36-9ba5-11ab0127b33c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435755, 'reachable_time': 42620, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221500, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.867 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[00d30688-de05-4511-884b-069ba9bf74df]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:9a20'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435755, 'tstamp': 435755}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221507, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.887 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[694f9cca-555f-4345-998c-a791b6bec6fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435755, 'reachable_time': 42620, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221522, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 podman[221480]: 2026-01-21 23:57:34.905871387 +0000 UTC m=+0.101353422 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 21 23:57:34 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:34.934 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[899ccdc3-5e0c-497a-b050-ef8c9f127421]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:34 compute-0 podman[221470]: 2026-01-21 23:57:34.946882866 +0000 UTC m=+0.152911713 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.015 182939 DEBUG nova.compute.manager [req-1dc644f8-7baf-4476-832e-493f4bb4eacd req-04a6e6f2-2638-40ca-8838-1c34a2ca8159 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.016 182939 DEBUG oslo_concurrency.lockutils [req-1dc644f8-7baf-4476-832e-493f4bb4eacd req-04a6e6f2-2638-40ca-8838-1c34a2ca8159 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.017 182939 DEBUG oslo_concurrency.lockutils [req-1dc644f8-7baf-4476-832e-493f4bb4eacd req-04a6e6f2-2638-40ca-8838-1c34a2ca8159 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.017 182939 DEBUG oslo_concurrency.lockutils [req-1dc644f8-7baf-4476-832e-493f4bb4eacd req-04a6e6f2-2638-40ca-8838-1c34a2ca8159 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.017 182939 DEBUG nova.compute.manager [req-1dc644f8-7baf-4476-832e-493f4bb4eacd req-04a6e6f2-2638-40ca-8838-1c34a2ca8159 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Processing event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:35.019 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1a2445eb-327e-4536-afba-a9a713d2b159]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:35.021 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:35.021 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:35.021 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:35 compute-0 kernel: tap58cd83db-d0: entered promiscuous mode
Jan 21 23:57:35 compute-0 NetworkManager[55139]: <info>  [1769039855.0241] manager: (tap58cd83db-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.023 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:35.028 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.029 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:35 compute-0 ovn_controller[95047]: 2026-01-21T23:57:35Z|00245|binding|INFO|Releasing lport 2d113249-07d3-443f-9b57-5f5a422d1c98 from this chassis (sb_readonly=0)
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:35.032 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:35.033 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dc8e0bb0-6162-492e-b7c4-2592d4486af3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:35.034 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:57:35 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:35.035 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'env', 'PROCESS_TAG=haproxy-58cd83db-dcb3-409c-a108-07601ce5f67a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58cd83db-dcb3-409c-a108-07601ce5f67a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.096 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.224 182939 DEBUG nova.compute.manager [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.226 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039855.2237713, ada4a724-2307-431d-8c29-075bfd90b43e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.227 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] VM Started (Lifecycle Event)
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.237 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.243 182939 INFO nova.virt.libvirt.driver [-] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Instance spawned successfully.
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.244 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.253 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.257 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.289 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.290 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039855.225452, ada4a724-2307-431d-8c29-075bfd90b43e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.290 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] VM Paused (Lifecycle Event)
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.296 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.297 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.297 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.298 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.298 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.299 182939 DEBUG nova.virt.libvirt.driver [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.331 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.336 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039855.2352295, ada4a724-2307-431d-8c29-075bfd90b43e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.336 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] VM Resumed (Lifecycle Event)
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.368 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.372 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.400 182939 INFO nova.compute.manager [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Took 9.98 seconds to spawn the instance on the hypervisor.
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.401 182939 DEBUG nova.compute.manager [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.402 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.490 182939 INFO nova.compute.manager [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Took 10.69 seconds to build instance.
Jan 21 23:57:35 compute-0 podman[221566]: 2026-01-21 23:57:35.49884624 +0000 UTC m=+0.070387493 container create fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.525 182939 DEBUG oslo_concurrency.lockutils [None req-4974dae0-9c0a-43d2-b4a5-2029b89812c5 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.833s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:35 compute-0 systemd[1]: Started libpod-conmon-fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065.scope.
Jan 21 23:57:35 compute-0 podman[221566]: 2026-01-21 23:57:35.460434622 +0000 UTC m=+0.031975965 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.579 182939 DEBUG nova.network.neutron [req-c4ac6cd0-b76f-4c2d-b948-9d53a99dedab req-3039c183-508c-4c31-997c-ea06a6a10eb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Updated VIF entry in instance network info cache for port bce17837-9218-4b02-868d-09dba821ce49. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.580 182939 DEBUG nova.network.neutron [req-c4ac6cd0-b76f-4c2d-b948-9d53a99dedab req-3039c183-508c-4c31-997c-ea06a6a10eb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Updating instance_info_cache with network_info: [{"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:57:35 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:57:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d836145ffae4476033596152274c8f83e5957266e0ba115adbc3e1f08a35d581/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:57:35 compute-0 nova_compute[182935]: 2026-01-21 23:57:35.607 182939 DEBUG oslo_concurrency.lockutils [req-c4ac6cd0-b76f-4c2d-b948-9d53a99dedab req-3039c183-508c-4c31-997c-ea06a6a10eb1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:57:35 compute-0 podman[221566]: 2026-01-21 23:57:35.611524321 +0000 UTC m=+0.183065584 container init fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 21 23:57:35 compute-0 podman[221566]: 2026-01-21 23:57:35.619897991 +0000 UTC m=+0.191439244 container start fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:57:35 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221581]: [NOTICE]   (221585) : New worker (221587) forked
Jan 21 23:57:35 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221581]: [NOTICE]   (221585) : Loading success.
Jan 21 23:57:36 compute-0 nova_compute[182935]: 2026-01-21 23:57:36.061 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:37 compute-0 nova_compute[182935]: 2026-01-21 23:57:37.579 182939 DEBUG nova.compute.manager [req-87977812-2e62-4b69-bea6-d8c799f17308 req-24c80389-eacb-4dc1-8d7d-1e184e351897 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:57:37 compute-0 nova_compute[182935]: 2026-01-21 23:57:37.580 182939 DEBUG oslo_concurrency.lockutils [req-87977812-2e62-4b69-bea6-d8c799f17308 req-24c80389-eacb-4dc1-8d7d-1e184e351897 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:37 compute-0 nova_compute[182935]: 2026-01-21 23:57:37.581 182939 DEBUG oslo_concurrency.lockutils [req-87977812-2e62-4b69-bea6-d8c799f17308 req-24c80389-eacb-4dc1-8d7d-1e184e351897 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:37 compute-0 nova_compute[182935]: 2026-01-21 23:57:37.581 182939 DEBUG oslo_concurrency.lockutils [req-87977812-2e62-4b69-bea6-d8c799f17308 req-24c80389-eacb-4dc1-8d7d-1e184e351897 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:37 compute-0 nova_compute[182935]: 2026-01-21 23:57:37.582 182939 DEBUG nova.compute.manager [req-87977812-2e62-4b69-bea6-d8c799f17308 req-24c80389-eacb-4dc1-8d7d-1e184e351897 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] No waiting events found dispatching network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:57:37 compute-0 nova_compute[182935]: 2026-01-21 23:57:37.583 182939 WARNING nova.compute.manager [req-87977812-2e62-4b69-bea6-d8c799f17308 req-24c80389-eacb-4dc1-8d7d-1e184e351897 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received unexpected event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 for instance with vm_state active and task_state None.
Jan 21 23:57:38 compute-0 nova_compute[182935]: 2026-01-21 23:57:38.444 182939 DEBUG nova.compute.manager [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:57:38 compute-0 nova_compute[182935]: 2026-01-21 23:57:38.522 182939 INFO nova.compute.manager [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] instance snapshotting
Jan 21 23:57:38 compute-0 nova_compute[182935]: 2026-01-21 23:57:38.651 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:38 compute-0 nova_compute[182935]: 2026-01-21 23:57:38.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:38 compute-0 nova_compute[182935]: 2026-01-21 23:57:38.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:38 compute-0 nova_compute[182935]: 2026-01-21 23:57:38.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 21 23:57:38 compute-0 nova_compute[182935]: 2026-01-21 23:57:38.825 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.054 182939 INFO nova.virt.libvirt.driver [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Beginning live snapshot process
Jan 21 23:57:39 compute-0 virtqemud[182477]: invalid argument: disk vda does not have an active block job
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.331 182939 DEBUG oslo_concurrency.processutils [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.438 182939 DEBUG oslo_concurrency.processutils [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json -f qcow2" returned: 0 in 0.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.440 182939 DEBUG oslo_concurrency.processutils [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.501 182939 DEBUG oslo_concurrency.processutils [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json -f qcow2" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.530 182939 DEBUG oslo_concurrency.processutils [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.605 182939 DEBUG oslo_concurrency.processutils [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.608 182939 DEBUG oslo_concurrency.processutils [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpbblsb9c8/917a345bf245404fbb189d66c552ca64.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.658 182939 DEBUG oslo_concurrency.processutils [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpbblsb9c8/917a345bf245404fbb189d66c552ca64.delta 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.659 182939 INFO nova.virt.libvirt.driver [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.730 182939 DEBUG nova.virt.libvirt.guest [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.735 182939 INFO nova.virt.libvirt.driver [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.792 182939 DEBUG nova.privsep.utils [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.793 182939 DEBUG oslo_concurrency.processutils [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpbblsb9c8/917a345bf245404fbb189d66c552ca64.delta /var/lib/nova/instances/snapshots/tmpbblsb9c8/917a345bf245404fbb189d66c552ca64 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.994 182939 DEBUG oslo_concurrency.processutils [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpbblsb9c8/917a345bf245404fbb189d66c552ca64.delta /var/lib/nova/instances/snapshots/tmpbblsb9c8/917a345bf245404fbb189d66c552ca64" returned: 0 in 0.201s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:57:39 compute-0 nova_compute[182935]: 2026-01-21 23:57:39.997 182939 INFO nova.virt.libvirt.driver [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Snapshot extracted, beginning image upload
Jan 21 23:57:41 compute-0 nova_compute[182935]: 2026-01-21 23:57:41.063 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:42 compute-0 podman[221625]: 2026-01-21 23:57:42.727266989 +0000 UTC m=+0.086236990 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:57:42 compute-0 nova_compute[182935]: 2026-01-21 23:57:42.769 182939 INFO nova.virt.libvirt.driver [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Snapshot image upload complete
Jan 21 23:57:42 compute-0 nova_compute[182935]: 2026-01-21 23:57:42.772 182939 INFO nova.compute.manager [None req-0be36a47-7095-4f49-89d0-7d5c4fd7265b 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Took 4.24 seconds to snapshot the instance on the hypervisor.
Jan 21 23:57:43 compute-0 nova_compute[182935]: 2026-01-21 23:57:43.654 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:44 compute-0 nova_compute[182935]: 2026-01-21 23:57:44.097 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:57:44 compute-0 nova_compute[182935]: 2026-01-21 23:57:44.123 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Triggering sync for uuid ada4a724-2307-431d-8c29-075bfd90b43e _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 21 23:57:44 compute-0 nova_compute[182935]: 2026-01-21 23:57:44.124 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:57:44 compute-0 nova_compute[182935]: 2026-01-21 23:57:44.124 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "ada4a724-2307-431d-8c29-075bfd90b43e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:57:44 compute-0 nova_compute[182935]: 2026-01-21 23:57:44.161 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "ada4a724-2307-431d-8c29-075bfd90b43e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:57:45 compute-0 nova_compute[182935]: 2026-01-21 23:57:45.238 182939 INFO nova.compute.manager [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Rescuing
Jan 21 23:57:45 compute-0 nova_compute[182935]: 2026-01-21 23:57:45.239 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:57:45 compute-0 nova_compute[182935]: 2026-01-21 23:57:45.239 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquired lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:57:45 compute-0 nova_compute[182935]: 2026-01-21 23:57:45.240 182939 DEBUG nova.network.neutron [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:57:46 compute-0 nova_compute[182935]: 2026-01-21 23:57:46.065 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:46 compute-0 sshd-session[221649]: Invalid user git from 188.166.69.60 port 48360
Jan 21 23:57:46 compute-0 podman[221651]: 2026-01-21 23:57:46.642483184 +0000 UTC m=+0.057457874 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 21 23:57:46 compute-0 sshd-session[221649]: Connection closed by invalid user git 188.166.69.60 port 48360 [preauth]
Jan 21 23:57:46 compute-0 nova_compute[182935]: 2026-01-21 23:57:46.759 182939 DEBUG nova.network.neutron [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Updating instance_info_cache with network_info: [{"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:57:46 compute-0 nova_compute[182935]: 2026-01-21 23:57:46.782 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Releasing lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:57:47 compute-0 nova_compute[182935]: 2026-01-21 23:57:47.058 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:57:48 compute-0 nova_compute[182935]: 2026-01-21 23:57:48.657 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:49 compute-0 ovn_controller[95047]: 2026-01-21T23:57:49Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7e:09:d8 10.100.0.11
Jan 21 23:57:49 compute-0 ovn_controller[95047]: 2026-01-21T23:57:49Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7e:09:d8 10.100.0.11
Jan 21 23:57:51 compute-0 nova_compute[182935]: 2026-01-21 23:57:51.067 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:53 compute-0 nova_compute[182935]: 2026-01-21 23:57:53.659 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:53 compute-0 podman[221682]: 2026-01-21 23:57:53.689579773 +0000 UTC m=+0.058158601 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Jan 21 23:57:53 compute-0 podman[221683]: 2026-01-21 23:57:53.702481921 +0000 UTC m=+0.062734670 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Jan 21 23:57:56 compute-0 nova_compute[182935]: 2026-01-21 23:57:56.069 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:57 compute-0 nova_compute[182935]: 2026-01-21 23:57:57.106 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 21 23:57:58 compute-0 nova_compute[182935]: 2026-01-21 23:57:58.661 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-0 kernel: tapbce17837-92 (unregistering): left promiscuous mode
Jan 21 23:57:59 compute-0 NetworkManager[55139]: <info>  [1769039879.2882] device (tapbce17837-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:57:59 compute-0 ovn_controller[95047]: 2026-01-21T23:57:59Z|00246|binding|INFO|Releasing lport bce17837-9218-4b02-868d-09dba821ce49 from this chassis (sb_readonly=0)
Jan 21 23:57:59 compute-0 ovn_controller[95047]: 2026-01-21T23:57:59Z|00247|binding|INFO|Setting lport bce17837-9218-4b02-868d-09dba821ce49 down in Southbound
Jan 21 23:57:59 compute-0 ovn_controller[95047]: 2026-01-21T23:57:59Z|00248|binding|INFO|Removing iface tapbce17837-92 ovn-installed in OVS
Jan 21 23:57:59 compute-0 nova_compute[182935]: 2026-01-21 23:57:59.342 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-0 nova_compute[182935]: 2026-01-21 23:57:59.345 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.352 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:09:d8 10.100.0.11'], port_security=['fa:16:3e:7e:09:d8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bce17837-9218-4b02-868d-09dba821ce49) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.354 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bce17837-9218-4b02-868d-09dba821ce49 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.355 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58cd83db-dcb3-409c-a108-07601ce5f67a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:57:59 compute-0 nova_compute[182935]: 2026-01-21 23:57:59.357 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.356 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[67b750c2-fd84-4349-8691-4ec5e40b492c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.358 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace which is not needed anymore
Jan 21 23:57:59 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000048.scope: Deactivated successfully.
Jan 21 23:57:59 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000048.scope: Consumed 14.301s CPU time.
Jan 21 23:57:59 compute-0 systemd-machined[154182]: Machine qemu-34-instance-00000048 terminated.
Jan 21 23:57:59 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221581]: [NOTICE]   (221585) : haproxy version is 2.8.14-c23fe91
Jan 21 23:57:59 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221581]: [NOTICE]   (221585) : path to executable is /usr/sbin/haproxy
Jan 21 23:57:59 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221581]: [WARNING]  (221585) : Exiting Master process...
Jan 21 23:57:59 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221581]: [ALERT]    (221585) : Current worker (221587) exited with code 143 (Terminated)
Jan 21 23:57:59 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221581]: [WARNING]  (221585) : All workers exited. Exiting... (0)
Jan 21 23:57:59 compute-0 systemd[1]: libpod-fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065.scope: Deactivated successfully.
Jan 21 23:57:59 compute-0 podman[221743]: 2026-01-21 23:57:59.52915679 +0000 UTC m=+0.060383513 container died fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 21 23:57:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065-userdata-shm.mount: Deactivated successfully.
Jan 21 23:57:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-d836145ffae4476033596152274c8f83e5957266e0ba115adbc3e1f08a35d581-merged.mount: Deactivated successfully.
Jan 21 23:57:59 compute-0 podman[221743]: 2026-01-21 23:57:59.56894387 +0000 UTC m=+0.100170563 container cleanup fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 23:57:59 compute-0 systemd[1]: libpod-conmon-fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065.scope: Deactivated successfully.
Jan 21 23:57:59 compute-0 podman[221773]: 2026-01-21 23:57:59.641784 +0000 UTC m=+0.043773756 container remove fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.648 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3138a763-900f-486b-9433-d41dba9afcbe]: (4, ('Wed Jan 21 11:57:59 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065)\nfd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065\nWed Jan 21 11:57:59 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (fd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065)\nfd52ef320407890755bbb660e1a4318d8f7e58efc515ea9cd175b2290e924065\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.651 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bf56a837-e4c9-4b40-8fce-f80b2e6320cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.652 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:57:59 compute-0 nova_compute[182935]: 2026-01-21 23:57:59.654 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-0 kernel: tap58cd83db-d0: left promiscuous mode
Jan 21 23:57:59 compute-0 nova_compute[182935]: 2026-01-21 23:57:59.680 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.684 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1ff5dc4e-2d8c-493f-822a-33ef1659897d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.699 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[564123ce-dc39-4d83-8145-dd267b9b9bc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.701 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b65255b8-f1f8-46ae-87c9-69fa6c54265b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.718 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fcec32bc-c246-4619-bd8f-fd59f3c98494]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435746, 'reachable_time': 25377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221803, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:57:59 compute-0 systemd[1]: run-netns-ovnmeta\x2d58cd83db\x2ddcb3\x2d409c\x2da108\x2d07601ce5f67a.mount: Deactivated successfully.
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.724 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:57:59 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:57:59.724 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[c41bd8e4-ba52-4ded-adfc-a3466eb96a04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.121 182939 INFO nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Instance shutdown successfully after 13 seconds.
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.128 182939 INFO nova.virt.libvirt.driver [-] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Instance destroyed successfully.
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.129 182939 DEBUG nova.objects.instance [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'numa_topology' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.161 182939 INFO nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Attempting a stable device rescue
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.350 182939 DEBUG nova.compute.manager [req-31c32357-8ad2-4de4-93e5-dd104e4aa5de req-31d2512f-acc7-488f-b0d6-a1b660724d4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-unplugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.351 182939 DEBUG oslo_concurrency.lockutils [req-31c32357-8ad2-4de4-93e5-dd104e4aa5de req-31d2512f-acc7-488f-b0d6-a1b660724d4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.352 182939 DEBUG oslo_concurrency.lockutils [req-31c32357-8ad2-4de4-93e5-dd104e4aa5de req-31d2512f-acc7-488f-b0d6-a1b660724d4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.353 182939 DEBUG oslo_concurrency.lockutils [req-31c32357-8ad2-4de4-93e5-dd104e4aa5de req-31d2512f-acc7-488f-b0d6-a1b660724d4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.353 182939 DEBUG nova.compute.manager [req-31c32357-8ad2-4de4-93e5-dd104e4aa5de req-31d2512f-acc7-488f-b0d6-a1b660724d4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] No waiting events found dispatching network-vif-unplugged-bce17837-9218-4b02-868d-09dba821ce49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.354 182939 WARNING nova.compute.manager [req-31c32357-8ad2-4de4-93e5-dd104e4aa5de req-31d2512f-acc7-488f-b0d6-a1b660724d4e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received unexpected event network-vif-unplugged-bce17837-9218-4b02-868d-09dba821ce49 for instance with vm_state active and task_state rescuing.
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.485 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.494 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.495 182939 INFO nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Creating image(s)
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.497 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.498 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.499 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.500 182939 DEBUG nova.objects.instance [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'trusted_certs' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.524 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "bf25a60a05476d5b07338b195b9b51af2b6b007b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:00 compute-0 nova_compute[182935]: 2026-01-21 23:58:00.525 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "bf25a60a05476d5b07338b195b9b51af2b6b007b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:01 compute-0 nova_compute[182935]: 2026-01-21 23:58:01.071 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:01 compute-0 nova_compute[182935]: 2026-01-21 23:58:01.881 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:01 compute-0 nova_compute[182935]: 2026-01-21 23:58:01.946 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b.part --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:01 compute-0 nova_compute[182935]: 2026-01-21 23:58:01.947 182939 DEBUG nova.virt.images [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] 173cedb5-ca53-47c8-8979-62749d7af470 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 21 23:58:01 compute-0 nova_compute[182935]: 2026-01-21 23:58:01.948 182939 DEBUG nova.privsep.utils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:58:01 compute-0 nova_compute[182935]: 2026-01-21 23:58:01.948 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b.part /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.149 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b.part /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b.converted" returned: 0 in 0.201s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.159 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.240 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b.converted --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.243 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "bf25a60a05476d5b07338b195b9b51af2b6b007b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.271 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "bf25a60a05476d5b07338b195b9b51af2b6b007b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.272 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "bf25a60a05476d5b07338b195b9b51af2b6b007b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.295 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.364 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.366 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b,backing_fmt=raw /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.402 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b,backing_fmt=raw /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.rescue" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.404 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "bf25a60a05476d5b07338b195b9b51af2b6b007b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.404 182939 DEBUG nova.objects.instance [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'migration_context' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.426 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.428 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Start _get_guest_xml network_info=[{"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "vif_mac": "fa:16:3e:7e:09:d8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '173cedb5-ca53-47c8-8979-62749d7af470', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.429 182939 DEBUG nova.objects.instance [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'resources' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.456 182939 WARNING nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.462 182939 DEBUG nova.virt.libvirt.host [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.463 182939 DEBUG nova.virt.libvirt.host [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.468 182939 DEBUG nova.compute.manager [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.468 182939 DEBUG oslo_concurrency.lockutils [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.468 182939 DEBUG oslo_concurrency.lockutils [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.469 182939 DEBUG oslo_concurrency.lockutils [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.469 182939 DEBUG nova.compute.manager [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] No waiting events found dispatching network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.469 182939 WARNING nova.compute.manager [req-7453fe32-1149-43c1-a4d4-7518eef0a4a5 req-9fab2623-3ee1-437e-b414-d38d49d08c95 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received unexpected event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 for instance with vm_state active and task_state rescuing.
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.470 182939 DEBUG nova.virt.libvirt.host [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.471 182939 DEBUG nova.virt.libvirt.host [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.472 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.472 182939 DEBUG nova.virt.hardware [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.473 182939 DEBUG nova.virt.hardware [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.473 182939 DEBUG nova.virt.hardware [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.473 182939 DEBUG nova.virt.hardware [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.473 182939 DEBUG nova.virt.hardware [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.473 182939 DEBUG nova.virt.hardware [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.474 182939 DEBUG nova.virt.hardware [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.474 182939 DEBUG nova.virt.hardware [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.474 182939 DEBUG nova.virt.hardware [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.474 182939 DEBUG nova.virt.hardware [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.474 182939 DEBUG nova.virt.hardware [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.475 182939 DEBUG nova.objects.instance [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'vcpu_model' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.493 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.590 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.config --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.591 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.592 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.592 182939 DEBUG oslo_concurrency.lockutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.594 182939 DEBUG nova.virt.libvirt.vif [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1788045668',display_name='tempest-ServerStableDeviceRescueTest-server-1788045668',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1788045668',id=72,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:57:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-piy4urn6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_user_name='tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:57:42Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=ada4a724-2307-431d-8c29-075bfd90b43e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "vif_mac": "fa:16:3e:7e:09:d8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.594 182939 DEBUG nova.network.os_vif_util [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "vif_mac": "fa:16:3e:7e:09:d8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.595 182939 DEBUG nova.network.os_vif_util [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7e:09:d8,bridge_name='br-int',has_traffic_filtering=True,id=bce17837-9218-4b02-868d-09dba821ce49,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbce17837-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.596 182939 DEBUG nova.objects.instance [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'pci_devices' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.618 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:58:02 compute-0 nova_compute[182935]:   <uuid>ada4a724-2307-431d-8c29-075bfd90b43e</uuid>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   <name>instance-00000048</name>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-1788045668</nova:name>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:58:02</nova:creationTime>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:58:02 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:58:02 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:58:02 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:58:02 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:58:02 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:58:02 compute-0 nova_compute[182935]:         <nova:user uuid="55710edfd4b24e368807c8b5087ec91c">tempest-ServerStableDeviceRescueTest-1256721315-project-member</nova:user>
Jan 21 23:58:02 compute-0 nova_compute[182935]:         <nova:project uuid="011e84f966444a668bd6c0f5674f551f">tempest-ServerStableDeviceRescueTest-1256721315</nova:project>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:58:02 compute-0 nova_compute[182935]:         <nova:port uuid="bce17837-9218-4b02-868d-09dba821ce49">
Jan 21 23:58:02 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <system>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <entry name="serial">ada4a724-2307-431d-8c29-075bfd90b43e</entry>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <entry name="uuid">ada4a724-2307-431d-8c29-075bfd90b43e</entry>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     </system>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   <os>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   </os>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   <features>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   </features>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.config"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.rescue"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <target dev="sdb" bus="scsi"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <boot order="1"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:7e:09:d8"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <target dev="tapbce17837-92"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/console.log" append="off"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <video>
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     </video>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:58:02 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:58:02 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:58:02 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:58:02 compute-0 nova_compute[182935]: </domain>
Jan 21 23:58:02 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.627 182939 INFO nova.virt.libvirt.driver [-] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Instance destroyed successfully.
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.706 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.707 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.707 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.707 182939 DEBUG nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No VIF found with MAC fa:16:3e:7e:09:d8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.708 182939 INFO nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Using config drive
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.723 182939 DEBUG nova.objects.instance [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'ec2_ids' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:02 compute-0 nova_compute[182935]: 2026-01-21 23:58:02.763 182939 DEBUG nova.objects.instance [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'keypairs' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:03.192 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:03.193 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:03.193 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:03 compute-0 nova_compute[182935]: 2026-01-21 23:58:03.665 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:04 compute-0 nova_compute[182935]: 2026-01-21 23:58:04.712 182939 INFO nova.virt.libvirt.driver [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Creating config drive at /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.config.rescue
Jan 21 23:58:04 compute-0 nova_compute[182935]: 2026-01-21 23:58:04.724 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqwd5dhu0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:04.867 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:04 compute-0 nova_compute[182935]: 2026-01-21 23:58:04.867 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:04.868 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:58:04 compute-0 nova_compute[182935]: 2026-01-21 23:58:04.874 182939 DEBUG oslo_concurrency.processutils [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqwd5dhu0" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:04 compute-0 kernel: tapbce17837-92: entered promiscuous mode
Jan 21 23:58:04 compute-0 NetworkManager[55139]: <info>  [1769039884.9664] manager: (tapbce17837-92): new Tun device (/org/freedesktop/NetworkManager/Devices/114)
Jan 21 23:58:04 compute-0 ovn_controller[95047]: 2026-01-21T23:58:04Z|00249|binding|INFO|Claiming lport bce17837-9218-4b02-868d-09dba821ce49 for this chassis.
Jan 21 23:58:04 compute-0 ovn_controller[95047]: 2026-01-21T23:58:04Z|00250|binding|INFO|bce17837-9218-4b02-868d-09dba821ce49: Claiming fa:16:3e:7e:09:d8 10.100.0.11
Jan 21 23:58:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:04.976 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:09:d8 10.100.0.11'], port_security=['fa:16:3e:7e:09:d8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bce17837-9218-4b02-868d-09dba821ce49) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:04.977 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bce17837-9218-4b02-868d-09dba821ce49 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a bound to our chassis
Jan 21 23:58:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:04.978 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:58:04 compute-0 nova_compute[182935]: 2026-01-21 23:58:04.973 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:04 compute-0 ovn_controller[95047]: 2026-01-21T23:58:04Z|00251|binding|INFO|Setting lport bce17837-9218-4b02-868d-09dba821ce49 ovn-installed in OVS
Jan 21 23:58:04 compute-0 ovn_controller[95047]: 2026-01-21T23:58:04Z|00252|binding|INFO|Setting lport bce17837-9218-4b02-868d-09dba821ce49 up in Southbound
Jan 21 23:58:04 compute-0 nova_compute[182935]: 2026-01-21 23:58:04.981 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:04 compute-0 nova_compute[182935]: 2026-01-21 23:58:04.990 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:04.993 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[84f18082-8a7c-4028-992d-3f38aff426d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:04.993 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58cd83db-d1 in ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:58:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:04.995 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58cd83db-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:58:04 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:04.996 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[876e1561-eadf-4e03-a139-ca8ba02c5b86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:04.996 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[db1ab1df-3a1e-42e2-b1ab-755bafc47bd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.013 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[da27e28a-2416-4652-a5b3-c36dc45993bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 systemd-machined[154182]: New machine qemu-35-instance-00000048.
Jan 21 23:58:05 compute-0 podman[221837]: 2026-01-21 23:58:05.02971472 +0000 UTC m=+0.079391718 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.030 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4f70347b-8944-4c6b-a895-e1a9e1944779]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 systemd[1]: Started Virtual Machine qemu-35-instance-00000048.
Jan 21 23:58:05 compute-0 systemd-udevd[221889]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:58:05 compute-0 NetworkManager[55139]: <info>  [1769039885.0633] device (tapbce17837-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:58:05 compute-0 NetworkManager[55139]: <info>  [1769039885.0642] device (tapbce17837-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.065 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee60616-ca91-4c18-a907-336915ee2f79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 systemd-udevd[221893]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.071 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f4ea86ee-4644-4da2-9e2d-1a4ccd07f5b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 NetworkManager[55139]: <info>  [1769039885.0724] manager: (tap58cd83db-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/115)
Jan 21 23:58:05 compute-0 podman[221854]: 2026-01-21 23:58:05.112382434 +0000 UTC m=+0.112064878 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.117 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8c008615-738b-4fb9-8af2-c0549aab933b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.122 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8e9e10-0264-4e8f-b6c2-46ea11e0b5fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 NetworkManager[55139]: <info>  [1769039885.1559] device (tap58cd83db-d0): carrier: link connected
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.165 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[ea8b498b-75aa-4fc8-a520-f17b65a2015d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.188 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fda6ce20-b25d-4bb0-a6cd-7b9adfe9ec31]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438789, 'reachable_time': 16825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221927, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.211 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0a41dc28-365c-425c-80d4-68b60aad7878]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:9a20'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438789, 'tstamp': 438789}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221928, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.232 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d3dde8b6-566e-4847-b1da-1b202d2ffff3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438789, 'reachable_time': 16825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221929, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.281 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8652a74b-6ed3-48f8-933e-364fa9818431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.360 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8265780c-6b3f-404e-85ef-839d39f192e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.362 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.362 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.363 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:05 compute-0 kernel: tap58cd83db-d0: entered promiscuous mode
Jan 21 23:58:05 compute-0 NetworkManager[55139]: <info>  [1769039885.3661] manager: (tap58cd83db-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.365 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.370 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:05 compute-0 ovn_controller[95047]: 2026-01-21T23:58:05Z|00253|binding|INFO|Releasing lport 2d113249-07d3-443f-9b57-5f5a422d1c98 from this chassis (sb_readonly=0)
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.372 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.373 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.373 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.374 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1909d0b9-107f-49da-80ba-9fa7d1538ce8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.375 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.377 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'env', 'PROCESS_TAG=haproxy-58cd83db-dcb3-409c-a108-07601ce5f67a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58cd83db-dcb3-409c-a108-07601ce5f67a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.383 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.510 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for ada4a724-2307-431d-8c29-075bfd90b43e due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.511 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039885.5101678, ada4a724-2307-431d-8c29-075bfd90b43e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.512 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] VM Resumed (Lifecycle Event)
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.529 182939 DEBUG nova.compute.manager [None req-bd21a073-be58-4159-8198-13d0db0a6d0f 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.539 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.543 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.584 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.585 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039885.5157015, ada4a724-2307-431d-8c29-075bfd90b43e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.585 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] VM Started (Lifecycle Event)
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.599 182939 DEBUG nova.compute.manager [req-a99b4d26-3a58-4023-b175-4a42ae725f78 req-ab1c65be-06d0-4813-a6fe-df0346be6e77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.600 182939 DEBUG oslo_concurrency.lockutils [req-a99b4d26-3a58-4023-b175-4a42ae725f78 req-ab1c65be-06d0-4813-a6fe-df0346be6e77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.600 182939 DEBUG oslo_concurrency.lockutils [req-a99b4d26-3a58-4023-b175-4a42ae725f78 req-ab1c65be-06d0-4813-a6fe-df0346be6e77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.601 182939 DEBUG oslo_concurrency.lockutils [req-a99b4d26-3a58-4023-b175-4a42ae725f78 req-ab1c65be-06d0-4813-a6fe-df0346be6e77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.601 182939 DEBUG nova.compute.manager [req-a99b4d26-3a58-4023-b175-4a42ae725f78 req-ab1c65be-06d0-4813-a6fe-df0346be6e77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] No waiting events found dispatching network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.601 182939 WARNING nova.compute.manager [req-a99b4d26-3a58-4023-b175-4a42ae725f78 req-ab1c65be-06d0-4813-a6fe-df0346be6e77 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received unexpected event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 for instance with vm_state active and task_state rescuing.
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.614 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:05 compute-0 nova_compute[182935]: 2026-01-21 23:58:05.620 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:58:05 compute-0 podman[221967]: 2026-01-21 23:58:05.80846352 +0000 UTC m=+0.069784928 container create cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:58:05 compute-0 systemd[1]: Started libpod-conmon-cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43.scope.
Jan 21 23:58:05 compute-0 podman[221967]: 2026-01-21 23:58:05.78040029 +0000 UTC m=+0.041721728 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:58:05 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:05.870 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:05 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:58:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93f301753df971308f165b037abc3b9e812c5dce99f317163104f50f44123bea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:58:05 compute-0 podman[221967]: 2026-01-21 23:58:05.911276626 +0000 UTC m=+0.172598054 container init cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 23:58:05 compute-0 podman[221967]: 2026-01-21 23:58:05.918763564 +0000 UTC m=+0.180084972 container start cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 21 23:58:05 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221983]: [NOTICE]   (221987) : New worker (221989) forked
Jan 21 23:58:05 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221983]: [NOTICE]   (221987) : Loading success.
Jan 21 23:58:06 compute-0 nova_compute[182935]: 2026-01-21 23:58:06.074 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:06 compute-0 nova_compute[182935]: 2026-01-21 23:58:06.765 182939 INFO nova.compute.manager [None req-af8e1dc4-bd2e-4783-a519-3e332e42a70c 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Unrescuing
Jan 21 23:58:06 compute-0 nova_compute[182935]: 2026-01-21 23:58:06.766 182939 DEBUG oslo_concurrency.lockutils [None req-af8e1dc4-bd2e-4783-a519-3e332e42a70c 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:06 compute-0 nova_compute[182935]: 2026-01-21 23:58:06.767 182939 DEBUG oslo_concurrency.lockutils [None req-af8e1dc4-bd2e-4783-a519-3e332e42a70c 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquired lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:06 compute-0 nova_compute[182935]: 2026-01-21 23:58:06.767 182939 DEBUG nova.network.neutron [None req-af8e1dc4-bd2e-4783-a519-3e332e42a70c 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:58:07 compute-0 nova_compute[182935]: 2026-01-21 23:58:07.892 182939 DEBUG nova.compute.manager [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:07 compute-0 nova_compute[182935]: 2026-01-21 23:58:07.893 182939 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:07 compute-0 nova_compute[182935]: 2026-01-21 23:58:07.893 182939 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:07 compute-0 nova_compute[182935]: 2026-01-21 23:58:07.893 182939 DEBUG oslo_concurrency.lockutils [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:07 compute-0 nova_compute[182935]: 2026-01-21 23:58:07.894 182939 DEBUG nova.compute.manager [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] No waiting events found dispatching network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:07 compute-0 nova_compute[182935]: 2026-01-21 23:58:07.894 182939 WARNING nova.compute.manager [req-32348627-0dc3-43ed-ab06-ba20b225048c req-65cc736a-5332-4c38-b7d8-0e7127202ae1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received unexpected event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 for instance with vm_state rescued and task_state unrescuing.
Jan 21 23:58:09 compute-0 nova_compute[182935]: 2026-01-21 23:58:09.646 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:10 compute-0 nova_compute[182935]: 2026-01-21 23:58:10.747 182939 DEBUG nova.network.neutron [None req-af8e1dc4-bd2e-4783-a519-3e332e42a70c 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Updating instance_info_cache with network_info: [{"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:10 compute-0 nova_compute[182935]: 2026-01-21 23:58:10.770 182939 DEBUG oslo_concurrency.lockutils [None req-af8e1dc4-bd2e-4783-a519-3e332e42a70c 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Releasing lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:10 compute-0 nova_compute[182935]: 2026-01-21 23:58:10.771 182939 DEBUG nova.objects.instance [None req-af8e1dc4-bd2e-4783-a519-3e332e42a70c 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'flavor' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:10 compute-0 kernel: tapbce17837-92 (unregistering): left promiscuous mode
Jan 21 23:58:10 compute-0 NetworkManager[55139]: <info>  [1769039890.8458] device (tapbce17837-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:58:10 compute-0 ovn_controller[95047]: 2026-01-21T23:58:10Z|00254|binding|INFO|Releasing lport bce17837-9218-4b02-868d-09dba821ce49 from this chassis (sb_readonly=0)
Jan 21 23:58:10 compute-0 nova_compute[182935]: 2026-01-21 23:58:10.850 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:10 compute-0 ovn_controller[95047]: 2026-01-21T23:58:10Z|00255|binding|INFO|Setting lport bce17837-9218-4b02-868d-09dba821ce49 down in Southbound
Jan 21 23:58:10 compute-0 ovn_controller[95047]: 2026-01-21T23:58:10Z|00256|binding|INFO|Removing iface tapbce17837-92 ovn-installed in OVS
Jan 21 23:58:10 compute-0 nova_compute[182935]: 2026-01-21 23:58:10.853 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:10.859 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:09:d8 10.100.0.11'], port_security=['fa:16:3e:7e:09:d8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bce17837-9218-4b02-868d-09dba821ce49) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:10.861 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bce17837-9218-4b02-868d-09dba821ce49 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis
Jan 21 23:58:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:10.862 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58cd83db-dcb3-409c-a108-07601ce5f67a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:58:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:10.865 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[40f0bdd2-b4d4-453c-bf20-9035c3191165]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:10 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:10.865 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace which is not needed anymore
Jan 21 23:58:10 compute-0 nova_compute[182935]: 2026-01-21 23:58:10.867 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:10 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000048.scope: Deactivated successfully.
Jan 21 23:58:10 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000048.scope: Consumed 5.849s CPU time.
Jan 21 23:58:10 compute-0 systemd-machined[154182]: Machine qemu-35-instance-00000048 terminated.
Jan 21 23:58:10 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221983]: [NOTICE]   (221987) : haproxy version is 2.8.14-c23fe91
Jan 21 23:58:10 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221983]: [NOTICE]   (221987) : path to executable is /usr/sbin/haproxy
Jan 21 23:58:10 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221983]: [WARNING]  (221987) : Exiting Master process...
Jan 21 23:58:10 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221983]: [ALERT]    (221987) : Current worker (221989) exited with code 143 (Terminated)
Jan 21 23:58:10 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[221983]: [WARNING]  (221987) : All workers exited. Exiting... (0)
Jan 21 23:58:10 compute-0 systemd[1]: libpod-cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43.scope: Deactivated successfully.
Jan 21 23:58:11 compute-0 podman[222022]: 2026-01-21 23:58:11.001258669 +0000 UTC m=+0.040115749 container died cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:58:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43-userdata-shm.mount: Deactivated successfully.
Jan 21 23:58:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-93f301753df971308f165b037abc3b9e812c5dce99f317163104f50f44123bea-merged.mount: Deactivated successfully.
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.034 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.039 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-0 podman[222022]: 2026-01-21 23:58:11.04146933 +0000 UTC m=+0.080326410 container cleanup cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:58:11 compute-0 systemd[1]: libpod-conmon-cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43.scope: Deactivated successfully.
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.059 182939 DEBUG nova.compute.manager [req-f587033c-a153-47c0-9ec2-f72c034b9c66 req-e91913ea-489b-4a42-89e3-520abce8c12b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-unplugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.060 182939 DEBUG oslo_concurrency.lockutils [req-f587033c-a153-47c0-9ec2-f72c034b9c66 req-e91913ea-489b-4a42-89e3-520abce8c12b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.060 182939 DEBUG oslo_concurrency.lockutils [req-f587033c-a153-47c0-9ec2-f72c034b9c66 req-e91913ea-489b-4a42-89e3-520abce8c12b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.060 182939 DEBUG oslo_concurrency.lockutils [req-f587033c-a153-47c0-9ec2-f72c034b9c66 req-e91913ea-489b-4a42-89e3-520abce8c12b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.060 182939 DEBUG nova.compute.manager [req-f587033c-a153-47c0-9ec2-f72c034b9c66 req-e91913ea-489b-4a42-89e3-520abce8c12b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] No waiting events found dispatching network-vif-unplugged-bce17837-9218-4b02-868d-09dba821ce49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.060 182939 WARNING nova.compute.manager [req-f587033c-a153-47c0-9ec2-f72c034b9c66 req-e91913ea-489b-4a42-89e3-520abce8c12b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received unexpected event network-vif-unplugged-bce17837-9218-4b02-868d-09dba821ce49 for instance with vm_state rescued and task_state unrescuing.
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.076 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.100 182939 INFO nova.virt.libvirt.driver [-] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Instance destroyed successfully.
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.101 182939 DEBUG nova.objects.instance [None req-af8e1dc4-bd2e-4783-a519-3e332e42a70c 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'numa_topology' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:11 compute-0 podman[222060]: 2026-01-21 23:58:11.107493067 +0000 UTC m=+0.044641798 container remove cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.114 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[37fa192b-d7fe-4524-9f4a-0d3f38603c9c]: (4, ('Wed Jan 21 11:58:10 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43)\ncae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43\nWed Jan 21 11:58:11 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (cae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43)\ncae25beacb1a99d7647eba84c96fc4067842103a0a714c6b943cf7aedba66b43\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.117 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[59cf9a99-3896-4e8c-85d2-c23b2ba07c25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.118 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:11 compute-0 kernel: tap58cd83db-d0: left promiscuous mode
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.119 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.134 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.138 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b547e125-a86f-4fc2-b08d-94ed1f15657b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.150 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b3bf887b-9b2a-46f9-af13-eff2cf489c02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.151 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e33689c8-b0b9-4747-a3d9-41107f92cbbc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.168 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[586e9666-07bc-4f96-b842-7e388ca03404]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438779, 'reachable_time': 19160, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222090, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 systemd[1]: run-netns-ovnmeta\x2d58cd83db\x2ddcb3\x2d409c\x2da108\x2d07601ce5f67a.mount: Deactivated successfully.
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.173 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.173 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[3e5b4ca2-1737-4731-aba7-5c0ec133d213]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 kernel: tapbce17837-92: entered promiscuous mode
Jan 21 23:58:11 compute-0 systemd-udevd[222002]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:58:11 compute-0 NetworkManager[55139]: <info>  [1769039891.2100] manager: (tapbce17837-92): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.209 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-0 ovn_controller[95047]: 2026-01-21T23:58:11Z|00257|binding|INFO|Claiming lport bce17837-9218-4b02-868d-09dba821ce49 for this chassis.
Jan 21 23:58:11 compute-0 ovn_controller[95047]: 2026-01-21T23:58:11Z|00258|binding|INFO|bce17837-9218-4b02-868d-09dba821ce49: Claiming fa:16:3e:7e:09:d8 10.100.0.11
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.221 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:09:d8 10.100.0.11'], port_security=['fa:16:3e:7e:09:d8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bce17837-9218-4b02-868d-09dba821ce49) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:58:11 compute-0 NetworkManager[55139]: <info>  [1769039891.2221] device (tapbce17837-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:58:11 compute-0 NetworkManager[55139]: <info>  [1769039891.2228] device (tapbce17837-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.222 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bce17837-9218-4b02-868d-09dba821ce49 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a bound to our chassis
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.224 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:58:11 compute-0 ovn_controller[95047]: 2026-01-21T23:58:11Z|00259|binding|INFO|Setting lport bce17837-9218-4b02-868d-09dba821ce49 ovn-installed in OVS
Jan 21 23:58:11 compute-0 ovn_controller[95047]: 2026-01-21T23:58:11Z|00260|binding|INFO|Setting lport bce17837-9218-4b02-868d-09dba821ce49 up in Southbound
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.225 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.227 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.236 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b123aa29-ccf0-42f4-a4dd-d6f3344a9e9b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.237 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap58cd83db-d1 in ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.239 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap58cd83db-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.239 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6d3e1105-fff1-436e-a04a-3cf673dd514b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.240 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7f75459b-24c6-4df3-a988-f77ffd5fc151]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 systemd-machined[154182]: New machine qemu-36-instance-00000048.
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.251 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[df48ab26-7efb-4c0a-9c0b-57cc0776e8a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 systemd[1]: Started Virtual Machine qemu-36-instance-00000048.
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.273 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff1e259-e7fd-43c5-996a-42f6334a8641]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.308 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[0802e0f4-78da-4aa6-925a-93d263897719]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.315 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[979e59cb-4750-4ac7-9bd3-41513e90ae2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 NetworkManager[55139]: <info>  [1769039891.3158] manager: (tap58cd83db-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/118)
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.351 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[29ffd5c5-9dd9-4b92-9cbb-a19d32949a7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.356 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[32f21ea8-69e9-4bff-856b-e07fe086ece8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 NetworkManager[55139]: <info>  [1769039891.3817] device (tap58cd83db-d0): carrier: link connected
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.388 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c11f5cc5-a231-4bc1-bd58-45335dbe7d03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.408 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e6cc28b0-19f1-48c9-ade3-b2e84b3898b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439412, 'reachable_time': 21509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222136, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.433 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cf87d9ee-d0be-4e97-af53-a00e44c6791c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:9a20'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439412, 'tstamp': 439412}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222137, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.455 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b38f99-8477-4655-8f75-02aac685af06]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439412, 'reachable_time': 21509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222140, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.492 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d4f08975-c2ad-41aa-a2e5-b82818f70b0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.537 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for ada4a724-2307-431d-8c29-075bfd90b43e due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.538 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039891.5368412, ada4a724-2307-431d-8c29-075bfd90b43e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.538 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] VM Resumed (Lifecycle Event)
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.541 182939 DEBUG nova.compute.manager [None req-af8e1dc4-bd2e-4783-a519-3e332e42a70c 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.567 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9657b18d-4ebc-4730-a199-c77507a11aad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.569 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.569 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.569 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.571 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-0 NetworkManager[55139]: <info>  [1769039891.5727] manager: (tap58cd83db-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Jan 21 23:58:11 compute-0 kernel: tap58cd83db-d0: entered promiscuous mode
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.574 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:58:11 compute-0 ovn_controller[95047]: 2026-01-21T23:58:11Z|00261|binding|INFO|Releasing lport 2d113249-07d3-443f-9b57-5f5a422d1c98 from this chassis (sb_readonly=0)
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.576 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.577 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.581 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.588 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.589 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.590 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1363dd42-d07a-4059-a45c-2379c464a582]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.591 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/58cd83db-dcb3-409c-a108-07601ce5f67a.pid.haproxy
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:58:11 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:58:11.592 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'env', 'PROCESS_TAG=haproxy-58cd83db-dcb3-409c-a108-07601ce5f67a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/58cd83db-dcb3-409c-a108-07601ce5f67a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.610 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039891.537925, ada4a724-2307-431d-8c29-075bfd90b43e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.610 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] VM Started (Lifecycle Event)
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.635 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:58:11 compute-0 nova_compute[182935]: 2026-01-21 23:58:11.641 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:58:12 compute-0 podman[222177]: 2026-01-21 23:58:12.00902923 +0000 UTC m=+0.055167229 container create b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:58:12 compute-0 systemd[1]: Started libpod-conmon-b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0.scope.
Jan 21 23:58:12 compute-0 podman[222177]: 2026-01-21 23:58:11.981081902 +0000 UTC m=+0.027219921 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:58:12 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:58:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cab24be1591f89b523de445b4286b9a88a7db37453a604a0f4ad4000dcbacd5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:58:12 compute-0 podman[222177]: 2026-01-21 23:58:12.110081483 +0000 UTC m=+0.156219502 container init b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 21 23:58:12 compute-0 podman[222177]: 2026-01-21 23:58:12.118335431 +0000 UTC m=+0.164473430 container start b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 21 23:58:12 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222192]: [NOTICE]   (222197) : New worker (222199) forked
Jan 21 23:58:12 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222192]: [NOTICE]   (222197) : Loading success.
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.184 182939 DEBUG nova.compute.manager [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.185 182939 DEBUG oslo_concurrency.lockutils [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.185 182939 DEBUG oslo_concurrency.lockutils [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.185 182939 DEBUG oslo_concurrency.lockutils [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.185 182939 DEBUG nova.compute.manager [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] No waiting events found dispatching network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.186 182939 WARNING nova.compute.manager [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received unexpected event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 for instance with vm_state active and task_state None.
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.186 182939 DEBUG nova.compute.manager [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.186 182939 DEBUG oslo_concurrency.lockutils [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.186 182939 DEBUG oslo_concurrency.lockutils [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.186 182939 DEBUG oslo_concurrency.lockutils [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.187 182939 DEBUG nova.compute.manager [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] No waiting events found dispatching network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.187 182939 WARNING nova.compute.manager [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received unexpected event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 for instance with vm_state active and task_state None.
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.187 182939 DEBUG nova.compute.manager [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.187 182939 DEBUG oslo_concurrency.lockutils [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.187 182939 DEBUG oslo_concurrency.lockutils [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.188 182939 DEBUG oslo_concurrency.lockutils [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.188 182939 DEBUG nova.compute.manager [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] No waiting events found dispatching network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:58:13 compute-0 nova_compute[182935]: 2026-01-21 23:58:13.188 182939 WARNING nova.compute.manager [req-2e8fc636-828d-4e03-a936-bca6a47fa78e req-ab0050f0-51bf-4c4e-8a53-0f0809ffe118 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received unexpected event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 for instance with vm_state active and task_state None.
Jan 21 23:58:13 compute-0 podman[222208]: 2026-01-21 23:58:13.69967918 +0000 UTC m=+0.071136670 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 21 23:58:14 compute-0 nova_compute[182935]: 2026-01-21 23:58:14.649 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:16 compute-0 nova_compute[182935]: 2026-01-21 23:58:16.078 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:17 compute-0 podman[222233]: 2026-01-21 23:58:17.681791552 +0000 UTC m=+0.053185701 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:58:19 compute-0 nova_compute[182935]: 2026-01-21 23:58:19.652 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:21 compute-0 nova_compute[182935]: 2026-01-21 23:58:21.080 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.307 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000048', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '011e84f966444a668bd6c0f5674f551f', 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'hostId': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.309 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.337 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.339 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2fe06a9-0a75-4e00-ab59-5df873469a79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-vda', 'timestamp': '2026-01-21T23:58:23.310191', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11d31e98-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': 'dca2e70c6261f235301c92f55016f65835189f15282d472bd64f4dd132006c5e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-sda', 'timestamp': '2026-01-21T23:58:23.310191', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11d339d2-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': '58db0102809b034becc0eab0708d0c87b72a3446a8abd508564cc8260c13d752'}]}, 'timestamp': '2026-01-21 23:58:23.339466', '_unique_id': '9d11a3a79b0d4ddc8c209d690564aace'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.346 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.347 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.347 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1788045668>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1788045668>]
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.347 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.351 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ada4a724-2307-431d-8c29-075bfd90b43e / tapbce17837-92 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.351 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97a8bf86-371e-4c3d-bbe2-406db26e99e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'instance-00000048-ada4a724-2307-431d-8c29-075bfd90b43e-tapbce17837-92', 'timestamp': '2026-01-21T23:58:23.347607', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'tapbce17837-92', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7e:09:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbce17837-92'}, 'message_id': '11d51d10-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.141898484, 'message_signature': 'b3f94793944d136feecaca283a7f55af6a16ee186245a8e9bd02b9887ca1b031'}]}, 'timestamp': '2026-01-21 23:58:23.351742', '_unique_id': 'd6ce5e18b1a6440da1ed73cca65fd20c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.352 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.353 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.353 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.353 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1788045668>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1788045668>]
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.353 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.364 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.365 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81b492ce-8bee-4c2c-a85e-cb9666d360dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-vda', 'timestamp': '2026-01-21T23:58:23.353967', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11d73050-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.148280716, 'message_signature': '4324113b030bedc9740143e92124dcec19c404f8452425f599a500cf252195d3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-sda', 'timestamp': '2026-01-21T23:58:23.353967', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11d73e2e-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.148280716, 'message_signature': '73a98459f92a2aaf6534ab4d2cfa09683f1245e8a611872fa5a4185722b9ef02'}]}, 'timestamp': '2026-01-21 23:58:23.365598', '_unique_id': 'b63cd2ccfc1a4dd4b3bbc5d581035b7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.366 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.367 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.367 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28fd5036-fb80-45e3-929e-e1e6c1246578', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'instance-00000048-ada4a724-2307-431d-8c29-075bfd90b43e-tapbce17837-92', 'timestamp': '2026-01-21T23:58:23.367428', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'tapbce17837-92', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7e:09:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbce17837-92'}, 'message_id': '11d78f82-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.141898484, 'message_signature': '1d2c669496701212e35df0a6108140032b68817adf8007c138b0127abfcfc885'}]}, 'timestamp': '2026-01-21 23:58:23.367678', '_unique_id': 'c3d17b7a5ca5459299c43d3d41a270c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.368 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f93d5f6-054b-4491-9f1e-d2c7b79bb3a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'instance-00000048-ada4a724-2307-431d-8c29-075bfd90b43e-tapbce17837-92', 'timestamp': '2026-01-21T23:58:23.368935', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'tapbce17837-92', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7e:09:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbce17837-92'}, 'message_id': '11d7ca4c-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.141898484, 'message_signature': 'c36d8279f1aade3603594fe637396e2372bf354df42840b75a7f98620feac3da'}]}, 'timestamp': '2026-01-21 23:58:23.369182', '_unique_id': '773b49ddd34d445d89ffbb795431d70f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.370 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.370 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.371 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd600fce-590d-4397-b87e-815973bb8c21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-vda', 'timestamp': '2026-01-21T23:58:23.370729', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11d812b8-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': 'b4accb24b0e71c2e4fec6b1ac329b7d1c01f897b3444509199d84f23948e0e37'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-sda', 'timestamp': '2026-01-21T23:58:23.370729', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11d81eac-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': '921d03cdc50a9d8e6f8ccaf66d654de5be0ccbb007421a846300e8e2e3922889'}]}, 'timestamp': '2026-01-21 23:58:23.371522', '_unique_id': 'c39228c14f384f51af8694027b65a594'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.372 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.394 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c05a47f4-7279-4850-a199-b1f68addfc8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'timestamp': '2026-01-21T23:58:23.373080', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '11dbb242-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.188414265, 'message_signature': '2a4122db66bb51818402306047760473a101e167641b380fcd97b80a6d8f6a26'}]}, 'timestamp': '2026-01-21 23:58:23.394911', '_unique_id': '1db1605128b34c2bad95dd55dbefc10e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.396 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.397 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.397 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0556a1ed-cab4-4514-af05-33f7c2ee655a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-vda', 'timestamp': '2026-01-21T23:58:23.397016', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11dc13cc-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': 'cf1e90ca9bb3c03c5ca8ae56056fcf348598c7867a3bd058d8905c145ff47813'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-sda', 'timestamp': '2026-01-21T23:58:23.397016', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11dc1ffc-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': '9b8d92095bab750044f1faa2b43a75b2f87236d2711aff9f76869a37dc49521d'}]}, 'timestamp': '2026-01-21 23:58:23.397615', '_unique_id': '5f9732a48771469cbec3b0487d8ad35e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.399 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.399 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ca854ac-6b3f-4a09-a964-697651a1e2ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'instance-00000048-ada4a724-2307-431d-8c29-075bfd90b43e-tapbce17837-92', 'timestamp': '2026-01-21T23:58:23.399224', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'tapbce17837-92', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7e:09:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbce17837-92'}, 'message_id': '11dc6a2a-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.141898484, 'message_signature': '36341bda329ea0b68c222b573199ff5c63d6e6de2791cb156024aeb35f80af18'}]}, 'timestamp': '2026-01-21 23:58:23.399493', '_unique_id': '96b187e3203f4708b99b8909378a55cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.400 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5b8fa74-8319-4ebe-b2c0-515870851b6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'instance-00000048-ada4a724-2307-431d-8c29-075bfd90b43e-tapbce17837-92', 'timestamp': '2026-01-21T23:58:23.401041', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'tapbce17837-92', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7e:09:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbce17837-92'}, 'message_id': '11dcb034-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.141898484, 'message_signature': '7c3f8969aefedc3b5f951c04fd73a779c984c54bcab710ce7e8270f8b97b21c6'}]}, 'timestamp': '2026-01-21 23:58:23.401281', '_unique_id': '92e1001ea11c4f8ca49f973c63727655'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.401 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.402 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.402 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.402 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c06f8908-87e5-4ae0-bfb4-23991ecfd3a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-vda', 'timestamp': '2026-01-21T23:58:23.402416', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11dce644-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.148280716, 'message_signature': '05312a1a5299982cb8f7a00c3acaff9e1df61103dee92cae11f46d2ed19bd896'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-sda', 'timestamp': '2026-01-21T23:58:23.402416', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11dcf0b2-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.148280716, 'message_signature': 'd5561951329abc8261f3bac08bfa665f6cc3e0a54488457a8b35d74be3c0b64c'}]}, 'timestamp': '2026-01-21 23:58:23.402921', '_unique_id': '0fea9fc4a20f4ae4b8442680fbb137c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d64adae-21b2-4e08-a794-e2406ffa8354', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'instance-00000048-ada4a724-2307-431d-8c29-075bfd90b43e-tapbce17837-92', 'timestamp': '2026-01-21T23:58:23.404190', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'tapbce17837-92', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7e:09:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbce17837-92'}, 'message_id': '11dd2b04-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.141898484, 'message_signature': 'b4375998fb596c593ece8e0d2bab9178c5575fe8a10c175ed61fcb8c397fd1fa'}]}, 'timestamp': '2026-01-21 23:58:23.404426', '_unique_id': '57fde7722e844591915af14da87421f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.404 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.405 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.405 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dafed834-5a15-4bf4-b5f8-484124a9c565', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-vda', 'timestamp': '2026-01-21T23:58:23.405893', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11dd6d6c-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.148280716, 'message_signature': '4b5eab3e7d7dab8be256eb277a9427d436d3faa0a6515360c03d37ce011cb5a2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-sda', 'timestamp': '2026-01-21T23:58:23.405893', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11dd75a0-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.148280716, 'message_signature': '0827f767e7c24d02bded351fb711f72f93889833cb83c318ce45ae1cf2cc7f28'}]}, 'timestamp': '2026-01-21 23:58:23.406318', '_unique_id': '77de0ce50d2e436ca3df4220c59a5e06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.406 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.407 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.407 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e8ba6e1-8ddb-4ca7-8666-dc2b46ded7cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'instance-00000048-ada4a724-2307-431d-8c29-075bfd90b43e-tapbce17837-92', 'timestamp': '2026-01-21T23:58:23.407469', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'tapbce17837-92', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7e:09:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbce17837-92'}, 'message_id': '11ddab06-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.141898484, 'message_signature': '363cf5d3ee8ec3f4c1f8d8054e66258ead95e75c4465bfad864dd204e8aa3dfc'}]}, 'timestamp': '2026-01-21 23:58:23.407701', '_unique_id': 'fa2cb4dae4804c49a9c29e68b2f08d06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.408 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.409 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '348ab7bc-46ce-4d8f-9cc8-3f7e677f8f77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'instance-00000048-ada4a724-2307-431d-8c29-075bfd90b43e-tapbce17837-92', 'timestamp': '2026-01-21T23:58:23.408994', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'tapbce17837-92', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7e:09:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbce17837-92'}, 'message_id': '11dde7d8-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.141898484, 'message_signature': 'b06759142dfc91897ae2cb343450652fb120293c180f1a24f386ef8b5dbd795d'}]}, 'timestamp': '2026-01-21 23:58:23.409432', '_unique_id': '57f8de7f98b6491f91b2de7e5090680f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.410 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.411 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.411 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.411 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1788045668>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1788045668>]
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.411 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.411 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.read.latency volume: 158787117 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.411 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.read.latency volume: 304688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb06a7ca-5670-4454-895e-2f27392dae44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 158787117, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-vda', 'timestamp': '2026-01-21T23:58:23.411689', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11de515a-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': 'adbab235f3d0de117884b587ef866acfdbb2c0baba377a00ef240978c872e30b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 304688, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-sda', 'timestamp': '2026-01-21T23:58:23.411689', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11de5a4c-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': '8172bfd7af5c240cfc8656d8d8c40c5265e84646bc38560501300bdb7dbb697c'}]}, 'timestamp': '2026-01-21 23:58:23.412197', '_unique_id': '1fc8896ad9244fbd8da371120095fa27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.413 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.413 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/cpu volume: 11090000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f789512d-4dfa-4109-9fdb-cee3fa572612', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11090000000, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'timestamp': '2026-01-21T23:58:23.413650', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '11de9f84-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.188414265, 'message_signature': 'c26fb00bf7e947dc47ffe366174a4acaa3e3c62df457ffce2853329db542f8df'}]}, 'timestamp': '2026-01-21 23:58:23.413974', '_unique_id': '72e03b6ba8ce4473b0ec19bda17d3ecc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.415 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.415 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.415 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1f35538-290e-4440-a804-5fd5be2f4180', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-vda', 'timestamp': '2026-01-21T23:58:23.415122', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11ded5da-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': 'ea788222e3aa7753e56a37e3b617ab3f2c07d874673a9db4677f4ec1d57f8797'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-sda', 'timestamp': '2026-01-21T23:58:23.415122', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11dede40-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': 'a5acec05c7a65ac7785005a390966495f38351cbfa50db5cfa04d06ca7ef2230'}]}, 'timestamp': '2026-01-21 23:58:23.415550', '_unique_id': '23654ff690dd49fa903cafc284f3b615'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.417 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.417 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.417 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd459d3b-bb35-4b07-9285-e75737f30d1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-vda', 'timestamp': '2026-01-21T23:58:23.417273', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '11df2c1a-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': '454e514fbdb978c89a2d9c4e3f7c7e9c8e54d8bcf6d0bfe0a3169b7a272fb27a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'ada4a724-2307-431d-8c29-075bfd90b43e-sda', 'timestamp': '2026-01-21T23:58:23.417273', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'instance-00000048', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '11df362e-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.104518901, 'message_signature': '03e6d6fbd19d680839ba9cbbe10829fa82d032908173d30517f6da2ebf7169b7'}]}, 'timestamp': '2026-01-21 23:58:23.417826', '_unique_id': '65668d844c974b6daffe0764460efaac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.418 12 ERROR oslo_messaging.notify.messaging 
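The chained traceback above shows kombu's `_reraise_as_library_errors` converting the socket-level `ConnectionRefusedError` (errno 111, no broker listening on the AMQP port) into a library-level `kombu.exceptions.OperationalError` via `raise ConnectionError(str(exc)) from exc`, which is why the log prints two tracebacks joined by "The above exception was the direct cause of the following exception". A minimal stdlib-only sketch of that re-raise-from pattern, using a stand-in `OperationalError` class and a deliberately unreachable local port (both are assumptions for illustration, not the real kombu code path):

```python
import socket


class OperationalError(Exception):
    """Stand-in for kombu.exceptions.OperationalError (illustration only)."""


def connect_or_reraise():
    # Reserve an ephemeral port, then close the listener so nothing
    # accepts on it; a subsequent connect() is refused (errno 111 on Linux).
    probe = socket.socket()
    probe.bind(("127.0.0.1", 0))
    port = probe.getsockname()[1]
    probe.close()

    try:
        s = socket.socket()
        s.settimeout(2)
        try:
            s.connect(("127.0.0.1", port))
        finally:
            s.close()
    except OSError as exc:
        # Same shape as kombu's _reraise_as_library_errors: the low-level
        # error is preserved as __cause__ of the library-level one.
        raise OperationalError(str(exc)) from exc


try:
    connect_or_reraise()
except OperationalError as e:
    # e.__cause__ is the original ConnectionRefusedError
    print(type(e.__cause__).__name__)
```

Because the agent's connection pool re-runs this path on every poll cycle, the identical pair of tracebacks repeats for each sample until the broker accepts connections again.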
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef7b1f62-29b9-4f0f-b102-1722d521d347', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'instance-00000048-ada4a724-2307-431d-8c29-075bfd90b43e-tapbce17837-92', 'timestamp': '2026-01-21T23:58:23.419233', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'tapbce17837-92', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7e:09:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbce17837-92'}, 'message_id': '11df7710-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.141898484, 'message_signature': 'bc4b738c1be8c524d5a2a98b3003de1cc4de7be901d037eb5470b325471520ee'}]}, 'timestamp': '2026-01-21 23:58:23.419478', '_unique_id': 'dd9bb22491324fb1a5ce2914897dade4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.420 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.420 12 DEBUG ceilometer.compute.pollsters [-] ada4a724-2307-431d-8c29-075bfd90b43e/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d64e69e-dc99-48c1-afd7-82e5a64f2201', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_name': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_name': None, 'resource_id': 'instance-00000048-ada4a724-2307-431d-8c29-075bfd90b43e-tapbce17837-92', 'timestamp': '2026-01-21T23:58:23.420785', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-1788045668', 'name': 'tapbce17837-92', 'instance_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'instance_type': 'm1.nano', 'host': '5dce40225de3d60602b61665e6cade089bd81ed1c6b975a094b51551', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7e:09:d8', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbce17837-92'}, 'message_id': '11dfb5fe-f725-11f0-9743-fa163e6b0dfb', 'monotonic_time': 4406.141898484, 'message_signature': '27e64970f5cb37e31fea39b73bd0d73dc4b408a89c47607a999722e33accdb18'}]}, 'timestamp': '2026-01-21 23:58:23.421603', '_unique_id': '48bf35afc1a14fa98dd9805d6531b46e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     yield
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.423 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.423 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 21 23:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-21 23:58:23.423 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1788045668>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-1788045668>]
Jan 21 23:58:24 compute-0 nova_compute[182935]: 2026-01-21 23:58:24.653 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:24 compute-0 podman[222264]: 2026-01-21 23:58:24.695257298 +0000 UTC m=+0.065726231 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 21 23:58:24 compute-0 podman[222265]: 2026-01-21 23:58:24.712863819 +0000 UTC m=+0.079550061 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 21 23:58:24 compute-0 ovn_controller[95047]: 2026-01-21T23:58:24Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7e:09:d8 10.100.0.11
Jan 21 23:58:26 compute-0 nova_compute[182935]: 2026-01-21 23:58:26.082 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:26 compute-0 nova_compute[182935]: 2026-01-21 23:58:26.820 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:26 compute-0 nova_compute[182935]: 2026-01-21 23:58:26.821 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:58:26 compute-0 nova_compute[182935]: 2026-01-21 23:58:26.821 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:58:27 compute-0 nova_compute[182935]: 2026-01-21 23:58:27.439 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:58:27 compute-0 nova_compute[182935]: 2026-01-21 23:58:27.439 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:58:27 compute-0 nova_compute[182935]: 2026-01-21 23:58:27.440 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:58:27 compute-0 nova_compute[182935]: 2026-01-21 23:58:27.440 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.656 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.699 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Updating instance_info_cache with network_info: [{"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.728 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.729 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.730 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.730 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.730 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.731 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.764 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.764 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.765 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.765 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.860 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:29 compute-0 sshd-session[222306]: Invalid user git from 188.166.69.60 port 55730
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.928 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.930 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:29 compute-0 nova_compute[182935]: 2026-01-21 23:58:29.990 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:30 compute-0 sshd-session[222306]: Connection closed by invalid user git 188.166.69.60 port 55730 [preauth]
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.161 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.164 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5513MB free_disk=73.20825958251953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.165 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.166 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.447 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance ada4a724-2307-431d-8c29-075bfd90b43e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.448 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.448 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.658 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.679 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.702 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.703 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.768 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:30 compute-0 nova_compute[182935]: 2026-01-21 23:58:30.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:31 compute-0 nova_compute[182935]: 2026-01-21 23:58:31.086 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:31 compute-0 nova_compute[182935]: 2026-01-21 23:58:31.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:32 compute-0 nova_compute[182935]: 2026-01-21 23:58:32.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:34 compute-0 nova_compute[182935]: 2026-01-21 23:58:34.658 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:35 compute-0 podman[222316]: 2026-01-21 23:58:35.717760491 +0000 UTC m=+0.084661484 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 21 23:58:35 compute-0 podman[222315]: 2026-01-21 23:58:35.73786266 +0000 UTC m=+0.101101465 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 21 23:58:36 compute-0 nova_compute[182935]: 2026-01-21 23:58:36.124 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:39 compute-0 nova_compute[182935]: 2026-01-21 23:58:39.660 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:40 compute-0 nova_compute[182935]: 2026-01-21 23:58:40.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:58:41 compute-0 nova_compute[182935]: 2026-01-21 23:58:41.126 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:44 compute-0 nova_compute[182935]: 2026-01-21 23:58:44.661 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:44 compute-0 podman[222365]: 2026-01-21 23:58:44.723520704 +0000 UTC m=+0.091975266 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:58:46 compute-0 nova_compute[182935]: 2026-01-21 23:58:46.174 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:48 compute-0 podman[222391]: 2026-01-21 23:58:48.687241601 +0000 UTC m=+0.064262038 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 21 23:58:49 compute-0 nova_compute[182935]: 2026-01-21 23:58:49.664 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:51 compute-0 nova_compute[182935]: 2026-01-21 23:58:51.178 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:54 compute-0 nova_compute[182935]: 2026-01-21 23:58:54.666 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:55 compute-0 podman[222411]: 2026-01-21 23:58:55.69314205 +0000 UTC m=+0.065321613 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 21 23:58:55 compute-0 podman[222410]: 2026-01-21 23:58:55.718917011 +0000 UTC m=+0.086767550 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter)
Jan 21 23:58:56 compute-0 nova_compute[182935]: 2026-01-21 23:58:56.180 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.145 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.145 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.180 182939 DEBUG nova.compute.manager [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.307 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.307 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.313 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.314 182939 INFO nova.compute.claims [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.454 182939 DEBUG nova.compute.provider_tree [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.481 182939 DEBUG nova.scheduler.client.report [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.517 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.518 182939 DEBUG nova.compute.manager [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.616 182939 DEBUG nova.compute.manager [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.616 182939 DEBUG nova.network.neutron [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.652 182939 INFO nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.692 182939 DEBUG nova.compute.manager [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.849 182939 DEBUG nova.compute.manager [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.851 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.851 182939 INFO nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Creating image(s)
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.852 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.852 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.853 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.866 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.925 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.926 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.927 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.938 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.960 182939 DEBUG nova.policy [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '55710edfd4b24e368807c8b5087ec91c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '011e84f966444a668bd6c0f5674f551f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.996 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:58 compute-0 nova_compute[182935]: 2026-01-21 23:58:58.997 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.031 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.033 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.033 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.091 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.092 182939 DEBUG nova.virt.disk.api [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Checking if we can resize image /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.093 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.149 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.150 182939 DEBUG nova.virt.disk.api [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Cannot resize image /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.150 182939 DEBUG nova.objects.instance [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'migration_context' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.167 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.168 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Ensure instance console log exists: /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.168 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.168 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.169 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.669 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:58:59 compute-0 nova_compute[182935]: 2026-01-21 23:58:59.712 182939 DEBUG nova.network.neutron [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Successfully created port: 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:59:00 compute-0 nova_compute[182935]: 2026-01-21 23:59:00.800 182939 DEBUG nova.network.neutron [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Successfully updated port: 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:59:00 compute-0 nova_compute[182935]: 2026-01-21 23:59:00.817 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:00 compute-0 nova_compute[182935]: 2026-01-21 23:59:00.817 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquired lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:00 compute-0 nova_compute[182935]: 2026-01-21 23:59:00.818 182939 DEBUG nova.network.neutron [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:59:00 compute-0 nova_compute[182935]: 2026-01-21 23:59:00.941 182939 DEBUG nova.compute.manager [req-9c0cbb7d-ee3c-4811-9777-737b80030048 req-ae61e630-b7f3-4ecd-a06a-79ccbfa30d44 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-changed-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:00 compute-0 nova_compute[182935]: 2026-01-21 23:59:00.941 182939 DEBUG nova.compute.manager [req-9c0cbb7d-ee3c-4811-9777-737b80030048 req-ae61e630-b7f3-4ecd-a06a-79ccbfa30d44 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Refreshing instance network info cache due to event network-changed-3b31ee50-1828-4f0b-b32d-f77ee76a8c63. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:59:00 compute-0 nova_compute[182935]: 2026-01-21 23:59:00.941 182939 DEBUG oslo_concurrency.lockutils [req-9c0cbb7d-ee3c-4811-9777-737b80030048 req-ae61e630-b7f3-4ecd-a06a-79ccbfa30d44 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.097 182939 DEBUG nova.network.neutron [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.182 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.948 182939 DEBUG nova.network.neutron [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Updating instance_info_cache with network_info: [{"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.964 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Releasing lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.965 182939 DEBUG nova.compute.manager [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Instance network_info: |[{"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.965 182939 DEBUG oslo_concurrency.lockutils [req-9c0cbb7d-ee3c-4811-9777-737b80030048 req-ae61e630-b7f3-4ecd-a06a-79ccbfa30d44 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.965 182939 DEBUG nova.network.neutron [req-9c0cbb7d-ee3c-4811-9777-737b80030048 req-ae61e630-b7f3-4ecd-a06a-79ccbfa30d44 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Refreshing network info cache for port 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.968 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Start _get_guest_xml network_info=[{"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.972 182939 WARNING nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.978 182939 DEBUG nova.virt.libvirt.host [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.979 182939 DEBUG nova.virt.libvirt.host [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.981 182939 DEBUG nova.virt.libvirt.host [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.982 182939 DEBUG nova.virt.libvirt.host [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.983 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.983 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.983 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.984 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.984 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.984 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.984 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.984 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.984 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.984 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.985 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.985 182939 DEBUG nova.virt.hardware [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.988 182939 DEBUG nova.virt.libvirt.vif [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:58:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-878012212',display_name='tempest-ServerStableDeviceRescueTest-server-878012212',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-878012212',id=76,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-zglabz6y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_user_name='tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:58:58Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=30452704-b180-41c6-98c4-8b168b3bc5e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.989 182939 DEBUG nova.network.os_vif_util [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.989 182939 DEBUG nova.network.os_vif_util [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=3b31ee50-1828-4f0b-b32d-f77ee76a8c63,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b31ee50-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:01 compute-0 nova_compute[182935]: 2026-01-21 23:59:01.990 182939 DEBUG nova.objects.instance [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'pci_devices' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.008 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:59:02 compute-0 nova_compute[182935]:   <uuid>30452704-b180-41c6-98c4-8b168b3bc5e9</uuid>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   <name>instance-0000004c</name>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-878012212</nova:name>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:59:01</nova:creationTime>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:59:02 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:59:02 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:59:02 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:59:02 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:59:02 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:59:02 compute-0 nova_compute[182935]:         <nova:user uuid="55710edfd4b24e368807c8b5087ec91c">tempest-ServerStableDeviceRescueTest-1256721315-project-member</nova:user>
Jan 21 23:59:02 compute-0 nova_compute[182935]:         <nova:project uuid="011e84f966444a668bd6c0f5674f551f">tempest-ServerStableDeviceRescueTest-1256721315</nova:project>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:59:02 compute-0 nova_compute[182935]:         <nova:port uuid="3b31ee50-1828-4f0b-b32d-f77ee76a8c63">
Jan 21 23:59:02 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <system>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <entry name="serial">30452704-b180-41c6-98c4-8b168b3bc5e9</entry>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <entry name="uuid">30452704-b180-41c6-98c4-8b168b3bc5e9</entry>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     </system>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   <os>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   </os>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   <features>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   </features>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.config"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:b4:b3:f6"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <target dev="tap3b31ee50-18"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/console.log" append="off"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <video>
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     </video>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:59:02 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:59:02 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:59:02 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:59:02 compute-0 nova_compute[182935]: </domain>
Jan 21 23:59:02 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.008 182939 DEBUG nova.compute.manager [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Preparing to wait for external event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.009 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.009 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.009 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.010 182939 DEBUG nova.virt.libvirt.vif [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:58:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-878012212',display_name='tempest-ServerStableDeviceRescueTest-server-878012212',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-878012212',id=76,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-zglabz6y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_user_n
ame='tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:58:58Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=30452704-b180-41c6-98c4-8b168b3bc5e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.010 182939 DEBUG nova.network.os_vif_util [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.010 182939 DEBUG nova.network.os_vif_util [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=3b31ee50-1828-4f0b-b32d-f77ee76a8c63,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b31ee50-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.011 182939 DEBUG os_vif [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=3b31ee50-1828-4f0b-b32d-f77ee76a8c63,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b31ee50-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.011 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.012 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.012 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.016 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.016 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b31ee50-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.017 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3b31ee50-18, col_values=(('external_ids', {'iface-id': '3b31ee50-1828-4f0b-b32d-f77ee76a8c63', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:b3:f6', 'vm-uuid': '30452704-b180-41c6-98c4-8b168b3bc5e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.018 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:02 compute-0 NetworkManager[55139]: <info>  [1769039942.0199] manager: (tap3b31ee50-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.021 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.024 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.026 182939 INFO os_vif [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=3b31ee50-1828-4f0b-b32d-f77ee76a8c63,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b31ee50-18')
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.103 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.104 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.104 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No VIF found with MAC fa:16:3e:b4:b3:f6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.105 182939 INFO nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Using config drive
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.458 182939 INFO nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Creating config drive at /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.config
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.463 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzhvptgf6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.593 182939 DEBUG oslo_concurrency.processutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzhvptgf6" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:02 compute-0 kernel: tap3b31ee50-18: entered promiscuous mode
Jan 21 23:59:02 compute-0 NetworkManager[55139]: <info>  [1769039942.6579] manager: (tap3b31ee50-18): new Tun device (/org/freedesktop/NetworkManager/Devices/121)
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.658 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:02 compute-0 ovn_controller[95047]: 2026-01-21T23:59:02Z|00262|binding|INFO|Claiming lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for this chassis.
Jan 21 23:59:02 compute-0 ovn_controller[95047]: 2026-01-21T23:59:02Z|00263|binding|INFO|3b31ee50-1828-4f0b-b32d-f77ee76a8c63: Claiming fa:16:3e:b4:b3:f6 10.100.0.4
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.666 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b3:f6 10.100.0.4'], port_security=['fa:16:3e:b4:b3:f6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '30452704-b180-41c6-98c4-8b168b3bc5e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=3b31ee50-1828-4f0b-b32d-f77ee76a8c63) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.669 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a bound to our chassis
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.671 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:59:02 compute-0 ovn_controller[95047]: 2026-01-21T23:59:02Z|00264|binding|INFO|Setting lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 ovn-installed in OVS
Jan 21 23:59:02 compute-0 ovn_controller[95047]: 2026-01-21T23:59:02Z|00265|binding|INFO|Setting lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 up in Southbound
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.673 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.675 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.689 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d62c9404-3d24-4317-b50f-9ca3cf31645c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:02 compute-0 systemd-udevd[222485]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:59:02 compute-0 systemd-machined[154182]: New machine qemu-37-instance-0000004c.
Jan 21 23:59:02 compute-0 NetworkManager[55139]: <info>  [1769039942.7058] device (tap3b31ee50-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:59:02 compute-0 NetworkManager[55139]: <info>  [1769039942.7068] device (tap3b31ee50-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:59:02 compute-0 systemd[1]: Started Virtual Machine qemu-37-instance-0000004c.
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.722 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[988ce305-60e1-4c6c-aca8-c2c9a5eb4143]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.726 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[4cabb66b-7d6c-41c4-ba09-24691ea009d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.752 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[97a20f70-02a7-41b4-b6d6-81cecd7776c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.767 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1b67a5ac-a34b-4b18-ad16-84df02964f69]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439412, 'reachable_time': 21509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222498, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.782 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6b13e457-3124-43c0-b03e-616f96139960]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439426, 'tstamp': 439426}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222499, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439429, 'tstamp': 439429}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222499, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.784 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:02 compute-0 nova_compute[182935]: 2026-01-21 23:59:02.787 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.788 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.788 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.788 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:02 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:02.788 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.079 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039943.07837, 30452704-b180-41c6-98c4-8b168b3bc5e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.079 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] VM Started (Lifecycle Event)
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.116 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.121 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039943.0786679, 30452704-b180-41c6-98c4-8b168b3bc5e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.122 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] VM Paused (Lifecycle Event)
Jan 21 23:59:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:03.192 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:03.193 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:03 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:03.193 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.414 182939 DEBUG nova.compute.manager [req-39b04ce4-c002-46d5-9bf3-571683b38673 req-42f9856d-2722-4be5-9165-9fff38f52b8f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.415 182939 DEBUG oslo_concurrency.lockutils [req-39b04ce4-c002-46d5-9bf3-571683b38673 req-42f9856d-2722-4be5-9165-9fff38f52b8f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.415 182939 DEBUG oslo_concurrency.lockutils [req-39b04ce4-c002-46d5-9bf3-571683b38673 req-42f9856d-2722-4be5-9165-9fff38f52b8f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.416 182939 DEBUG oslo_concurrency.lockutils [req-39b04ce4-c002-46d5-9bf3-571683b38673 req-42f9856d-2722-4be5-9165-9fff38f52b8f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.416 182939 DEBUG nova.compute.manager [req-39b04ce4-c002-46d5-9bf3-571683b38673 req-42f9856d-2722-4be5-9165-9fff38f52b8f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Processing event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.417 182939 DEBUG nova.compute.manager [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.421 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.424 182939 INFO nova.virt.libvirt.driver [-] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Instance spawned successfully.
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.424 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.429 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.434 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039943.420368, 30452704-b180-41c6-98c4-8b168b3bc5e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.434 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] VM Resumed (Lifecycle Event)
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.445 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.445 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.446 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.446 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.446 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.447 182939 DEBUG nova.virt.libvirt.driver [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.741 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.744 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.765 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.788 182939 INFO nova.compute.manager [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Took 4.94 seconds to spawn the instance on the hypervisor.
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.789 182939 DEBUG nova.compute.manager [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.841 182939 DEBUG nova.network.neutron [req-9c0cbb7d-ee3c-4811-9777-737b80030048 req-ae61e630-b7f3-4ecd-a06a-79ccbfa30d44 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Updated VIF entry in instance network info cache for port 3b31ee50-1828-4f0b-b32d-f77ee76a8c63. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.841 182939 DEBUG nova.network.neutron [req-9c0cbb7d-ee3c-4811-9777-737b80030048 req-ae61e630-b7f3-4ecd-a06a-79ccbfa30d44 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Updating instance_info_cache with network_info: [{"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.870 182939 DEBUG oslo_concurrency.lockutils [req-9c0cbb7d-ee3c-4811-9777-737b80030048 req-ae61e630-b7f3-4ecd-a06a-79ccbfa30d44 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.877 182939 INFO nova.compute.manager [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Took 5.61 seconds to build instance.
Jan 21 23:59:03 compute-0 nova_compute[182935]: 2026-01-21 23:59:03.894 182939 DEBUG oslo_concurrency.lockutils [None req-ab7996a9-0871-4ff9-8e87-70ec2fc5cddc 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.749s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:06 compute-0 nova_compute[182935]: 2026-01-21 23:59:06.184 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:06 compute-0 nova_compute[182935]: 2026-01-21 23:59:06.558 182939 DEBUG nova.compute.manager [req-84ee84ac-93f6-4e85-ae4b-9009ebbd6ea5 req-5e400bc4-d648-4e2d-b758-e5f9fcb1dc1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:06 compute-0 nova_compute[182935]: 2026-01-21 23:59:06.559 182939 DEBUG oslo_concurrency.lockutils [req-84ee84ac-93f6-4e85-ae4b-9009ebbd6ea5 req-5e400bc4-d648-4e2d-b758-e5f9fcb1dc1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:06 compute-0 nova_compute[182935]: 2026-01-21 23:59:06.559 182939 DEBUG oslo_concurrency.lockutils [req-84ee84ac-93f6-4e85-ae4b-9009ebbd6ea5 req-5e400bc4-d648-4e2d-b758-e5f9fcb1dc1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:06 compute-0 nova_compute[182935]: 2026-01-21 23:59:06.560 182939 DEBUG oslo_concurrency.lockutils [req-84ee84ac-93f6-4e85-ae4b-9009ebbd6ea5 req-5e400bc4-d648-4e2d-b758-e5f9fcb1dc1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:06 compute-0 nova_compute[182935]: 2026-01-21 23:59:06.560 182939 DEBUG nova.compute.manager [req-84ee84ac-93f6-4e85-ae4b-9009ebbd6ea5 req-5e400bc4-d648-4e2d-b758-e5f9fcb1dc1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] No waiting events found dispatching network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:06 compute-0 nova_compute[182935]: 2026-01-21 23:59:06.560 182939 WARNING nova.compute.manager [req-84ee84ac-93f6-4e85-ae4b-9009ebbd6ea5 req-5e400bc4-d648-4e2d-b758-e5f9fcb1dc1d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received unexpected event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for instance with vm_state active and task_state None.
Jan 21 23:59:06 compute-0 podman[222509]: 2026-01-21 23:59:06.691397438 +0000 UTC m=+0.059760830 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 21 23:59:06 compute-0 podman[222508]: 2026-01-21 23:59:06.726743238 +0000 UTC m=+0.095711105 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 21 23:59:07 compute-0 nova_compute[182935]: 2026-01-21 23:59:07.043 182939 DEBUG nova.compute.manager [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:07 compute-0 nova_compute[182935]: 2026-01-21 23:59:07.067 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:07 compute-0 nova_compute[182935]: 2026-01-21 23:59:07.185 182939 INFO nova.compute.manager [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] instance snapshotting
Jan 21 23:59:07 compute-0 nova_compute[182935]: 2026-01-21 23:59:07.801 182939 INFO nova.virt.libvirt.driver [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Beginning live snapshot process
Jan 21 23:59:08 compute-0 virtqemud[182477]: invalid argument: disk vda does not have an active block job
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.010 182939 DEBUG oslo_concurrency.processutils [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.110 182939 DEBUG oslo_concurrency.processutils [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk --force-share --output=json -f qcow2" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.111 182939 DEBUG oslo_concurrency.processutils [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.206 182939 DEBUG oslo_concurrency.processutils [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk --force-share --output=json -f qcow2" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.229 182939 DEBUG oslo_concurrency.processutils [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.303 182939 DEBUG oslo_concurrency.processutils [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.304 182939 DEBUG oslo_concurrency.processutils [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp8xb3si0e/4046f6fdbd6744539d8275f841bd9d70.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.361 182939 DEBUG oslo_concurrency.processutils [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp8xb3si0e/4046f6fdbd6744539d8275f841bd9d70.delta 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.363 182939 INFO nova.virt.libvirt.driver [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.433 182939 DEBUG nova.virt.libvirt.guest [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.439 182939 INFO nova.virt.libvirt.driver [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.492 182939 DEBUG nova.privsep.utils [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.493 182939 DEBUG oslo_concurrency.processutils [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp8xb3si0e/4046f6fdbd6744539d8275f841bd9d70.delta /var/lib/nova/instances/snapshots/tmp8xb3si0e/4046f6fdbd6744539d8275f841bd9d70 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.669 182939 DEBUG oslo_concurrency.processutils [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp8xb3si0e/4046f6fdbd6744539d8275f841bd9d70.delta /var/lib/nova/instances/snapshots/tmp8xb3si0e/4046f6fdbd6744539d8275f841bd9d70" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:08 compute-0 nova_compute[182935]: 2026-01-21 23:59:08.671 182939 INFO nova.virt.libvirt.driver [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Snapshot extracted, beginning image upload
Jan 21 23:59:10 compute-0 nova_compute[182935]: 2026-01-21 23:59:10.826 182939 INFO nova.virt.libvirt.driver [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Snapshot image upload complete
Jan 21 23:59:10 compute-0 nova_compute[182935]: 2026-01-21 23:59:10.830 182939 INFO nova.compute.manager [None req-29924f5b-47f1-4562-9b34-59ca478ed608 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Took 3.62 seconds to snapshot the instance on the hypervisor.
Jan 21 23:59:11 compute-0 nova_compute[182935]: 2026-01-21 23:59:11.187 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:12 compute-0 nova_compute[182935]: 2026-01-21 23:59:12.069 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:13 compute-0 sshd-session[222589]: Invalid user git from 188.166.69.60 port 53142
Jan 21 23:59:13 compute-0 sshd-session[222589]: Connection closed by invalid user git 188.166.69.60 port 53142 [preauth]
Jan 21 23:59:13 compute-0 nova_compute[182935]: 2026-01-21 23:59:13.930 182939 INFO nova.compute.manager [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Rescuing
Jan 21 23:59:13 compute-0 nova_compute[182935]: 2026-01-21 23:59:13.933 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:13 compute-0 nova_compute[182935]: 2026-01-21 23:59:13.933 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquired lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:13 compute-0 nova_compute[182935]: 2026-01-21 23:59:13.933 182939 DEBUG nova.network.neutron [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:59:15 compute-0 nova_compute[182935]: 2026-01-21 23:59:15.511 182939 DEBUG nova.network.neutron [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Updating instance_info_cache with network_info: [{"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:15 compute-0 nova_compute[182935]: 2026-01-21 23:59:15.536 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Releasing lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:15 compute-0 podman[222608]: 2026-01-21 23:59:15.706961546 +0000 UTC m=+0.083090602 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:59:15 compute-0 nova_compute[182935]: 2026-01-21 23:59:15.867 182939 DEBUG nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:59:16 compute-0 nova_compute[182935]: 2026-01-21 23:59:16.222 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:17 compute-0 nova_compute[182935]: 2026-01-21 23:59:17.074 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:17 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:17.768 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:17 compute-0 nova_compute[182935]: 2026-01-21 23:59:17.768 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:17 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:17.770 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 21 23:59:19 compute-0 kernel: tap3b31ee50-18 (unregistering): left promiscuous mode
Jan 21 23:59:19 compute-0 NetworkManager[55139]: <info>  [1769039959.0089] device (tap3b31ee50-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.025 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:19 compute-0 ovn_controller[95047]: 2026-01-21T23:59:19Z|00266|binding|INFO|Releasing lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 from this chassis (sb_readonly=0)
Jan 21 23:59:19 compute-0 ovn_controller[95047]: 2026-01-21T23:59:19Z|00267|binding|INFO|Setting lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 down in Southbound
Jan 21 23:59:19 compute-0 ovn_controller[95047]: 2026-01-21T23:59:19Z|00268|binding|INFO|Removing iface tap3b31ee50-18 ovn-installed in OVS
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.029 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.047 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b3:f6 10.100.0.4'], port_security=['fa:16:3e:b4:b3:f6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '30452704-b180-41c6-98c4-8b168b3bc5e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=3b31ee50-1828-4f0b-b32d-f77ee76a8c63) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.046 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.049 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.051 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.069 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6f6b02b3-9050-42a4-b052-33b22a906aec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:19 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Jan 21 23:59:19 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004c.scope: Consumed 13.580s CPU time.
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.106 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fbf710-2af4-489b-985c-664694398a2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:19 compute-0 podman[222634]: 2026-01-21 23:59:19.108864728 +0000 UTC m=+0.067919895 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:59:19 compute-0 systemd-machined[154182]: Machine qemu-37-instance-0000004c terminated.
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.110 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[5077907b-a9ad-45a3-9032-7ec92b00190c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.139 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[2c6b5630-ddf1-4a62-841c-52d73d38b88d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.157 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[15f1d2a9-264e-463b-8c96-c1f1af4feec3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439412, 'reachable_time': 21509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222661, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.177 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[65dca5f1-d0da-4ba5-9659-14e9234f9358]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439426, 'tstamp': 439426}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222662, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439429, 'tstamp': 439429}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222662, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.180 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.181 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.187 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.188 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.188 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.189 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:19 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:19.189 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.266 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.272 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.583 182939 DEBUG nova.compute.manager [req-1f0744ec-2147-40d7-a8fb-95dfc0d0e229 req-2a7cb266-b845-49ec-8fb3-08d1fec31cc9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-unplugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.584 182939 DEBUG oslo_concurrency.lockutils [req-1f0744ec-2147-40d7-a8fb-95dfc0d0e229 req-2a7cb266-b845-49ec-8fb3-08d1fec31cc9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.584 182939 DEBUG oslo_concurrency.lockutils [req-1f0744ec-2147-40d7-a8fb-95dfc0d0e229 req-2a7cb266-b845-49ec-8fb3-08d1fec31cc9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.584 182939 DEBUG oslo_concurrency.lockutils [req-1f0744ec-2147-40d7-a8fb-95dfc0d0e229 req-2a7cb266-b845-49ec-8fb3-08d1fec31cc9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.584 182939 DEBUG nova.compute.manager [req-1f0744ec-2147-40d7-a8fb-95dfc0d0e229 req-2a7cb266-b845-49ec-8fb3-08d1fec31cc9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] No waiting events found dispatching network-vif-unplugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.585 182939 WARNING nova.compute.manager [req-1f0744ec-2147-40d7-a8fb-95dfc0d0e229 req-2a7cb266-b845-49ec-8fb3-08d1fec31cc9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received unexpected event network-vif-unplugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for instance with vm_state active and task_state rescuing.
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.891 182939 INFO nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Instance shutdown successfully after 4 seconds.
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.898 182939 INFO nova.virt.libvirt.driver [-] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Instance destroyed successfully.
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.899 182939 DEBUG nova.objects.instance [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'numa_topology' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:19 compute-0 nova_compute[182935]: 2026-01-21 23:59:19.922 182939 INFO nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Attempting a stable device rescue
Jan 21 23:59:20 compute-0 nova_compute[182935]: 2026-01-21 23:59:20.215 182939 DEBUG nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Jan 21 23:59:20 compute-0 nova_compute[182935]: 2026-01-21 23:59:20.222 182939 DEBUG nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 21 23:59:20 compute-0 nova_compute[182935]: 2026-01-21 23:59:20.223 182939 INFO nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Creating image(s)
Jan 21 23:59:20 compute-0 nova_compute[182935]: 2026-01-21 23:59:20.225 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:20 compute-0 nova_compute[182935]: 2026-01-21 23:59:20.225 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:20 compute-0 nova_compute[182935]: 2026-01-21 23:59:20.227 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:20 compute-0 nova_compute[182935]: 2026-01-21 23:59:20.227 182939 DEBUG nova.objects.instance [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:20 compute-0 nova_compute[182935]: 2026-01-21 23:59:20.247 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "7f2508ebfd258f79131ef449d573cad936dd91f6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:20 compute-0 nova_compute[182935]: 2026-01-21 23:59:20.248 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "7f2508ebfd258f79131ef449d573cad936dd91f6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.325 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.625 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.677 182939 DEBUG nova.compute.manager [req-4dc203c3-8c7f-4de3-bb6e-f19fa20d4b4c req-cd98c31d-9a78-45cd-8e86-32a07ed315eb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.678 182939 DEBUG oslo_concurrency.lockutils [req-4dc203c3-8c7f-4de3-bb6e-f19fa20d4b4c req-cd98c31d-9a78-45cd-8e86-32a07ed315eb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.678 182939 DEBUG oslo_concurrency.lockutils [req-4dc203c3-8c7f-4de3-bb6e-f19fa20d4b4c req-cd98c31d-9a78-45cd-8e86-32a07ed315eb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.679 182939 DEBUG oslo_concurrency.lockutils [req-4dc203c3-8c7f-4de3-bb6e-f19fa20d4b4c req-cd98c31d-9a78-45cd-8e86-32a07ed315eb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.679 182939 DEBUG nova.compute.manager [req-4dc203c3-8c7f-4de3-bb6e-f19fa20d4b4c req-cd98c31d-9a78-45cd-8e86-32a07ed315eb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] No waiting events found dispatching network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.679 182939 WARNING nova.compute.manager [req-4dc203c3-8c7f-4de3-bb6e-f19fa20d4b4c req-cd98c31d-9a78-45cd-8e86-32a07ed315eb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received unexpected event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for instance with vm_state active and task_state rescuing.
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.716 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6.part --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.717 182939 DEBUG nova.virt.images [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] 9893cdb7-a220-45d7-b68c-822cbc41e888 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.718 182939 DEBUG nova.privsep.utils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.719 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6.part /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.901 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6.part /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6.converted" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.907 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.988 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6.converted --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:21 compute-0 nova_compute[182935]: 2026-01-21 23:59:21.991 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "7f2508ebfd258f79131ef449d573cad936dd91f6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.022 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "7f2508ebfd258f79131ef449d573cad936dd91f6" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.024 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "7f2508ebfd258f79131ef449d573cad936dd91f6" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.051 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.076 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.115 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.116 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6,backing_fmt=raw /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.162 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6,backing_fmt=raw /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.rescue" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.164 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "7f2508ebfd258f79131ef449d573cad936dd91f6" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.165 182939 DEBUG nova.objects.instance [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'migration_context' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.187 182939 DEBUG nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.193 182939 DEBUG nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Start _get_guest_xml network_info=[{"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "vif_mac": "fa:16:3e:b4:b3:f6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '9893cdb7-a220-45d7-b68c-822cbc41e888', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.194 182939 DEBUG nova.objects.instance [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'resources' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.222 182939 WARNING nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.231 182939 DEBUG nova.virt.libvirt.host [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.233 182939 DEBUG nova.virt.libvirt.host [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.238 182939 DEBUG nova.virt.libvirt.host [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.239 182939 DEBUG nova.virt.libvirt.host [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.242 182939 DEBUG nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.242 182939 DEBUG nova.virt.hardware [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.244 182939 DEBUG nova.virt.hardware [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.244 182939 DEBUG nova.virt.hardware [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.245 182939 DEBUG nova.virt.hardware [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.245 182939 DEBUG nova.virt.hardware [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.246 182939 DEBUG nova.virt.hardware [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.246 182939 DEBUG nova.virt.hardware [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.247 182939 DEBUG nova.virt.hardware [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.247 182939 DEBUG nova.virt.hardware [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.247 182939 DEBUG nova.virt.hardware [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.248 182939 DEBUG nova.virt.hardware [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.249 182939 DEBUG nova.objects.instance [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.275 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.340 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.config --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.342 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.343 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.345 182939 DEBUG oslo_concurrency.lockutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.348 182939 DEBUG nova.virt.libvirt.vif [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:58:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-878012212',display_name='tempest-ServerStableDeviceRescueTest-server-878012212',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-878012212',id=76,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-zglabz6y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_user_name='tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:10Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=30452704-b180-41c6-98c4-8b168b3bc5e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "vif_mac": "fa:16:3e:b4:b3:f6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.349 182939 DEBUG nova.network.os_vif_util [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "vif_mac": "fa:16:3e:b4:b3:f6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.352 182939 DEBUG nova.network.os_vif_util [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=3b31ee50-1828-4f0b-b32d-f77ee76a8c63,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b31ee50-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.354 182939 DEBUG nova.objects.instance [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'pci_devices' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.372 182939 DEBUG nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:59:22 compute-0 nova_compute[182935]:   <uuid>30452704-b180-41c6-98c4-8b168b3bc5e9</uuid>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   <name>instance-0000004c</name>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-878012212</nova:name>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:59:22</nova:creationTime>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:59:22 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:59:22 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:59:22 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:59:22 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:59:22 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:59:22 compute-0 nova_compute[182935]:         <nova:user uuid="55710edfd4b24e368807c8b5087ec91c">tempest-ServerStableDeviceRescueTest-1256721315-project-member</nova:user>
Jan 21 23:59:22 compute-0 nova_compute[182935]:         <nova:project uuid="011e84f966444a668bd6c0f5674f551f">tempest-ServerStableDeviceRescueTest-1256721315</nova:project>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:59:22 compute-0 nova_compute[182935]:         <nova:port uuid="3b31ee50-1828-4f0b-b32d-f77ee76a8c63">
Jan 21 23:59:22 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <system>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <entry name="serial">30452704-b180-41c6-98c4-8b168b3bc5e9</entry>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <entry name="uuid">30452704-b180-41c6-98c4-8b168b3bc5e9</entry>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     </system>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   <os>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   </os>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   <features>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   </features>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.config"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.rescue"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <target dev="vdb" bus="virtio"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <boot order="1"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:b4:b3:f6"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <target dev="tap3b31ee50-18"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/console.log" append="off"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <video>
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     </video>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:59:22 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:59:22 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:59:22 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:59:22 compute-0 nova_compute[182935]: </domain>
Jan 21 23:59:22 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.383 182939 INFO nova.virt.libvirt.driver [-] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Instance destroyed successfully.
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.447 182939 DEBUG nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.448 182939 DEBUG nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.449 182939 DEBUG nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.449 182939 DEBUG nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] No VIF found with MAC fa:16:3e:b4:b3:f6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.450 182939 INFO nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Using config drive
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.472 182939 DEBUG nova.objects.instance [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'ec2_ids' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:22 compute-0 nova_compute[182935]: 2026-01-21 23:59:22.518 182939 DEBUG nova.objects.instance [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'keypairs' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.519 182939 INFO nova.virt.libvirt.driver [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Creating config drive at /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.config.rescue
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.531 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjswn9nav execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.672 182939 DEBUG oslo_concurrency.processutils [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjswn9nav" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:23 compute-0 kernel: tap3b31ee50-18: entered promiscuous mode
Jan 21 23:59:23 compute-0 NetworkManager[55139]: <info>  [1769039963.7585] manager: (tap3b31ee50-18): new Tun device (/org/freedesktop/NetworkManager/Devices/122)
Jan 21 23:59:23 compute-0 ovn_controller[95047]: 2026-01-21T23:59:23Z|00269|binding|INFO|Claiming lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for this chassis.
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.759 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:23 compute-0 ovn_controller[95047]: 2026-01-21T23:59:23Z|00270|binding|INFO|3b31ee50-1828-4f0b-b32d-f77ee76a8c63: Claiming fa:16:3e:b4:b3:f6 10.100.0.4
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.769 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b3:f6 10.100.0.4'], port_security=['fa:16:3e:b4:b3:f6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '30452704-b180-41c6-98c4-8b168b3bc5e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=3b31ee50-1828-4f0b-b32d-f77ee76a8c63) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.770 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a bound to our chassis
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.772 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:59:23 compute-0 ovn_controller[95047]: 2026-01-21T23:59:23Z|00271|binding|INFO|Setting lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 ovn-installed in OVS
Jan 21 23:59:23 compute-0 ovn_controller[95047]: 2026-01-21T23:59:23Z|00272|binding|INFO|Setting lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 up in Southbound
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.776 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.780 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:23 compute-0 systemd-udevd[222720]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.792 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c80e191b-4a34-4e75-9950-c648894b7446]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:23 compute-0 NetworkManager[55139]: <info>  [1769039963.8052] device (tap3b31ee50-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:59:23 compute-0 NetworkManager[55139]: <info>  [1769039963.8061] device (tap3b31ee50-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:59:23 compute-0 systemd-machined[154182]: New machine qemu-38-instance-0000004c.
Jan 21 23:59:23 compute-0 systemd[1]: Started Virtual Machine qemu-38-instance-0000004c.
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.831 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d4a3c1-cdc2-4411-8ec1-5f8cd0b8a748]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.838 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f435c41b-57c3-4312-881f-df96612b23c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.892 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a93c0b-e73b-4bd2-85ac-91e03b27b409]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.914 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[639a2679-ea42-494b-93aa-cb5842bcee6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439412, 'reachable_time': 21509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222734, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.937 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[84993357-66d5-44e6-bf42-1fa38e59b800]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439426, 'tstamp': 439426}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222736, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439429, 'tstamp': 439429}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222736, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.940 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.941 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.943 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.943 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.943 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:23 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:23.943 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.990 182939 DEBUG nova.compute.manager [req-82993857-a78d-4165-80cc-af71a6efe117 req-7d7af43b-2859-43de-ae8f-a1b9dd634e55 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.990 182939 DEBUG oslo_concurrency.lockutils [req-82993857-a78d-4165-80cc-af71a6efe117 req-7d7af43b-2859-43de-ae8f-a1b9dd634e55 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.991 182939 DEBUG oslo_concurrency.lockutils [req-82993857-a78d-4165-80cc-af71a6efe117 req-7d7af43b-2859-43de-ae8f-a1b9dd634e55 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.991 182939 DEBUG oslo_concurrency.lockutils [req-82993857-a78d-4165-80cc-af71a6efe117 req-7d7af43b-2859-43de-ae8f-a1b9dd634e55 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.991 182939 DEBUG nova.compute.manager [req-82993857-a78d-4165-80cc-af71a6efe117 req-7d7af43b-2859-43de-ae8f-a1b9dd634e55 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] No waiting events found dispatching network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:23 compute-0 nova_compute[182935]: 2026-01-21 23:59:23.992 182939 WARNING nova.compute.manager [req-82993857-a78d-4165-80cc-af71a6efe117 req-7d7af43b-2859-43de-ae8f-a1b9dd634e55 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received unexpected event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for instance with vm_state active and task_state rescuing.
Jan 21 23:59:24 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:24.773 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:25 compute-0 nova_compute[182935]: 2026-01-21 23:59:25.570 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for 30452704-b180-41c6-98c4-8b168b3bc5e9 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:59:25 compute-0 nova_compute[182935]: 2026-01-21 23:59:25.571 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039965.5692852, 30452704-b180-41c6-98c4-8b168b3bc5e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:25 compute-0 nova_compute[182935]: 2026-01-21 23:59:25.572 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] VM Resumed (Lifecycle Event)
Jan 21 23:59:25 compute-0 nova_compute[182935]: 2026-01-21 23:59:25.588 182939 DEBUG nova.compute.manager [None req-4a77ec04-a81a-4dda-8ae3-d87d77ad3a0e 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:25 compute-0 nova_compute[182935]: 2026-01-21 23:59:25.597 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:25 compute-0 nova_compute[182935]: 2026-01-21 23:59:25.599 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:59:25 compute-0 nova_compute[182935]: 2026-01-21 23:59:25.658 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 21 23:59:25 compute-0 nova_compute[182935]: 2026-01-21 23:59:25.659 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039965.574941, 30452704-b180-41c6-98c4-8b168b3bc5e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:25 compute-0 nova_compute[182935]: 2026-01-21 23:59:25.659 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] VM Started (Lifecycle Event)
Jan 21 23:59:25 compute-0 nova_compute[182935]: 2026-01-21 23:59:25.692 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:25 compute-0 nova_compute[182935]: 2026-01-21 23:59:25.698 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.123 182939 DEBUG nova.compute.manager [req-49aebb5a-5d7d-4a46-9e5f-00d12b9b5271 req-4d26da36-dfc0-4473-aeae-2bc41b042dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.123 182939 DEBUG oslo_concurrency.lockutils [req-49aebb5a-5d7d-4a46-9e5f-00d12b9b5271 req-4d26da36-dfc0-4473-aeae-2bc41b042dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.123 182939 DEBUG oslo_concurrency.lockutils [req-49aebb5a-5d7d-4a46-9e5f-00d12b9b5271 req-4d26da36-dfc0-4473-aeae-2bc41b042dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.124 182939 DEBUG oslo_concurrency.lockutils [req-49aebb5a-5d7d-4a46-9e5f-00d12b9b5271 req-4d26da36-dfc0-4473-aeae-2bc41b042dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.124 182939 DEBUG nova.compute.manager [req-49aebb5a-5d7d-4a46-9e5f-00d12b9b5271 req-4d26da36-dfc0-4473-aeae-2bc41b042dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] No waiting events found dispatching network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.124 182939 WARNING nova.compute.manager [req-49aebb5a-5d7d-4a46-9e5f-00d12b9b5271 req-4d26da36-dfc0-4473-aeae-2bc41b042dda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received unexpected event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for instance with vm_state rescued and task_state None.
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.268 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:26 compute-0 podman[222758]: 2026-01-21 23:59:26.708536453 +0000 UTC m=+0.071985144 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:59:26 compute-0 podman[222757]: 2026-01-21 23:59:26.712985129 +0000 UTC m=+0.076280066 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.931 182939 INFO nova.compute.manager [None req-cc2b7433-db0e-475f-8bd4-1b1a851e73f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Unrescuing
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.932 182939 DEBUG oslo_concurrency.lockutils [None req-cc2b7433-db0e-475f-8bd4-1b1a851e73f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.932 182939 DEBUG oslo_concurrency.lockutils [None req-cc2b7433-db0e-475f-8bd4-1b1a851e73f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquired lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.932 182939 DEBUG nova.network.neutron [None req-cc2b7433-db0e-475f-8bd4-1b1a851e73f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.952 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.952 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.952 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 21 23:59:26 compute-0 nova_compute[182935]: 2026-01-21 23:59:26.952 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:27 compute-0 nova_compute[182935]: 2026-01-21 23:59:27.079 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.224 182939 DEBUG nova.network.neutron [None req-cc2b7433-db0e-475f-8bd4-1b1a851e73f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Updating instance_info_cache with network_info: [{"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.250 182939 DEBUG oslo_concurrency.lockutils [None req-cc2b7433-db0e-475f-8bd4-1b1a851e73f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Releasing lock "refresh_cache-30452704-b180-41c6-98c4-8b168b3bc5e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.251 182939 DEBUG nova.objects.instance [None req-cc2b7433-db0e-475f-8bd4-1b1a851e73f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'flavor' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:28 compute-0 kernel: tap3b31ee50-18 (unregistering): left promiscuous mode
Jan 21 23:59:28 compute-0 NetworkManager[55139]: <info>  [1769039968.3137] device (tap3b31ee50-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:59:28 compute-0 ovn_controller[95047]: 2026-01-21T23:59:28Z|00273|binding|INFO|Releasing lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 from this chassis (sb_readonly=0)
Jan 21 23:59:28 compute-0 ovn_controller[95047]: 2026-01-21T23:59:28Z|00274|binding|INFO|Setting lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 down in Southbound
Jan 21 23:59:28 compute-0 ovn_controller[95047]: 2026-01-21T23:59:28Z|00275|binding|INFO|Removing iface tap3b31ee50-18 ovn-installed in OVS
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.323 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.332 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b3:f6 10.100.0.4'], port_security=['fa:16:3e:b4:b3:f6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '30452704-b180-41c6-98c4-8b168b3bc5e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=3b31ee50-1828-4f0b-b32d-f77ee76a8c63) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.333 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.335 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.336 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.353 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[738461c4-0e06-45d4-9bf4-a476292dba82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Jan 21 23:59:28 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004c.scope: Consumed 4.446s CPU time.
Jan 21 23:59:28 compute-0 systemd-machined[154182]: Machine qemu-38-instance-0000004c terminated.
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.386 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[e42bfafe-96c8-40d2-8ac9-68b8fb4b089c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.391 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[a823dea7-8e6e-41b3-9fca-0402ba44275e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.429 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c0376d2f-f28d-4428-a236-9d0cf67809eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.447 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[013dc8c4-aad9-4980-b051-680e32fd07d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439412, 'reachable_time': 21509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222811, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.466 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bba3ea33-cd30-4468-8286-b11879a85e67]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439426, 'tstamp': 439426}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222812, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439429, 'tstamp': 439429}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222812, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.467 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.468 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.472 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Updating instance_info_cache with network_info: [{"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.499 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-ada4a724-2307-431d-8c29-075bfd90b43e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.499 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.536 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.536 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.536 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.537 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.554 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.559 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.616 182939 INFO nova.virt.libvirt.driver [-] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Instance destroyed successfully.
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.616 182939 DEBUG nova.objects.instance [None req-cc2b7433-db0e-475f-8bd4-1b1a851e73f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'numa_topology' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:28 compute-0 kernel: tap3b31ee50-18: entered promiscuous mode
Jan 21 23:59:28 compute-0 systemd-udevd[222803]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:59:28 compute-0 NetworkManager[55139]: <info>  [1769039968.7087] manager: (tap3b31ee50-18): new Tun device (/org/freedesktop/NetworkManager/Devices/123)
Jan 21 23:59:28 compute-0 ovn_controller[95047]: 2026-01-21T23:59:28Z|00276|binding|INFO|Claiming lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for this chassis.
Jan 21 23:59:28 compute-0 ovn_controller[95047]: 2026-01-21T23:59:28Z|00277|binding|INFO|3b31ee50-1828-4f0b-b32d-f77ee76a8c63: Claiming fa:16:3e:b4:b3:f6 10.100.0.4
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.709 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 NetworkManager[55139]: <info>  [1769039968.7199] device (tap3b31ee50-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.719 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b3:f6 10.100.0.4'], port_security=['fa:16:3e:b4:b3:f6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '30452704-b180-41c6-98c4-8b168b3bc5e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=3b31ee50-1828-4f0b-b32d-f77ee76a8c63) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:28 compute-0 NetworkManager[55139]: <info>  [1769039968.7207] device (tap3b31ee50-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.720 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a bound to our chassis
Jan 21 23:59:28 compute-0 ovn_controller[95047]: 2026-01-21T23:59:28Z|00278|binding|INFO|Setting lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 ovn-installed in OVS
Jan 21 23:59:28 compute-0 ovn_controller[95047]: 2026-01-21T23:59:28Z|00279|binding|INFO|Setting lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 up in Southbound
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.723 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.723 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.725 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.748 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0b2c43-deb9-4e9b-863d-601e252644bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 systemd-machined[154182]: New machine qemu-39-instance-0000004c.
Jan 21 23:59:28 compute-0 systemd[1]: Started Virtual Machine qemu-39-instance-0000004c.
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.784 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[93839d14-d462-43e5-8ea9-ca73336db353]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.789 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[61fbedbe-85f8-4278-b977-756c95b60e94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.825 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[54fb7504-f1c3-4266-844a-82fddd54a929]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.850 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d022d720-fc4e-4418-b78b-7b0b10937097]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 13, 'rx_bytes': 616, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 13, 'rx_bytes': 616, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439412, 'reachable_time': 21509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222860, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.869 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9275a48b-a9fa-4ac5-afe2-b77bdd31c9bb]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439426, 'tstamp': 439426}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222861, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439429, 'tstamp': 439429}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222861, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.872 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.874 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 nova_compute[182935]: 2026-01-21 23:59:28.875 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.876 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.876 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.877 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:28 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:28.877 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.074 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for 30452704-b180-41c6-98c4-8b168b3bc5e9 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.075 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039969.074303, 30452704-b180-41c6-98c4-8b168b3bc5e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.075 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] VM Resumed (Lifecycle Event)
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.078 182939 DEBUG nova.compute.manager [None req-cc2b7433-db0e-475f-8bd4-1b1a851e73f3 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.117 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.122 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.174 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039969.0773182, 30452704-b180-41c6-98c4-8b168b3bc5e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.175 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] VM Started (Lifecycle Event)
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.201 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.207 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.820 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.820 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.820 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.821 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.899 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.973 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:29 compute-0 nova_compute[182935]: 2026-01-21 23:59:29.974 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.034 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.041 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.106 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.107 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.167 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.319 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.321 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5391MB free_disk=73.14523315429688GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.321 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.321 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.385 182939 DEBUG nova.compute.manager [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-unplugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.385 182939 DEBUG oslo_concurrency.lockutils [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.385 182939 DEBUG oslo_concurrency.lockutils [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.385 182939 DEBUG oslo_concurrency.lockutils [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.385 182939 DEBUG nova.compute.manager [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] No waiting events found dispatching network-vif-unplugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.386 182939 WARNING nova.compute.manager [req-d635bceb-9775-411e-a5cf-85f9b45500c5 req-3364a90d-1244-498a-b9c5-29d04f036166 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received unexpected event network-vif-unplugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for instance with vm_state active and task_state None.
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.423 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance ada4a724-2307-431d-8c29-075bfd90b43e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.424 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 30452704-b180-41c6-98c4-8b168b3bc5e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.424 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.424 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.444 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.512 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.513 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.532 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.554 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.615 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.631 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.657 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 21 23:59:30 compute-0 nova_compute[182935]: 2026-01-21 23:59:30.657 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:31 compute-0 nova_compute[182935]: 2026-01-21 23:59:31.269 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:32 compute-0 nova_compute[182935]: 2026-01-21 23:59:32.113 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:32 compute-0 nova_compute[182935]: 2026-01-21 23:59:32.654 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:32 compute-0 nova_compute[182935]: 2026-01-21 23:59:32.674 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:32 compute-0 nova_compute[182935]: 2026-01-21 23:59:32.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:33 compute-0 nova_compute[182935]: 2026-01-21 23:59:33.551 182939 DEBUG nova.compute.manager [req-dc6373ee-c08d-4ad8-a0be-10314e0c7c46 req-c96a6e13-02ee-47a6-abcf-3ff8345fe3b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:33 compute-0 nova_compute[182935]: 2026-01-21 23:59:33.551 182939 DEBUG oslo_concurrency.lockutils [req-dc6373ee-c08d-4ad8-a0be-10314e0c7c46 req-c96a6e13-02ee-47a6-abcf-3ff8345fe3b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:33 compute-0 nova_compute[182935]: 2026-01-21 23:59:33.552 182939 DEBUG oslo_concurrency.lockutils [req-dc6373ee-c08d-4ad8-a0be-10314e0c7c46 req-c96a6e13-02ee-47a6-abcf-3ff8345fe3b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:33 compute-0 nova_compute[182935]: 2026-01-21 23:59:33.552 182939 DEBUG oslo_concurrency.lockutils [req-dc6373ee-c08d-4ad8-a0be-10314e0c7c46 req-c96a6e13-02ee-47a6-abcf-3ff8345fe3b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:33 compute-0 nova_compute[182935]: 2026-01-21 23:59:33.552 182939 DEBUG nova.compute.manager [req-dc6373ee-c08d-4ad8-a0be-10314e0c7c46 req-c96a6e13-02ee-47a6-abcf-3ff8345fe3b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] No waiting events found dispatching network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:33 compute-0 nova_compute[182935]: 2026-01-21 23:59:33.552 182939 WARNING nova.compute.manager [req-dc6373ee-c08d-4ad8-a0be-10314e0c7c46 req-c96a6e13-02ee-47a6-abcf-3ff8345fe3b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received unexpected event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for instance with vm_state active and task_state None.
Jan 21 23:59:34 compute-0 nova_compute[182935]: 2026-01-21 23:59:34.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:35 compute-0 nova_compute[182935]: 2026-01-21 23:59:35.814 182939 DEBUG nova.compute.manager [req-d64345db-09ba-4b65-9e7c-7f72a18b59de req-98f45e40-7c0d-42d7-a791-65d1271a6f74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:35 compute-0 nova_compute[182935]: 2026-01-21 23:59:35.814 182939 DEBUG oslo_concurrency.lockutils [req-d64345db-09ba-4b65-9e7c-7f72a18b59de req-98f45e40-7c0d-42d7-a791-65d1271a6f74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:35 compute-0 nova_compute[182935]: 2026-01-21 23:59:35.815 182939 DEBUG oslo_concurrency.lockutils [req-d64345db-09ba-4b65-9e7c-7f72a18b59de req-98f45e40-7c0d-42d7-a791-65d1271a6f74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:35 compute-0 nova_compute[182935]: 2026-01-21 23:59:35.815 182939 DEBUG oslo_concurrency.lockutils [req-d64345db-09ba-4b65-9e7c-7f72a18b59de req-98f45e40-7c0d-42d7-a791-65d1271a6f74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:35 compute-0 nova_compute[182935]: 2026-01-21 23:59:35.815 182939 DEBUG nova.compute.manager [req-d64345db-09ba-4b65-9e7c-7f72a18b59de req-98f45e40-7c0d-42d7-a791-65d1271a6f74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] No waiting events found dispatching network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:35 compute-0 nova_compute[182935]: 2026-01-21 23:59:35.815 182939 WARNING nova.compute.manager [req-d64345db-09ba-4b65-9e7c-7f72a18b59de req-98f45e40-7c0d-42d7-a791-65d1271a6f74 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received unexpected event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for instance with vm_state active and task_state None.
Jan 21 23:59:36 compute-0 nova_compute[182935]: 2026-01-21 23:59:36.270 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.115 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:37 compute-0 podman[222886]: 2026-01-21 23:59:37.695890127 +0000 UTC m=+0.058792877 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.721 182939 DEBUG oslo_concurrency.lockutils [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.722 182939 DEBUG oslo_concurrency.lockutils [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.722 182939 DEBUG oslo_concurrency.lockutils [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.722 182939 DEBUG oslo_concurrency.lockutils [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.723 182939 DEBUG oslo_concurrency.lockutils [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.735 182939 INFO nova.compute.manager [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Terminating instance
Jan 21 23:59:37 compute-0 podman[222885]: 2026-01-21 23:59:37.738589825 +0000 UTC m=+0.101290939 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.746 182939 DEBUG nova.compute.manager [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:59:37 compute-0 kernel: tap3b31ee50-18 (unregistering): left promiscuous mode
Jan 21 23:59:37 compute-0 NetworkManager[55139]: <info>  [1769039977.7696] device (tap3b31ee50-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:59:37 compute-0 ovn_controller[95047]: 2026-01-21T23:59:37Z|00280|binding|INFO|Releasing lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 from this chassis (sb_readonly=0)
Jan 21 23:59:37 compute-0 ovn_controller[95047]: 2026-01-21T23:59:37Z|00281|binding|INFO|Setting lport 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 down in Southbound
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.773 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:37 compute-0 ovn_controller[95047]: 2026-01-21T23:59:37Z|00282|binding|INFO|Removing iface tap3b31ee50-18 ovn-installed in OVS
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.775 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.789 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.796 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b4:b3:f6 10.100.0.4'], port_security=['fa:16:3e:b4:b3:f6 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '30452704-b180-41c6-98c4-8b168b3bc5e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=3b31ee50-1828-4f0b-b32d-f77ee76a8c63) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.797 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 3b31ee50-1828-4f0b-b32d-f77ee76a8c63 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.799 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 58cd83db-dcb3-409c-a108-07601ce5f67a
Jan 21 23:59:37 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Jan 21 23:59:37 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000004c.scope: Consumed 9.146s CPU time.
Jan 21 23:59:37 compute-0 systemd-machined[154182]: Machine qemu-39-instance-0000004c terminated.
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.820 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[861b4396-f532-42c4-838d-2a39c70131ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.854 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[289f297e-e7ca-418b-9a9f-885088c78176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.858 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[88f0ee96-ab96-4801-a81f-a763e0ada748]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.902 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[0bbe3b14-7a07-4e2d-a671-c5928edfbfda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.920 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d153604b-013c-4ea5-aefa-098c9c93c406]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap58cd83db-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:9a:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 15, 'rx_bytes': 616, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 15, 'rx_bytes': 616, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439412, 'reachable_time': 21509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222950, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.942 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[565f1799-fc2c-4e7a-b052-480818f7b764]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439426, 'tstamp': 439426}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222951, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap58cd83db-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439429, 'tstamp': 439429}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222951, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.944 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.947 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:37 compute-0 nova_compute[182935]: 2026-01-21 23:59:37.952 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.952 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58cd83db-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.953 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.953 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap58cd83db-d0, col_values=(('external_ids', {'iface-id': '2d113249-07d3-443f-9b57-5f5a422d1c98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:37 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:37.954 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.007 182939 INFO nova.virt.libvirt.driver [-] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Instance destroyed successfully.
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.007 182939 DEBUG nova.objects.instance [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'resources' on Instance uuid 30452704-b180-41c6-98c4-8b168b3bc5e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.023 182939 DEBUG nova.virt.libvirt.vif [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:58:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-878012212',display_name='tempest-ServerStableDeviceRescueTest-server-878012212',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-878012212',id=76,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-zglabz6y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_user_name='tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:59:29Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=30452704-b180-41c6-98c4-8b168b3bc5e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.023 182939 DEBUG nova.network.os_vif_util [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "address": "fa:16:3e:b4:b3:f6", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b31ee50-18", "ovs_interfaceid": "3b31ee50-1828-4f0b-b32d-f77ee76a8c63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.024 182939 DEBUG nova.network.os_vif_util [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=3b31ee50-1828-4f0b-b32d-f77ee76a8c63,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b31ee50-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.025 182939 DEBUG os_vif [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=3b31ee50-1828-4f0b-b32d-f77ee76a8c63,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b31ee50-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.027 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.028 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b31ee50-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.030 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.032 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.036 182939 INFO os_vif [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=3b31ee50-1828-4f0b-b32d-f77ee76a8c63,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b31ee50-18')
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.037 182939 INFO nova.virt.libvirt.driver [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Deleting instance files /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9_del
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.038 182939 INFO nova.virt.libvirt.driver [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Deletion of /var/lib/nova/instances/30452704-b180-41c6-98c4-8b168b3bc5e9_del complete
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.132 182939 INFO nova.compute.manager [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.132 182939 DEBUG oslo.service.loopingcall [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.132 182939 DEBUG nova.compute.manager [-] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.133 182939 DEBUG nova.network.neutron [-] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.324 182939 DEBUG nova.compute.manager [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.325 182939 DEBUG oslo_concurrency.lockutils [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.325 182939 DEBUG oslo_concurrency.lockutils [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.325 182939 DEBUG oslo_concurrency.lockutils [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.325 182939 DEBUG nova.compute.manager [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] No waiting events found dispatching network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.326 182939 WARNING nova.compute.manager [req-be79bd0e-f810-4c47-905a-dc41d6b52510 req-8ea40329-9799-4783-ac02-8a1da3264d8b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received unexpected event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for instance with vm_state active and task_state deleting.
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.863 182939 DEBUG nova.network.neutron [-] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.889 182939 INFO nova.compute.manager [-] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Took 0.76 seconds to deallocate network for instance.
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.965 182939 DEBUG oslo_concurrency.lockutils [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:38 compute-0 nova_compute[182935]: 2026-01-21 23:59:38.965 182939 DEBUG oslo_concurrency.lockutils [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:39 compute-0 nova_compute[182935]: 2026-01-21 23:59:39.055 182939 DEBUG nova.compute.provider_tree [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:59:39 compute-0 nova_compute[182935]: 2026-01-21 23:59:39.071 182939 DEBUG nova.scheduler.client.report [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:59:39 compute-0 nova_compute[182935]: 2026-01-21 23:59:39.096 182939 DEBUG oslo_concurrency.lockutils [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:39 compute-0 nova_compute[182935]: 2026-01-21 23:59:39.141 182939 INFO nova.scheduler.client.report [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Deleted allocations for instance 30452704-b180-41c6-98c4-8b168b3bc5e9
Jan 21 23:59:39 compute-0 nova_compute[182935]: 2026-01-21 23:59:39.236 182939 DEBUG oslo_concurrency.lockutils [None req-eb73152e-07e0-45d5-873a-cc02e1bc7322 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.514s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.242 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.243 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.262 182939 DEBUG nova.compute.manager [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.368 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.369 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.375 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.375 182939 INFO nova.compute.claims [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Claim successful on node compute-0.ctlplane.example.com
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.426 182939 DEBUG nova.compute.manager [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-unplugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.426 182939 DEBUG oslo_concurrency.lockutils [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.426 182939 DEBUG oslo_concurrency.lockutils [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.427 182939 DEBUG oslo_concurrency.lockutils [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.427 182939 DEBUG nova.compute.manager [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] No waiting events found dispatching network-vif-unplugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.427 182939 WARNING nova.compute.manager [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received unexpected event network-vif-unplugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for instance with vm_state deleted and task_state None.
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.427 182939 DEBUG nova.compute.manager [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.427 182939 DEBUG oslo_concurrency.lockutils [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.427 182939 DEBUG oslo_concurrency.lockutils [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.428 182939 DEBUG oslo_concurrency.lockutils [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30452704-b180-41c6-98c4-8b168b3bc5e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.428 182939 DEBUG nova.compute.manager [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] No waiting events found dispatching network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.428 182939 WARNING nova.compute.manager [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received unexpected event network-vif-plugged-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 for instance with vm_state deleted and task_state None.
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.428 182939 DEBUG nova.compute.manager [req-8f19108d-98ed-4779-ae41-db34fc435c52 req-fb9343d2-988f-4353-8ab7-11f9f8a2d95d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Received event network-vif-deleted-3b31ee50-1828-4f0b-b32d-f77ee76a8c63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.571 182939 DEBUG nova.compute.provider_tree [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.588 182939 DEBUG nova.scheduler.client.report [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.620 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.621 182939 DEBUG nova.compute.manager [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.693 182939 DEBUG nova.compute.manager [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.694 182939 DEBUG nova.network.neutron [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.716 182939 INFO nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.750 182939 DEBUG nova.compute.manager [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.924 182939 DEBUG nova.compute.manager [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.925 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.925 182939 INFO nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Creating image(s)
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.926 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.926 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.927 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.939 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:40 compute-0 nova_compute[182935]: 2026-01-21 23:59:40.992 182939 DEBUG nova.policy [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e78a70a1d284a9d932d4a53b872df39', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.000 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.001 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.002 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.012 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.078 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.079 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.117 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.118 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.119 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.178 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.179 182939 DEBUG nova.virt.disk.api [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Checking if we can resize image /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.181 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.249 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.250 182939 DEBUG nova.virt.disk.api [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Cannot resize image /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.250 182939 DEBUG nova.objects.instance [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'migration_context' on Instance uuid 91ae2c4a-ab10-4954-a382-a87fa6f89551 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.266 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.267 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Ensure instance console log exists: /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.267 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.267 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.268 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.272 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:41 compute-0 nova_compute[182935]: 2026-01-21 23:59:41.607 182939 DEBUG nova.network.neutron [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Successfully created port: c05b5089-7fab-41da-bf56-5cf234379b1d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 21 23:59:42 compute-0 nova_compute[182935]: 2026-01-21 23:59:42.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 21 23:59:43 compute-0 nova_compute[182935]: 2026-01-21 23:59:43.029 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:43 compute-0 nova_compute[182935]: 2026-01-21 23:59:43.165 182939 DEBUG nova.network.neutron [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Successfully updated port: c05b5089-7fab-41da-bf56-5cf234379b1d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 21 23:59:43 compute-0 nova_compute[182935]: 2026-01-21 23:59:43.184 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-91ae2c4a-ab10-4954-a382-a87fa6f89551" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:43 compute-0 nova_compute[182935]: 2026-01-21 23:59:43.184 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-91ae2c4a-ab10-4954-a382-a87fa6f89551" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:43 compute-0 nova_compute[182935]: 2026-01-21 23:59:43.185 182939 DEBUG nova.network.neutron [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 21 23:59:43 compute-0 nova_compute[182935]: 2026-01-21 23:59:43.261 182939 DEBUG nova.compute.manager [req-05343082-41f3-48f7-9541-4607b364e9a9 req-9237532e-cdb5-4197-8432-7a721d9de947 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received event network-changed-c05b5089-7fab-41da-bf56-5cf234379b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:43 compute-0 nova_compute[182935]: 2026-01-21 23:59:43.262 182939 DEBUG nova.compute.manager [req-05343082-41f3-48f7-9541-4607b364e9a9 req-9237532e-cdb5-4197-8432-7a721d9de947 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Refreshing instance network info cache due to event network-changed-c05b5089-7fab-41da-bf56-5cf234379b1d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 21 23:59:43 compute-0 nova_compute[182935]: 2026-01-21 23:59:43.262 182939 DEBUG oslo_concurrency.lockutils [req-05343082-41f3-48f7-9541-4607b364e9a9 req-9237532e-cdb5-4197-8432-7a721d9de947 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-91ae2c4a-ab10-4954-a382-a87fa6f89551" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 21 23:59:43 compute-0 nova_compute[182935]: 2026-01-21 23:59:43.386 182939 DEBUG nova.network.neutron [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.052 182939 DEBUG nova.network.neutron [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Updating instance_info_cache with network_info: [{"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.083 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-91ae2c4a-ab10-4954-a382-a87fa6f89551" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.083 182939 DEBUG nova.compute.manager [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Instance network_info: |[{"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.084 182939 DEBUG oslo_concurrency.lockutils [req-05343082-41f3-48f7-9541-4607b364e9a9 req-9237532e-cdb5-4197-8432-7a721d9de947 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-91ae2c4a-ab10-4954-a382-a87fa6f89551" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.084 182939 DEBUG nova.network.neutron [req-05343082-41f3-48f7-9541-4607b364e9a9 req-9237532e-cdb5-4197-8432-7a721d9de947 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Refreshing network info cache for port c05b5089-7fab-41da-bf56-5cf234379b1d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.087 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Start _get_guest_xml network_info=[{"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.091 182939 WARNING nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.097 182939 DEBUG nova.virt.libvirt.host [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.097 182939 DEBUG nova.virt.libvirt.host [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.103 182939 DEBUG nova.virt.libvirt.host [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.103 182939 DEBUG nova.virt.libvirt.host [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.104 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.105 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.105 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.105 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.106 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.106 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.106 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.106 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.107 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.107 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.107 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.107 182939 DEBUG nova.virt.hardware [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.111 182939 DEBUG nova.virt.libvirt.vif [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-429300056',display_name='tempest-tempest.common.compute-instance-429300056',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-429300056',id=79,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-0d80tr00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJ
SON-78742637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:40Z,user_data=None,user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=91ae2c4a-ab10-4954-a382-a87fa6f89551,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.112 182939 DEBUG nova.network.os_vif_util [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.113 182939 DEBUG nova.network.os_vif_util [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.114 182939 DEBUG nova.objects.instance [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 91ae2c4a-ab10-4954-a382-a87fa6f89551 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.131 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] End _get_guest_xml xml=<domain type="kvm">
Jan 21 23:59:45 compute-0 nova_compute[182935]:   <uuid>91ae2c4a-ab10-4954-a382-a87fa6f89551</uuid>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   <name>instance-0000004f</name>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   <metadata>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <nova:name>tempest-tempest.common.compute-instance-429300056</nova:name>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-21 23:59:45</nova:creationTime>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 21 23:59:45 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 21 23:59:45 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 21 23:59:45 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 21 23:59:45 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 21 23:59:45 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <nova:owner>
Jan 21 23:59:45 compute-0 nova_compute[182935]:         <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 21 23:59:45 compute-0 nova_compute[182935]:         <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       </nova:owner>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <nova:ports>
Jan 21 23:59:45 compute-0 nova_compute[182935]:         <nova:port uuid="c05b5089-7fab-41da-bf56-5cf234379b1d">
Jan 21 23:59:45 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:         </nova:port>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       </nova:ports>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     </nova:instance>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   </metadata>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <system>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <entry name="serial">91ae2c4a-ab10-4954-a382-a87fa6f89551</entry>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <entry name="uuid">91ae2c4a-ab10-4954-a382-a87fa6f89551</entry>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     </system>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   </sysinfo>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   <os>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   </os>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   <features>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <acpi/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <apic/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   </features>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   </clock>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   </cpu>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   <devices>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.config"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     </disk>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:be:5a:da"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <target dev="tapc05b5089-7f"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     </interface>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/console.log" append="off"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     </serial>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <video>
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     </video>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     </rng>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 21 23:59:45 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 21 23:59:45 compute-0 nova_compute[182935]:     </memballoon>
Jan 21 23:59:45 compute-0 nova_compute[182935]:   </devices>
Jan 21 23:59:45 compute-0 nova_compute[182935]: </domain>
Jan 21 23:59:45 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.133 182939 DEBUG nova.compute.manager [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Preparing to wait for external event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.134 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.134 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.134 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.135 182939 DEBUG nova.virt.libvirt.vif [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-21T23:59:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-429300056',display_name='tempest-tempest.common.compute-instance-429300056',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-429300056',id=79,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-0d80tr00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:40Z,user_data=None,user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=91ae2c4a-ab10-4954-a382-a87fa6f89551,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.135 182939 DEBUG nova.network.os_vif_util [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.136 182939 DEBUG nova.network.os_vif_util [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.137 182939 DEBUG os_vif [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.138 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.138 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.139 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.143 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.144 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc05b5089-7f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.145 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc05b5089-7f, col_values=(('external_ids', {'iface-id': 'c05b5089-7fab-41da-bf56-5cf234379b1d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:5a:da', 'vm-uuid': '91ae2c4a-ab10-4954-a382-a87fa6f89551'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.147 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 NetworkManager[55139]: <info>  [1769039985.1485] manager: (tapc05b5089-7f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.151 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.152 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.153 182939 INFO os_vif [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f')
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.230 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.230 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.231 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No VIF found with MAC fa:16:3e:be:5a:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.231 182939 INFO nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Using config drive
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.246 182939 DEBUG oslo_concurrency.lockutils [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.246 182939 DEBUG oslo_concurrency.lockutils [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.246 182939 DEBUG oslo_concurrency.lockutils [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.246 182939 DEBUG oslo_concurrency.lockutils [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.247 182939 DEBUG oslo_concurrency.lockutils [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.257 182939 INFO nova.compute.manager [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Terminating instance
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.270 182939 DEBUG nova.compute.manager [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 21 23:59:45 compute-0 kernel: tapbce17837-92 (unregistering): left promiscuous mode
Jan 21 23:59:45 compute-0 NetworkManager[55139]: <info>  [1769039985.3008] device (tapbce17837-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.305 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 ovn_controller[95047]: 2026-01-21T23:59:45Z|00283|binding|INFO|Releasing lport bce17837-9218-4b02-868d-09dba821ce49 from this chassis (sb_readonly=0)
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.319 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 ovn_controller[95047]: 2026-01-21T23:59:45Z|00284|binding|INFO|Setting lport bce17837-9218-4b02-868d-09dba821ce49 down in Southbound
Jan 21 23:59:45 compute-0 ovn_controller[95047]: 2026-01-21T23:59:45Z|00285|binding|INFO|Removing iface tapbce17837-92 ovn-installed in OVS
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.322 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.330 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7e:09:d8 10.100.0.11'], port_security=['fa:16:3e:7e:09:d8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'ada4a724-2307-431d-8c29-075bfd90b43e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-58cd83db-dcb3-409c-a108-07601ce5f67a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '011e84f966444a668bd6c0f5674f551f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5f02e9fc-67d8-4ade-8ddf-f139c26fa610', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbb394e5-dc7d-4c83-b892-c42bee4b1312, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bce17837-9218-4b02-868d-09dba821ce49) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.331 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bce17837-9218-4b02-868d-09dba821ce49 in datapath 58cd83db-dcb3-409c-a108-07601ce5f67a unbound from our chassis
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.333 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 58cd83db-dcb3-409c-a108-07601ce5f67a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.333 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.334 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ae60daf8-396c-4f4a-9d96-64f1396da789]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.335 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a namespace which is not needed anymore
Jan 21 23:59:45 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000048.scope: Deactivated successfully.
Jan 21 23:59:45 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000048.scope: Consumed 16.732s CPU time.
Jan 21 23:59:45 compute-0 systemd-machined[154182]: Machine qemu-36-instance-00000048 terminated.
Jan 21 23:59:45 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222192]: [NOTICE]   (222197) : haproxy version is 2.8.14-c23fe91
Jan 21 23:59:45 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222192]: [NOTICE]   (222197) : path to executable is /usr/sbin/haproxy
Jan 21 23:59:45 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222192]: [WARNING]  (222197) : Exiting Master process...
Jan 21 23:59:45 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222192]: [ALERT]    (222197) : Current worker (222199) exited with code 143 (Terminated)
Jan 21 23:59:45 compute-0 neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a[222192]: [WARNING]  (222197) : All workers exited. Exiting... (0)
Jan 21 23:59:45 compute-0 systemd[1]: libpod-b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0.scope: Deactivated successfully.
Jan 21 23:59:45 compute-0 podman[223013]: 2026-01-21 23:59:45.53062617 +0000 UTC m=+0.102528809 container died b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.556 182939 DEBUG nova.compute.manager [req-a279551e-6ce9-4e95-bd0e-d97fb520c3b9 req-8666090c-fbcd-4d33-ae03-c0b6fda3ae82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-unplugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.556 182939 DEBUG oslo_concurrency.lockutils [req-a279551e-6ce9-4e95-bd0e-d97fb520c3b9 req-8666090c-fbcd-4d33-ae03-c0b6fda3ae82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.557 182939 DEBUG oslo_concurrency.lockutils [req-a279551e-6ce9-4e95-bd0e-d97fb520c3b9 req-8666090c-fbcd-4d33-ae03-c0b6fda3ae82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.557 182939 DEBUG oslo_concurrency.lockutils [req-a279551e-6ce9-4e95-bd0e-d97fb520c3b9 req-8666090c-fbcd-4d33-ae03-c0b6fda3ae82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.557 182939 DEBUG nova.compute.manager [req-a279551e-6ce9-4e95-bd0e-d97fb520c3b9 req-8666090c-fbcd-4d33-ae03-c0b6fda3ae82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] No waiting events found dispatching network-vif-unplugged-bce17837-9218-4b02-868d-09dba821ce49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.558 182939 DEBUG nova.compute.manager [req-a279551e-6ce9-4e95-bd0e-d97fb520c3b9 req-8666090c-fbcd-4d33-ae03-c0b6fda3ae82 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-unplugged-bce17837-9218-4b02-868d-09dba821ce49 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.559 182939 INFO nova.virt.libvirt.driver [-] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Instance destroyed successfully.
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.560 182939 DEBUG nova.objects.instance [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lazy-loading 'resources' on Instance uuid ada4a724-2307-431d-8c29-075bfd90b43e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0-userdata-shm.mount: Deactivated successfully.
Jan 21 23:59:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-8cab24be1591f89b523de445b4286b9a88a7db37453a604a0f4ad4000dcbacd5-merged.mount: Deactivated successfully.
Jan 21 23:59:45 compute-0 podman[223013]: 2026-01-21 23:59:45.583775639 +0000 UTC m=+0.155678318 container cleanup b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.588 182939 DEBUG nova.virt.libvirt.vif [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:57:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1788045668',display_name='tempest-ServerStableDeviceRescueTest-server-1788045668',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1788045668',id=72,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:58:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='011e84f966444a668bd6c0f5674f551f',ramdisk_id='',reservation_id='r-piy4urn6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1256721315',owner_user_name='tempest-ServerStableDeviceRescueTest-1256721315-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-21T23:58:11Z,user_data=None,user_id='55710edfd4b24e368807c8b5087ec91c',uuid=ada4a724-2307-431d-8c29-075bfd90b43e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.589 182939 DEBUG nova.network.os_vif_util [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converting VIF {"id": "bce17837-9218-4b02-868d-09dba821ce49", "address": "fa:16:3e:7e:09:d8", "network": {"id": "58cd83db-dcb3-409c-a108-07601ce5f67a", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1658091487-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "011e84f966444a668bd6c0f5674f551f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbce17837-92", "ovs_interfaceid": "bce17837-9218-4b02-868d-09dba821ce49", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.590 182939 DEBUG nova.network.os_vif_util [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7e:09:d8,bridge_name='br-int',has_traffic_filtering=True,id=bce17837-9218-4b02-868d-09dba821ce49,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbce17837-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.590 182939 DEBUG os_vif [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7e:09:d8,bridge_name='br-int',has_traffic_filtering=True,id=bce17837-9218-4b02-868d-09dba821ce49,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbce17837-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 21 23:59:45 compute-0 systemd[1]: libpod-conmon-b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0.scope: Deactivated successfully.
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.592 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.592 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbce17837-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.637 182939 INFO nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Creating config drive at /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.config
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.642 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4xkgk7cu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.662 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.665 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 21 23:59:45 compute-0 podman[223064]: 2026-01-21 23:59:45.667530946 +0000 UTC m=+0.049094363 container remove b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.668 182939 INFO os_vif [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7e:09:d8,bridge_name='br-int',has_traffic_filtering=True,id=bce17837-9218-4b02-868d-09dba821ce49,network=Network(58cd83db-dcb3-409c-a108-07601ce5f67a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbce17837-92')
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.668 182939 INFO nova.virt.libvirt.driver [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Deleting instance files /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e_del
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.669 182939 INFO nova.virt.libvirt.driver [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Deletion of /var/lib/nova/instances/ada4a724-2307-431d-8c29-075bfd90b43e_del complete
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.673 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9adce32d-e8e3-4b24-afd0-a53c4a0a7299]: (4, ('Wed Jan 21 11:59:45 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0)\nb1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0\nWed Jan 21 11:59:45 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a (b1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0)\nb1766cce841ac9f78eba879b3652eac359febc3991d16faa5737f01e10b0b2f0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.675 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d33b7cfb-2f61-4522-b0f6-fe43959443c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.677 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58cd83db-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.678 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 kernel: tap58cd83db-d0: left promiscuous mode
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.697 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.702 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0fbaf7b2-c234-4e10-8065-ac499ffe5cf0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.723 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0e030117-7bf1-4caa-8439-e4dc7364a460]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.724 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d812ec9c-8f44-4057-bc02-84d3d117e51d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.748 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[49095ca4-0b71-419d-9a68-a23c8f07e118]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439404, 'reachable_time': 37310, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223083, 'error': None, 'target': 'ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 systemd[1]: run-netns-ovnmeta\x2d58cd83db\x2ddcb3\x2d409c\x2da108\x2d07601ce5f67a.mount: Deactivated successfully.
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.752 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-58cd83db-dcb3-409c-a108-07601ce5f67a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.752 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[3307f3e1-273f-4376-8163-c91255f9b8f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.769 182939 DEBUG oslo_concurrency.processutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4xkgk7cu" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.790 182939 INFO nova.compute.manager [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Took 0.52 seconds to destroy the instance on the hypervisor.
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.791 182939 DEBUG oslo.service.loopingcall [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.792 182939 DEBUG nova.compute.manager [-] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.792 182939 DEBUG nova.network.neutron [-] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 21 23:59:45 compute-0 podman[223084]: 2026-01-21 23:59:45.840985841 +0000 UTC m=+0.055355634 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 21 23:59:45 compute-0 kernel: tapc05b5089-7f: entered promiscuous mode
Jan 21 23:59:45 compute-0 systemd-udevd[222997]: Network interface NamePolicy= disabled on kernel command line.
Jan 21 23:59:45 compute-0 NetworkManager[55139]: <info>  [1769039985.8492] manager: (tapc05b5089-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Jan 21 23:59:45 compute-0 ovn_controller[95047]: 2026-01-21T23:59:45Z|00286|binding|INFO|Claiming lport c05b5089-7fab-41da-bf56-5cf234379b1d for this chassis.
Jan 21 23:59:45 compute-0 ovn_controller[95047]: 2026-01-21T23:59:45Z|00287|binding|INFO|c05b5089-7fab-41da-bf56-5cf234379b1d: Claiming fa:16:3e:be:5a:da 10.100.0.12
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.852 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 NetworkManager[55139]: <info>  [1769039985.8633] device (tapc05b5089-7f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 21 23:59:45 compute-0 NetworkManager[55139]: <info>  [1769039985.8642] device (tapc05b5089-7f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.864 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 nova_compute[182935]: 2026-01-21 23:59:45.870 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:45 compute-0 NetworkManager[55139]: <info>  [1769039985.8713] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Jan 21 23:59:45 compute-0 NetworkManager[55139]: <info>  [1769039985.8720] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.875 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:5a:da 10.100.0.12'], port_security=['fa:16:3e:be:5a:da 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91ae2c4a-ab10-4954-a382-a87fa6f89551', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27f09566-bcbf-4d61-a2a7-838e97ade9dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c05b5089-7fab-41da-bf56-5cf234379b1d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.876 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c05b5089-7fab-41da-bf56-5cf234379b1d in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.877 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.889 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7535ab9d-ac24-4dfb-b9f4-733621049d02]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.890 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.893 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.893 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5b2d5041-a525-4bdd-af85-cea247c9fe67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.894 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[02be3134-b4c3-47fb-8eaa-fd25bec32633]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 systemd-machined[154182]: New machine qemu-40-instance-0000004f.
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.907 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[a894468f-d394-4f44-9e38-63735b4ff8d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.936 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[40b68404-4b01-49c4-8514-7ad17b4b87c1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 systemd[1]: Started Virtual Machine qemu-40-instance-0000004f.
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.971 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[94cfca69-45e8-40b8-9001-f5e5a90c05a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:45 compute-0 NetworkManager[55139]: <info>  [1769039985.9950] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/128)
Jan 21 23:59:45 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:45.994 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[db966e77-140c-4562-8a53-5bf6fb14a75d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.027 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.048 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[51cd0377-8913-449b-91ef-e0bbd7feac7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.051 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c81b1f13-7c79-4d80-96af-822c8c3bc0b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.054 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:46 compute-0 ovn_controller[95047]: 2026-01-21T23:59:46Z|00288|binding|INFO|Setting lport c05b5089-7fab-41da-bf56-5cf234379b1d ovn-installed in OVS
Jan 21 23:59:46 compute-0 ovn_controller[95047]: 2026-01-21T23:59:46Z|00289|binding|INFO|Setting lport c05b5089-7fab-41da-bf56-5cf234379b1d up in Southbound
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.070 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:46 compute-0 NetworkManager[55139]: <info>  [1769039986.0777] device (tap19c3e0c8-50): carrier: link connected
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.085 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[45081ef7-ed9f-4420-9870-f16cd78db483]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.106 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[50f90f6c-14dc-413b-88af-dba74b446018]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448881, 'reachable_time': 17955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223155, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.130 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9f90bfa6-0f3c-4f98-966f-16640fcfae3b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448881, 'tstamp': 448881}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223156, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.155 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ed54baca-d359-42ae-9c8d-d9a7a96adb56]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448881, 'reachable_time': 17955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223157, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.204 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3e747a-a8dc-4640-865a-8f32da3ca58e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.274 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.291 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9659efce-0c15-4c93-a36c-7f8a569d1a08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.293 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.293 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.294 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:46 compute-0 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 21 23:59:46 compute-0 NetworkManager[55139]: <info>  [1769039986.2970] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.296 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.303 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.305 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.307 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.309 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[69de83f4-5fd5-47db-b6cb-62c1eca5df70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.309 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: global
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: defaults
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     log global
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 21 23:59:46 compute-0 ovn_metadata_agent[104403]: 2026-01-21 23:59:46.310 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 21 23:59:46 compute-0 ovn_controller[95047]: 2026-01-21T23:59:46Z|00290|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.344 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.524 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039986.5233014, 91ae2c4a-ab10-4954-a382-a87fa6f89551 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.525 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] VM Started (Lifecycle Event)
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.559 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.564 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039986.5251782, 91ae2c4a-ab10-4954-a382-a87fa6f89551 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.564 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] VM Paused (Lifecycle Event)
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.589 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.593 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.615 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.636 182939 DEBUG nova.network.neutron [-] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.645 182939 DEBUG nova.network.neutron [req-05343082-41f3-48f7-9541-4607b364e9a9 req-9237532e-cdb5-4197-8432-7a721d9de947 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Updated VIF entry in instance network info cache for port c05b5089-7fab-41da-bf56-5cf234379b1d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.647 182939 DEBUG nova.network.neutron [req-05343082-41f3-48f7-9541-4607b364e9a9 req-9237532e-cdb5-4197-8432-7a721d9de947 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Updating instance_info_cache with network_info: [{"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.656 182939 INFO nova.compute.manager [-] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Took 0.86 seconds to deallocate network for instance.
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.667 182939 DEBUG oslo_concurrency.lockutils [req-05343082-41f3-48f7-9541-4607b364e9a9 req-9237532e-cdb5-4197-8432-7a721d9de947 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-91ae2c4a-ab10-4954-a382-a87fa6f89551" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.736 182939 DEBUG nova.compute.manager [req-263275d2-caed-42c9-8768-fe97f567a75f req-40ca7296-3963-4b94-8d1f-2da2876bf171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-deleted-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.752 182939 DEBUG oslo_concurrency.lockutils [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.753 182939 DEBUG oslo_concurrency.lockutils [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:46 compute-0 podman[223196]: 2026-01-21 23:59:46.760837735 +0000 UTC m=+0.070732785 container create 49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 21 23:59:46 compute-0 systemd[1]: Started libpod-conmon-49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f.scope.
Jan 21 23:59:46 compute-0 podman[223196]: 2026-01-21 23:59:46.735108475 +0000 UTC m=+0.045003525 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 21 23:59:46 compute-0 systemd[1]: Started libcrun container.
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.831 182939 DEBUG nova.compute.provider_tree [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 21 23:59:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe14efd12ab65e00318d4c5e9eee472172e8870597e74738c243deff872ab3db/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.846 182939 DEBUG nova.scheduler.client.report [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 21 23:59:46 compute-0 podman[223196]: 2026-01-21 23:59:46.850046161 +0000 UTC m=+0.159941271 container init 49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 21 23:59:46 compute-0 podman[223196]: 2026-01-21 23:59:46.858389182 +0000 UTC m=+0.168284242 container start 49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 21 23:59:46 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223209]: [NOTICE]   (223215) : New worker (223217) forked
Jan 21 23:59:46 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223209]: [NOTICE]   (223215) : Loading success.
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.899 182939 DEBUG oslo_concurrency.lockutils [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:46 compute-0 nova_compute[182935]: 2026-01-21 23:59:46.929 182939 INFO nova.scheduler.client.report [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Deleted allocations for instance ada4a724-2307-431d-8c29-075bfd90b43e
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.017 182939 DEBUG oslo_concurrency.lockutils [None req-c599fc78-eb72-4a5b-88ef-92eef6e0bf70 55710edfd4b24e368807c8b5087ec91c 011e84f966444a668bd6c0f5674f551f - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.333 182939 DEBUG nova.compute.manager [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.334 182939 DEBUG oslo_concurrency.lockutils [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.335 182939 DEBUG oslo_concurrency.lockutils [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.335 182939 DEBUG oslo_concurrency.lockutils [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.336 182939 DEBUG nova.compute.manager [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Processing event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.336 182939 DEBUG nova.compute.manager [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.337 182939 DEBUG oslo_concurrency.lockutils [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.337 182939 DEBUG oslo_concurrency.lockutils [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.338 182939 DEBUG oslo_concurrency.lockutils [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.338 182939 DEBUG nova.compute.manager [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] No waiting events found dispatching network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.338 182939 WARNING nova.compute.manager [req-48dfb60a-0b98-47a3-b511-a91c03e9a96d req-06040c47-2f92-4941-806f-d20ed6c0aa7a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received unexpected event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d for instance with vm_state building and task_state spawning.
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.340 182939 DEBUG nova.compute.manager [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.345 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769039987.3449519, 91ae2c4a-ab10-4954-a382-a87fa6f89551 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.346 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] VM Resumed (Lifecycle Event)
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.349 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.353 182939 INFO nova.virt.libvirt.driver [-] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Instance spawned successfully.
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.353 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.370 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.378 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.382 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.382 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.382 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.383 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.383 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.383 182939 DEBUG nova.virt.libvirt.driver [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.418 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.464 182939 INFO nova.compute.manager [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Took 6.54 seconds to spawn the instance on the hypervisor.
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.465 182939 DEBUG nova.compute.manager [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.559 182939 INFO nova.compute.manager [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Took 7.22 seconds to build instance.
Jan 21 23:59:47 compute-0 nova_compute[182935]: 2026-01-21 23:59:47.584 182939 DEBUG oslo_concurrency.lockutils [None req-eeeecf3e-1433-4f10-9dc3-d5551db5855c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:48 compute-0 nova_compute[182935]: 2026-01-21 23:59:48.608 182939 DEBUG nova.compute.manager [req-9883f0ad-a5bc-4ad0-b39e-35a430315d17 req-d1817558-6eca-418d-a79e-8973e2d33072 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 21 23:59:48 compute-0 nova_compute[182935]: 2026-01-21 23:59:48.608 182939 DEBUG oslo_concurrency.lockutils [req-9883f0ad-a5bc-4ad0-b39e-35a430315d17 req-d1817558-6eca-418d-a79e-8973e2d33072 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 21 23:59:48 compute-0 nova_compute[182935]: 2026-01-21 23:59:48.609 182939 DEBUG oslo_concurrency.lockutils [req-9883f0ad-a5bc-4ad0-b39e-35a430315d17 req-d1817558-6eca-418d-a79e-8973e2d33072 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 21 23:59:48 compute-0 nova_compute[182935]: 2026-01-21 23:59:48.609 182939 DEBUG oslo_concurrency.lockutils [req-9883f0ad-a5bc-4ad0-b39e-35a430315d17 req-d1817558-6eca-418d-a79e-8973e2d33072 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "ada4a724-2307-431d-8c29-075bfd90b43e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 21 23:59:48 compute-0 nova_compute[182935]: 2026-01-21 23:59:48.609 182939 DEBUG nova.compute.manager [req-9883f0ad-a5bc-4ad0-b39e-35a430315d17 req-d1817558-6eca-418d-a79e-8973e2d33072 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] No waiting events found dispatching network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 21 23:59:48 compute-0 nova_compute[182935]: 2026-01-21 23:59:48.610 182939 WARNING nova.compute.manager [req-9883f0ad-a5bc-4ad0-b39e-35a430315d17 req-d1817558-6eca-418d-a79e-8973e2d33072 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Received unexpected event network-vif-plugged-bce17837-9218-4b02-868d-09dba821ce49 for instance with vm_state deleted and task_state None.
Jan 21 23:59:49 compute-0 podman[223226]: 2026-01-21 23:59:49.700854189 +0000 UTC m=+0.064577197 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 21 23:59:50 compute-0 nova_compute[182935]: 2026-01-21 23:59:50.625 182939 INFO nova.compute.manager [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Rebuilding instance
Jan 21 23:59:50 compute-0 nova_compute[182935]: 2026-01-21 23:59:50.638 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:50 compute-0 nova_compute[182935]: 2026-01-21 23:59:50.918 182939 DEBUG nova.compute.manager [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:50 compute-0 nova_compute[182935]: 2026-01-21 23:59:50.985 182939 DEBUG nova.objects.instance [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_requests' on Instance uuid 91ae2c4a-ab10-4954-a382-a87fa6f89551 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:51 compute-0 nova_compute[182935]: 2026-01-21 23:59:51.006 182939 DEBUG nova.objects.instance [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 91ae2c4a-ab10-4954-a382-a87fa6f89551 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:51 compute-0 nova_compute[182935]: 2026-01-21 23:59:51.026 182939 DEBUG nova.objects.instance [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 91ae2c4a-ab10-4954-a382-a87fa6f89551 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:51 compute-0 nova_compute[182935]: 2026-01-21 23:59:51.043 182939 DEBUG nova.objects.instance [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'migration_context' on Instance uuid 91ae2c4a-ab10-4954-a382-a87fa6f89551 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 21 23:59:51 compute-0 nova_compute[182935]: 2026-01-21 23:59:51.059 182939 DEBUG nova.objects.instance [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 21 23:59:51 compute-0 nova_compute[182935]: 2026-01-21 23:59:51.063 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 21 23:59:51 compute-0 ovn_controller[95047]: 2026-01-21T23:59:51Z|00291|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 21 23:59:51 compute-0 nova_compute[182935]: 2026-01-21 23:59:51.286 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:53 compute-0 nova_compute[182935]: 2026-01-21 23:59:53.013 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039978.0042343, 30452704-b180-41c6-98c4-8b168b3bc5e9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 21 23:59:53 compute-0 nova_compute[182935]: 2026-01-21 23:59:53.014 182939 INFO nova.compute.manager [-] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] VM Stopped (Lifecycle Event)
Jan 21 23:59:53 compute-0 nova_compute[182935]: 2026-01-21 23:59:53.044 182939 DEBUG nova.compute.manager [None req-8cb3981b-1017-4619-a5d8-d9baa1cb3c6e - - - - - -] [instance: 30452704-b180-41c6-98c4-8b168b3bc5e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 21 23:59:55 compute-0 sshd-session[223245]: Invalid user git from 188.166.69.60 port 42292
Jan 21 23:59:55 compute-0 sshd-session[223245]: Connection closed by invalid user git 188.166.69.60 port 42292 [preauth]
Jan 21 23:59:55 compute-0 nova_compute[182935]: 2026-01-21 23:59:55.690 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:56 compute-0 nova_compute[182935]: 2026-01-21 23:59:56.290 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 21 23:59:57 compute-0 podman[223248]: 2026-01-21 23:59:57.720927773 +0000 UTC m=+0.089868525 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 21 23:59:57 compute-0 podman[223247]: 2026-01-21 23:59:57.750817942 +0000 UTC m=+0.114208491 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Jan 22 00:00:00 compute-0 ovn_controller[95047]: 2026-01-22T00:00:00Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:be:5a:da 10.100.0.12
Jan 22 00:00:00 compute-0 ovn_controller[95047]: 2026-01-22T00:00:00Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:be:5a:da 10.100.0.12
Jan 22 00:00:00 compute-0 nova_compute[182935]: 2026-01-22 00:00:00.549 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769039985.5460842, ada4a724-2307-431d-8c29-075bfd90b43e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:00 compute-0 nova_compute[182935]: 2026-01-22 00:00:00.549 182939 INFO nova.compute.manager [-] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] VM Stopped (Lifecycle Event)
Jan 22 00:00:00 compute-0 nova_compute[182935]: 2026-01-22 00:00:00.582 182939 DEBUG nova.compute.manager [None req-a2d454c0-05d6-40c2-866e-37fe3bea07a8 - - - - - -] [instance: ada4a724-2307-431d-8c29-075bfd90b43e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:00 compute-0 nova_compute[182935]: 2026-01-22 00:00:00.716 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:01 compute-0 nova_compute[182935]: 2026-01-22 00:00:01.130 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:00:01 compute-0 nova_compute[182935]: 2026-01-22 00:00:01.292 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:01 compute-0 nova_compute[182935]: 2026-01-22 00:00:01.346 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.194 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.195 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.196 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:03 compute-0 systemd[1]: Starting update of the root trust anchor for DNSSEC validation in unbound...
Jan 22 00:00:03 compute-0 systemd[1]: Starting Rotate log files...
Jan 22 00:00:03 compute-0 kernel: tapc05b5089-7f (unregistering): left promiscuous mode
Jan 22 00:00:03 compute-0 NetworkManager[55139]: <info>  [1769040003.3265] device (tapc05b5089-7f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:00:03 compute-0 systemd[1]: unbound-anchor.service: Deactivated successfully.
Jan 22 00:00:03 compute-0 systemd[1]: Finished update of the root trust anchor for DNSSEC validation in unbound.
Jan 22 00:00:03 compute-0 systemd[1]: logrotate.service: Deactivated successfully.
Jan 22 00:00:03 compute-0 systemd[1]: Finished Rotate log files.
Jan 22 00:00:03 compute-0 nova_compute[182935]: 2026-01-22 00:00:03.334 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:03 compute-0 ovn_controller[95047]: 2026-01-22T00:00:03Z|00292|binding|INFO|Releasing lport c05b5089-7fab-41da-bf56-5cf234379b1d from this chassis (sb_readonly=0)
Jan 22 00:00:03 compute-0 ovn_controller[95047]: 2026-01-22T00:00:03Z|00293|binding|INFO|Setting lport c05b5089-7fab-41da-bf56-5cf234379b1d down in Southbound
Jan 22 00:00:03 compute-0 ovn_controller[95047]: 2026-01-22T00:00:03Z|00294|binding|INFO|Removing iface tapc05b5089-7f ovn-installed in OVS
Jan 22 00:00:03 compute-0 nova_compute[182935]: 2026-01-22 00:00:03.338 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.349 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:5a:da 10.100.0.12'], port_security=['fa:16:3e:be:5a:da 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91ae2c4a-ab10-4954-a382-a87fa6f89551', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '4', 'neutron:security_group_ids': '27f09566-bcbf-4d61-a2a7-838e97ade9dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c05b5089-7fab-41da-bf56-5cf234379b1d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.350 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c05b5089-7fab-41da-bf56-5cf234379b1d in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.352 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.354 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4f194a4a-042a-4dde-a548-c45e795e7244]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.354 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore
Jan 22 00:00:03 compute-0 nova_compute[182935]: 2026-01-22 00:00:03.356 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:03 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Jan 22 00:00:03 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000004f.scope: Consumed 13.352s CPU time.
Jan 22 00:00:03 compute-0 systemd-machined[154182]: Machine qemu-40-instance-0000004f terminated.
Jan 22 00:00:03 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223209]: [NOTICE]   (223215) : haproxy version is 2.8.14-c23fe91
Jan 22 00:00:03 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223209]: [NOTICE]   (223215) : path to executable is /usr/sbin/haproxy
Jan 22 00:00:03 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223209]: [WARNING]  (223215) : Exiting Master process...
Jan 22 00:00:03 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223209]: [ALERT]    (223215) : Current worker (223217) exited with code 143 (Terminated)
Jan 22 00:00:03 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223209]: [WARNING]  (223215) : All workers exited. Exiting... (0)
Jan 22 00:00:03 compute-0 systemd[1]: libpod-49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f.scope: Deactivated successfully.
Jan 22 00:00:03 compute-0 podman[223332]: 2026-01-22 00:00:03.497290395 +0000 UTC m=+0.045141978 container died 49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:00:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f-userdata-shm.mount: Deactivated successfully.
Jan 22 00:00:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-fe14efd12ab65e00318d4c5e9eee472172e8870597e74738c243deff872ab3db-merged.mount: Deactivated successfully.
Jan 22 00:00:03 compute-0 podman[223332]: 2026-01-22 00:00:03.542269647 +0000 UTC m=+0.090121230 container cleanup 49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:00:03 compute-0 systemd[1]: libpod-conmon-49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f.scope: Deactivated successfully.
Jan 22 00:00:03 compute-0 podman[223362]: 2026-01-22 00:00:03.610232464 +0000 UTC m=+0.044350579 container remove 49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.616 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[19259dba-38b2-4ae3-bf5e-27830b0dff74]: (4, ('Thu Jan 22 12:00:03 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f)\n49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f\nThu Jan 22 12:00:03 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f)\n49eed279b4b6a6df7256b7bef1bd1aae94b4305baab93ce264ce8b396d51ad0f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.619 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b8ff7d-517d-44c0-a59b-735651ca9bd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.620 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:03 compute-0 nova_compute[182935]: 2026-01-22 00:00:03.622 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:03 compute-0 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 22 00:00:03 compute-0 nova_compute[182935]: 2026-01-22 00:00:03.639 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.644 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1f90ef-85d3-430e-ad1f-7b2687775f6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.660 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2a1e5a98-ad2a-4794-bfaa-474f489f767d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.661 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8d05fb-22c0-454c-b328-5584a78c7855]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.677 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[22e2157b-abfb-46ec-b1f0-222342cdeac3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448870, 'reachable_time': 24512, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223395, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.679 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:03.679 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[d307fd1c-6433-4c1b-b76c-2a8f036bfaec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:03 compute-0 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 22 00:00:03 compute-0 nova_compute[182935]: 2026-01-22 00:00:03.852 182939 DEBUG nova.compute.manager [req-e9ca85cc-a0b6-4232-b772-b05b15e7f347 req-e45f6e70-2d8b-480d-b8f0-980d33bc5578 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received event network-vif-unplugged-c05b5089-7fab-41da-bf56-5cf234379b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:03 compute-0 nova_compute[182935]: 2026-01-22 00:00:03.852 182939 DEBUG oslo_concurrency.lockutils [req-e9ca85cc-a0b6-4232-b772-b05b15e7f347 req-e45f6e70-2d8b-480d-b8f0-980d33bc5578 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:03 compute-0 nova_compute[182935]: 2026-01-22 00:00:03.854 182939 DEBUG oslo_concurrency.lockutils [req-e9ca85cc-a0b6-4232-b772-b05b15e7f347 req-e45f6e70-2d8b-480d-b8f0-980d33bc5578 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:03 compute-0 nova_compute[182935]: 2026-01-22 00:00:03.854 182939 DEBUG oslo_concurrency.lockutils [req-e9ca85cc-a0b6-4232-b772-b05b15e7f347 req-e45f6e70-2d8b-480d-b8f0-980d33bc5578 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:03 compute-0 nova_compute[182935]: 2026-01-22 00:00:03.855 182939 DEBUG nova.compute.manager [req-e9ca85cc-a0b6-4232-b772-b05b15e7f347 req-e45f6e70-2d8b-480d-b8f0-980d33bc5578 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] No waiting events found dispatching network-vif-unplugged-c05b5089-7fab-41da-bf56-5cf234379b1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:03 compute-0 nova_compute[182935]: 2026-01-22 00:00:03.855 182939 WARNING nova.compute.manager [req-e9ca85cc-a0b6-4232-b772-b05b15e7f347 req-e45f6e70-2d8b-480d-b8f0-980d33bc5578 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received unexpected event network-vif-unplugged-c05b5089-7fab-41da-bf56-5cf234379b1d for instance with vm_state active and task_state rebuilding.
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.146 182939 INFO nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Instance shutdown successfully after 13 seconds.
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.152 182939 INFO nova.virt.libvirt.driver [-] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Instance destroyed successfully.
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.158 182939 INFO nova.virt.libvirt.driver [-] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Instance destroyed successfully.
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.159 182939 DEBUG nova.virt.libvirt.vif [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-21T23:59:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-429300056',display_name='tempest-ServerActionsTestJSON-server-614528583',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-429300056',id=79,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-0d80tr00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-m
ember'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-21T23:59:50Z,user_data=None,user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=91ae2c4a-ab10-4954-a382-a87fa6f89551,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.159 182939 DEBUG nova.network.os_vif_util [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.160 182939 DEBUG nova.network.os_vif_util [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.161 182939 DEBUG os_vif [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.163 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.163 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc05b5089-7f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.166 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.167 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.170 182939 INFO os_vif [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f')
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.171 182939 INFO nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Deleting instance files /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551_del
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.172 182939 INFO nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Deletion of /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551_del complete
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.421 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.422 182939 INFO nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Creating image(s)
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.423 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.423 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.424 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.437 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.499 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.501 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.501 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.513 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.570 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.571 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.608 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.609 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.609 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.668 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.669 182939 DEBUG nova.virt.disk.api [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Checking if we can resize image /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.669 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.730 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.731 182939 DEBUG nova.virt.disk.api [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Cannot resize image /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.731 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.731 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Ensure instance console log exists: /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.732 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.732 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.733 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.734 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Start _get_guest_xml network_info=[{"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.739 182939 WARNING nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.747 182939 DEBUG nova.virt.libvirt.host [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.748 182939 DEBUG nova.virt.libvirt.host [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.751 182939 DEBUG nova.virt.libvirt.host [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.751 182939 DEBUG nova.virt.libvirt.host [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.753 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.753 182939 DEBUG nova.virt.hardware [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.753 182939 DEBUG nova.virt.hardware [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.753 182939 DEBUG nova.virt.hardware [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.754 182939 DEBUG nova.virt.hardware [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.754 182939 DEBUG nova.virt.hardware [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.754 182939 DEBUG nova.virt.hardware [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.754 182939 DEBUG nova.virt.hardware [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.755 182939 DEBUG nova.virt.hardware [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.755 182939 DEBUG nova.virt.hardware [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.755 182939 DEBUG nova.virt.hardware [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.755 182939 DEBUG nova.virt.hardware [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.756 182939 DEBUG nova.objects.instance [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 91ae2c4a-ab10-4954-a382-a87fa6f89551 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.775 182939 DEBUG nova.virt.libvirt.vif [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:59:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-429300056',display_name='tempest-ServerActionsTestJSON-server-614528583',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-429300056',id=79,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-0d80tr00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:00:04Z,user_data=None,user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=91ae2c4a-ab10-4954-a382-a87fa6f89551,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.775 182939 DEBUG nova.network.os_vif_util [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.776 182939 DEBUG nova.network.os_vif_util [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.778 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:00:04 compute-0 nova_compute[182935]:   <uuid>91ae2c4a-ab10-4954-a382-a87fa6f89551</uuid>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   <name>instance-0000004f</name>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerActionsTestJSON-server-614528583</nova:name>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:00:04</nova:creationTime>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:00:04 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:00:04 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:00:04 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:00:04 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:00:04 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:00:04 compute-0 nova_compute[182935]:         <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 22 00:00:04 compute-0 nova_compute[182935]:         <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="3e1dda74-3c6a-4d29-8792-32134d1c36c5"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:00:04 compute-0 nova_compute[182935]:         <nova:port uuid="c05b5089-7fab-41da-bf56-5cf234379b1d">
Jan 22 00:00:04 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <system>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <entry name="serial">91ae2c4a-ab10-4954-a382-a87fa6f89551</entry>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <entry name="uuid">91ae2c4a-ab10-4954-a382-a87fa6f89551</entry>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     </system>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   <os>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   </os>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   <features>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   </features>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.config"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:be:5a:da"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <target dev="tapc05b5089-7f"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/console.log" append="off"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <video>
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     </video>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:00:04 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:00:04 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:00:04 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:00:04 compute-0 nova_compute[182935]: </domain>
Jan 22 00:00:04 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.778 182939 DEBUG nova.compute.manager [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Preparing to wait for external event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.779 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.779 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.779 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.780 182939 DEBUG nova.virt.libvirt.vif [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:59:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-429300056',display_name='tempest-ServerActionsTestJSON-server-614528583',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-429300056',id=79,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-21T23:59:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-0d80tr00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='te
mpest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:00:04Z,user_data=None,user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=91ae2c4a-ab10-4954-a382-a87fa6f89551,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.780 182939 DEBUG nova.network.os_vif_util [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.780 182939 DEBUG nova.network.os_vif_util [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.781 182939 DEBUG os_vif [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.781 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.782 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.782 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.784 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.784 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc05b5089-7f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.785 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc05b5089-7f, col_values=(('external_ids', {'iface-id': 'c05b5089-7fab-41da-bf56-5cf234379b1d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:5a:da', 'vm-uuid': '91ae2c4a-ab10-4954-a382-a87fa6f89551'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:04 compute-0 NetworkManager[55139]: <info>  [1769040004.7881] manager: (tapc05b5089-7f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.789 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.791 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.792 182939 INFO os_vif [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f')
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.866 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.866 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.866 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No VIF found with MAC fa:16:3e:be:5a:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.867 182939 INFO nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Using config drive
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.896 182939 DEBUG nova.objects.instance [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 91ae2c4a-ab10-4954-a382-a87fa6f89551 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:04 compute-0 nova_compute[182935]: 2026-01-22 00:00:04.944 182939 DEBUG nova.objects.instance [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'keypairs' on Instance uuid 91ae2c4a-ab10-4954-a382-a87fa6f89551 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:05 compute-0 nova_compute[182935]: 2026-01-22 00:00:05.575 182939 INFO nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Creating config drive at /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.config
Jan 22 00:00:05 compute-0 nova_compute[182935]: 2026-01-22 00:00:05.583 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl1_0zf6o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:05 compute-0 nova_compute[182935]: 2026-01-22 00:00:05.718 182939 DEBUG oslo_concurrency.processutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl1_0zf6o" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:05 compute-0 kernel: tapc05b5089-7f: entered promiscuous mode
Jan 22 00:00:05 compute-0 NetworkManager[55139]: <info>  [1769040005.7816] manager: (tapc05b5089-7f): new Tun device (/org/freedesktop/NetworkManager/Devices/131)
Jan 22 00:00:05 compute-0 systemd-udevd[223313]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:00:05 compute-0 nova_compute[182935]: 2026-01-22 00:00:05.783 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:05 compute-0 ovn_controller[95047]: 2026-01-22T00:00:05Z|00295|binding|INFO|Claiming lport c05b5089-7fab-41da-bf56-5cf234379b1d for this chassis.
Jan 22 00:00:05 compute-0 ovn_controller[95047]: 2026-01-22T00:00:05Z|00296|binding|INFO|c05b5089-7fab-41da-bf56-5cf234379b1d: Claiming fa:16:3e:be:5a:da 10.100.0.12
Jan 22 00:00:05 compute-0 NetworkManager[55139]: <info>  [1769040005.7942] device (tapc05b5089-7f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:00:05 compute-0 ovn_controller[95047]: 2026-01-22T00:00:05Z|00297|binding|INFO|Setting lport c05b5089-7fab-41da-bf56-5cf234379b1d ovn-installed in OVS
Jan 22 00:00:05 compute-0 nova_compute[182935]: 2026-01-22 00:00:05.795 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:05 compute-0 NetworkManager[55139]: <info>  [1769040005.7966] device (tapc05b5089-7f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:00:05 compute-0 nova_compute[182935]: 2026-01-22 00:00:05.797 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:05 compute-0 systemd-machined[154182]: New machine qemu-41-instance-0000004f.
Jan 22 00:00:05 compute-0 systemd[1]: Started Virtual Machine qemu-41-instance-0000004f.
Jan 22 00:00:05 compute-0 ovn_controller[95047]: 2026-01-22T00:00:05Z|00298|binding|INFO|Setting lport c05b5089-7fab-41da-bf56-5cf234379b1d up in Southbound
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.844 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:5a:da 10.100.0.12'], port_security=['fa:16:3e:be:5a:da 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91ae2c4a-ab10-4954-a382-a87fa6f89551', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '5', 'neutron:security_group_ids': '27f09566-bcbf-4d61-a2a7-838e97ade9dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c05b5089-7fab-41da-bf56-5cf234379b1d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.846 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c05b5089-7fab-41da-bf56-5cf234379b1d in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.848 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.861 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[eb1bcc80-6650-40d7-9426-2c85fe8b743b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.862 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.864 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.864 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5c6863c0-f0eb-4c2f-8ec4-1b297d448526]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.865 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d38e5b-b31a-43c9-b459-4b8669531ea9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.876 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[9d6417aa-8db1-42cc-9bb4-112177213c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.903 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ac15235c-b384-4b5c-b362-60c9f0c914b7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.934 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c68d24-2e07-4c26-9024-13c33ba692f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.940 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ad760b96-9ab8-4090-98ec-c12f949c0f9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:05 compute-0 NetworkManager[55139]: <info>  [1769040005.9416] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/132)
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.971 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8669ba8d-ad40-4311-8fc6-1c465f0d9795]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:05.974 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c614f7-8fee-4fd6-8bda-63fa79d57265]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:05 compute-0 NetworkManager[55139]: <info>  [1769040005.9980] device (tap19c3e0c8-50): carrier: link connected
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.004 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[71f0ee64-fb01-4e39-a432-ce8461541dc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.022 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c58bb140-8578-4571-8399-651086a053c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450873, 'reachable_time': 29666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223461, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.039 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cd4cb633-d6f1-4296-a7bb-120e13c418fb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450873, 'tstamp': 450873}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223462, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.055 182939 DEBUG nova.compute.manager [req-832b442f-3b09-4d22-9600-4f5c3119a3c0 req-7955e356-1bf5-440b-8dc1-414b2b321e32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.055 182939 DEBUG oslo_concurrency.lockutils [req-832b442f-3b09-4d22-9600-4f5c3119a3c0 req-7955e356-1bf5-440b-8dc1-414b2b321e32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.055 182939 DEBUG oslo_concurrency.lockutils [req-832b442f-3b09-4d22-9600-4f5c3119a3c0 req-7955e356-1bf5-440b-8dc1-414b2b321e32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.056 182939 DEBUG oslo_concurrency.lockutils [req-832b442f-3b09-4d22-9600-4f5c3119a3c0 req-7955e356-1bf5-440b-8dc1-414b2b321e32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.056 182939 DEBUG nova.compute.manager [req-832b442f-3b09-4d22-9600-4f5c3119a3c0 req-7955e356-1bf5-440b-8dc1-414b2b321e32 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Processing event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.058 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e4603fcb-306b-4ea5-8484-e279b067c09b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450873, 'reachable_time': 29666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223463, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.090 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a1fe31e0-591f-4826-a132-04f05d52c772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.160 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[17ae72bb-d02a-45dc-a8f0-688ec49cab7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.161 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.162 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.162 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.164 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-0 NetworkManager[55139]: <info>  [1769040006.1654] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Jan 22 00:00:06 compute-0 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.167 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.169 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.170 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-0 ovn_controller[95047]: 2026-01-22T00:00:06Z|00299|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.172 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.173 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.174 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[32fb33a2-c151-47fd-be70-f78fb564ff04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.175 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:00:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:06.176 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.182 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.294 182939 DEBUG nova.compute.manager [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.295 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.297 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for 91ae2c4a-ab10-4954-a382-a87fa6f89551 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.297 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040006.2955134, 91ae2c4a-ab10-4954-a382-a87fa6f89551 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.297 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] VM Started (Lifecycle Event)
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.301 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.305 182939 INFO nova.virt.libvirt.driver [-] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Instance spawned successfully.
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.305 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.331 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.338 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.341 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.342 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.342 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.343 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.343 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.344 182939 DEBUG nova.virt.libvirt.driver [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.369 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.369 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040006.2992125, 91ae2c4a-ab10-4954-a382-a87fa6f89551 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.370 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] VM Paused (Lifecycle Event)
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.406 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.409 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040006.3005316, 91ae2c4a-ab10-4954-a382-a87fa6f89551 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.410 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] VM Resumed (Lifecycle Event)
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.434 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.436 182939 DEBUG nova.compute.manager [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.439 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.490 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.573 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.574 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.574 182939 DEBUG nova.objects.instance [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:00:06 compute-0 podman[223502]: 2026-01-22 00:00:06.575551397 +0000 UTC m=+0.067920487 container create f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:00:06 compute-0 systemd[1]: Started libpod-conmon-f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534.scope.
Jan 22 00:00:06 compute-0 podman[223502]: 2026-01-22 00:00:06.530864421 +0000 UTC m=+0.023233531 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:00:06 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:00:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/30b389e71b4001e39d7950e08a3f4b33c11075f8a29aa3f86f878b7f69548b7e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:00:06 compute-0 nova_compute[182935]: 2026-01-22 00:00:06.693 182939 DEBUG oslo_concurrency.lockutils [None req-4c01fe00-0e01-42a7-a221-51566cb72c4d 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:06 compute-0 podman[223502]: 2026-01-22 00:00:06.694182992 +0000 UTC m=+0.186552082 container init f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:00:06 compute-0 podman[223502]: 2026-01-22 00:00:06.705828623 +0000 UTC m=+0.198197713 container start f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 00:00:06 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223517]: [NOTICE]   (223521) : New worker (223523) forked
Jan 22 00:00:06 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223517]: [NOTICE]   (223521) : Loading success.
Jan 22 00:00:07 compute-0 nova_compute[182935]: 2026-01-22 00:00:07.367 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.220 182939 DEBUG nova.compute.manager [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.220 182939 DEBUG oslo_concurrency.lockutils [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.221 182939 DEBUG oslo_concurrency.lockutils [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.221 182939 DEBUG oslo_concurrency.lockutils [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.221 182939 DEBUG nova.compute.manager [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] No waiting events found dispatching network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.222 182939 WARNING nova.compute.manager [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received unexpected event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d for instance with vm_state active and task_state None.
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.222 182939 DEBUG nova.compute.manager [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.222 182939 DEBUG oslo_concurrency.lockutils [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.222 182939 DEBUG oslo_concurrency.lockutils [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.223 182939 DEBUG oslo_concurrency.lockutils [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.223 182939 DEBUG nova.compute.manager [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] No waiting events found dispatching network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:08 compute-0 nova_compute[182935]: 2026-01-22 00:00:08.223 182939 WARNING nova.compute.manager [req-7e8b2b3b-5fdb-43bd-bafa-2c715f2cfe3a req-a524e16d-3778-42b6-985f-5eeea60e8458 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received unexpected event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d for instance with vm_state active and task_state None.
Jan 22 00:00:08 compute-0 podman[223533]: 2026-01-22 00:00:08.698519662 +0000 UTC m=+0.061905971 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:00:08 compute-0 podman[223532]: 2026-01-22 00:00:08.755071113 +0000 UTC m=+0.111136756 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:00:09 compute-0 nova_compute[182935]: 2026-01-22 00:00:09.787 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:11 compute-0 nova_compute[182935]: 2026-01-22 00:00:11.297 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:13 compute-0 nova_compute[182935]: 2026-01-22 00:00:13.896 182939 DEBUG oslo_concurrency.lockutils [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:13 compute-0 nova_compute[182935]: 2026-01-22 00:00:13.897 182939 DEBUG oslo_concurrency.lockutils [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:13 compute-0 nova_compute[182935]: 2026-01-22 00:00:13.898 182939 DEBUG oslo_concurrency.lockutils [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:13 compute-0 nova_compute[182935]: 2026-01-22 00:00:13.898 182939 DEBUG oslo_concurrency.lockutils [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:13 compute-0 nova_compute[182935]: 2026-01-22 00:00:13.899 182939 DEBUG oslo_concurrency.lockutils [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:13 compute-0 nova_compute[182935]: 2026-01-22 00:00:13.913 182939 INFO nova.compute.manager [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Terminating instance
Jan 22 00:00:13 compute-0 nova_compute[182935]: 2026-01-22 00:00:13.928 182939 DEBUG nova.compute.manager [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:00:13 compute-0 kernel: tapc05b5089-7f (unregistering): left promiscuous mode
Jan 22 00:00:13 compute-0 NetworkManager[55139]: <info>  [1769040013.9584] device (tapc05b5089-7f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:00:13 compute-0 ovn_controller[95047]: 2026-01-22T00:00:13Z|00300|binding|INFO|Releasing lport c05b5089-7fab-41da-bf56-5cf234379b1d from this chassis (sb_readonly=0)
Jan 22 00:00:13 compute-0 ovn_controller[95047]: 2026-01-22T00:00:13Z|00301|binding|INFO|Setting lport c05b5089-7fab-41da-bf56-5cf234379b1d down in Southbound
Jan 22 00:00:13 compute-0 nova_compute[182935]: 2026-01-22 00:00:13.968 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:13 compute-0 ovn_controller[95047]: 2026-01-22T00:00:13Z|00302|binding|INFO|Removing iface tapc05b5089-7f ovn-installed in OVS
Jan 22 00:00:13 compute-0 nova_compute[182935]: 2026-01-22 00:00:13.970 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:13.977 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:5a:da 10.100.0.12'], port_security=['fa:16:3e:be:5a:da 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '91ae2c4a-ab10-4954-a382-a87fa6f89551', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '6', 'neutron:security_group_ids': '27f09566-bcbf-4d61-a2a7-838e97ade9dc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c05b5089-7fab-41da-bf56-5cf234379b1d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:13.978 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c05b5089-7fab-41da-bf56-5cf234379b1d in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis
Jan 22 00:00:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:13.979 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:00:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:13.981 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c7c09c-4047-474c-8e09-77fa9e51634d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:13.981 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore
Jan 22 00:00:13 compute-0 nova_compute[182935]: 2026-01-22 00:00:13.990 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:14 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000004f.scope: Deactivated successfully.
Jan 22 00:00:14 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d0000004f.scope: Consumed 8.268s CPU time.
Jan 22 00:00:14 compute-0 systemd-machined[154182]: Machine qemu-41-instance-0000004f terminated.
Jan 22 00:00:14 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223517]: [NOTICE]   (223521) : haproxy version is 2.8.14-c23fe91
Jan 22 00:00:14 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223517]: [NOTICE]   (223521) : path to executable is /usr/sbin/haproxy
Jan 22 00:00:14 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223517]: [WARNING]  (223521) : Exiting Master process...
Jan 22 00:00:14 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223517]: [ALERT]    (223521) : Current worker (223523) exited with code 143 (Terminated)
Jan 22 00:00:14 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[223517]: [WARNING]  (223521) : All workers exited. Exiting... (0)
Jan 22 00:00:14 compute-0 systemd[1]: libpod-f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534.scope: Deactivated successfully.
Jan 22 00:00:14 compute-0 podman[223608]: 2026-01-22 00:00:14.141471978 +0000 UTC m=+0.057587277 container died f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:00:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534-userdata-shm.mount: Deactivated successfully.
Jan 22 00:00:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-30b389e71b4001e39d7950e08a3f4b33c11075f8a29aa3f86f878b7f69548b7e-merged.mount: Deactivated successfully.
Jan 22 00:00:14 compute-0 podman[223608]: 2026-01-22 00:00:14.188790197 +0000 UTC m=+0.104905416 container cleanup f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.196 182939 INFO nova.virt.libvirt.driver [-] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Instance destroyed successfully.
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.198 182939 DEBUG nova.objects.instance [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 91ae2c4a-ab10-4954-a382-a87fa6f89551 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:14 compute-0 systemd[1]: libpod-conmon-f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534.scope: Deactivated successfully.
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.216 182939 DEBUG nova.virt.libvirt.vif [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-21T23:59:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-429300056',display_name='tempest-ServerActionsTestJSON-server-614528583',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-429300056',id=79,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:00:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-0d80tr00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:00:06Z,user_data=None,user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=91ae2c4a-ab10-4954-a382-a87fa6f89551,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.217 182939 DEBUG nova.network.os_vif_util [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "c05b5089-7fab-41da-bf56-5cf234379b1d", "address": "fa:16:3e:be:5a:da", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc05b5089-7f", "ovs_interfaceid": "c05b5089-7fab-41da-bf56-5cf234379b1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.219 182939 DEBUG nova.network.os_vif_util [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.219 182939 DEBUG os_vif [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.221 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.222 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc05b5089-7f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.224 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.226 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.230 182939 INFO os_vif [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:5a:da,bridge_name='br-int',has_traffic_filtering=True,id=c05b5089-7fab-41da-bf56-5cf234379b1d,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc05b5089-7f')
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.232 182939 INFO nova.virt.libvirt.driver [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Deleting instance files /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551_del
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.234 182939 INFO nova.virt.libvirt.driver [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Deletion of /var/lib/nova/instances/91ae2c4a-ab10-4954-a382-a87fa6f89551_del complete
Jan 22 00:00:14 compute-0 podman[223653]: 2026-01-22 00:00:14.272931383 +0000 UTC m=+0.053492989 container remove f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:00:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:14.280 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b2c254e0-344b-430f-8c93-9f45f4ac1b7c]: (4, ('Thu Jan 22 12:00:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534)\nf8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534\nThu Jan 22 12:00:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (f8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534)\nf8dcd580792d529ae86caefcc91f0f604aa0978c8affa6518c17bad328d61534\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:14.282 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[36f304da-fe3c-4bd2-9c2c-384c589b4832]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:14.284 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.287 182939 DEBUG nova.compute.manager [req-8428190a-40c9-4b89-be03-ade548453bef req-9d58fc3e-14c5-46d1-8ca7-b8b398239371 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received event network-vif-unplugged-c05b5089-7fab-41da-bf56-5cf234379b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.288 182939 DEBUG oslo_concurrency.lockutils [req-8428190a-40c9-4b89-be03-ade548453bef req-9d58fc3e-14c5-46d1-8ca7-b8b398239371 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:14 compute-0 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.288 182939 DEBUG oslo_concurrency.lockutils [req-8428190a-40c9-4b89-be03-ade548453bef req-9d58fc3e-14c5-46d1-8ca7-b8b398239371 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.288 182939 DEBUG oslo_concurrency.lockutils [req-8428190a-40c9-4b89-be03-ade548453bef req-9d58fc3e-14c5-46d1-8ca7-b8b398239371 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.289 182939 DEBUG nova.compute.manager [req-8428190a-40c9-4b89-be03-ade548453bef req-9d58fc3e-14c5-46d1-8ca7-b8b398239371 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] No waiting events found dispatching network-vif-unplugged-c05b5089-7fab-41da-bf56-5cf234379b1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.289 182939 DEBUG nova.compute.manager [req-8428190a-40c9-4b89-be03-ade548453bef req-9d58fc3e-14c5-46d1-8ca7-b8b398239371 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received event network-vif-unplugged-c05b5089-7fab-41da-bf56-5cf234379b1d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.290 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.312 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:14.316 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[668e29dc-eb48-465c-8a42-a6adbcdb7529]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:14.331 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[58340007-7f78-45a5-a53a-0f8ae260c834]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:14.332 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ab4bf48e-e02d-45c3-87e9-8283fec79207]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.338 182939 INFO nova.compute.manager [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.339 182939 DEBUG oslo.service.loopingcall [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.340 182939 DEBUG nova.compute.manager [-] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:00:14 compute-0 nova_compute[182935]: 2026-01-22 00:00:14.340 182939 DEBUG nova.network.neutron [-] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:00:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:14.358 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bb35ce6b-5c7f-4f91-9bdd-91cb5eff6f96]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450866, 'reachable_time': 27309, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223668, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:14.362 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:00:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:14.363 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[19acc882-5737-4324-adae-f06533891c53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:14 compute-0 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.221 182939 DEBUG nova.network.neutron [-] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.251 182939 INFO nova.compute.manager [-] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Took 1.91 seconds to deallocate network for instance.
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.326 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.358 182939 DEBUG nova.compute.manager [req-0365119f-61c8-409b-b5ea-53b0deef0725 req-754a2592-1ce7-4882-976e-77bb5dcee492 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received event network-vif-deleted-c05b5089-7fab-41da-bf56-5cf234379b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.364 182939 DEBUG oslo_concurrency.lockutils [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.364 182939 DEBUG oslo_concurrency.lockutils [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.401 182939 DEBUG nova.compute.manager [req-1e626937-4f45-497b-bff0-0ecfe58778c3 req-db20ed10-55c7-4b2b-a289-9c41aedaf6ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.402 182939 DEBUG oslo_concurrency.lockutils [req-1e626937-4f45-497b-bff0-0ecfe58778c3 req-db20ed10-55c7-4b2b-a289-9c41aedaf6ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.402 182939 DEBUG oslo_concurrency.lockutils [req-1e626937-4f45-497b-bff0-0ecfe58778c3 req-db20ed10-55c7-4b2b-a289-9c41aedaf6ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.403 182939 DEBUG oslo_concurrency.lockutils [req-1e626937-4f45-497b-bff0-0ecfe58778c3 req-db20ed10-55c7-4b2b-a289-9c41aedaf6ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.403 182939 DEBUG nova.compute.manager [req-1e626937-4f45-497b-bff0-0ecfe58778c3 req-db20ed10-55c7-4b2b-a289-9c41aedaf6ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] No waiting events found dispatching network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.404 182939 WARNING nova.compute.manager [req-1e626937-4f45-497b-bff0-0ecfe58778c3 req-db20ed10-55c7-4b2b-a289-9c41aedaf6ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Received unexpected event network-vif-plugged-c05b5089-7fab-41da-bf56-5cf234379b1d for instance with vm_state deleted and task_state None.
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.465 182939 DEBUG nova.compute.provider_tree [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.496 182939 DEBUG nova.scheduler.client.report [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.537 182939 DEBUG oslo_concurrency.lockutils [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.592 182939 INFO nova.scheduler.client.report [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Deleted allocations for instance 91ae2c4a-ab10-4954-a382-a87fa6f89551
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.693 182939 DEBUG oslo_concurrency.lockutils [None req-06ad347d-eaed-4f6b-accb-a0e41b608a72 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "91ae2c4a-ab10-4954-a382-a87fa6f89551" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:16 compute-0 podman[223669]: 2026-01-22 00:00:16.731226131 +0000 UTC m=+0.088178064 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:00:16 compute-0 nova_compute[182935]: 2026-01-22 00:00:16.856 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-0 nova_compute[182935]: 2026-01-22 00:00:17.056 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:17.906 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:17 compute-0 nova_compute[182935]: 2026-01-22 00:00:17.906 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:17.909 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:00:19 compute-0 nova_compute[182935]: 2026-01-22 00:00:19.225 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:19.913 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:20 compute-0 podman[223694]: 2026-01-22 00:00:20.758385535 +0000 UTC m=+0.117186882 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:00:21 compute-0 nova_compute[182935]: 2026-01-22 00:00:21.329 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:00:23.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:00:24 compute-0 nova_compute[182935]: 2026-01-22 00:00:24.231 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:26 compute-0 nova_compute[182935]: 2026-01-22 00:00:26.333 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:27 compute-0 nova_compute[182935]: 2026-01-22 00:00:27.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:27 compute-0 nova_compute[182935]: 2026-01-22 00:00:27.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:00:27 compute-0 nova_compute[182935]: 2026-01-22 00:00:27.824 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:00:28 compute-0 podman[223713]: 2026-01-22 00:00:28.702634323 +0000 UTC m=+0.073846849 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 00:00:28 compute-0 podman[223714]: 2026-01-22 00:00:28.73694952 +0000 UTC m=+0.098530664 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:00:29 compute-0 nova_compute[182935]: 2026-01-22 00:00:29.193 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040014.1922765, 91ae2c4a-ab10-4954-a382-a87fa6f89551 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:29 compute-0 nova_compute[182935]: 2026-01-22 00:00:29.194 182939 INFO nova.compute.manager [-] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] VM Stopped (Lifecycle Event)
Jan 22 00:00:29 compute-0 nova_compute[182935]: 2026-01-22 00:00:29.222 182939 DEBUG nova.compute.manager [None req-cfe85d40-328b-4333-9851-2ff996ae0ff1 - - - - - -] [instance: 91ae2c4a-ab10-4954-a382-a87fa6f89551] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:29 compute-0 nova_compute[182935]: 2026-01-22 00:00:29.236 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:29 compute-0 nova_compute[182935]: 2026-01-22 00:00:29.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:29 compute-0 nova_compute[182935]: 2026-01-22 00:00:29.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:29 compute-0 nova_compute[182935]: 2026-01-22 00:00:29.825 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:29 compute-0 nova_compute[182935]: 2026-01-22 00:00:29.826 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:29 compute-0 nova_compute[182935]: 2026-01-22 00:00:29.826 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:29 compute-0 nova_compute[182935]: 2026-01-22 00:00:29.826 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:00:30 compute-0 nova_compute[182935]: 2026-01-22 00:00:30.045 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:00:30 compute-0 nova_compute[182935]: 2026-01-22 00:00:30.046 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5660MB free_disk=73.20291137695312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:00:30 compute-0 nova_compute[182935]: 2026-01-22 00:00:30.047 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:30 compute-0 nova_compute[182935]: 2026-01-22 00:00:30.047 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:30 compute-0 nova_compute[182935]: 2026-01-22 00:00:30.142 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:00:30 compute-0 nova_compute[182935]: 2026-01-22 00:00:30.142 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:00:30 compute-0 nova_compute[182935]: 2026-01-22 00:00:30.174 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:00:30 compute-0 nova_compute[182935]: 2026-01-22 00:00:30.199 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:00:30 compute-0 nova_compute[182935]: 2026-01-22 00:00:30.222 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:00:30 compute-0 nova_compute[182935]: 2026-01-22 00:00:30.222 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:31 compute-0 nova_compute[182935]: 2026-01-22 00:00:31.223 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:31 compute-0 nova_compute[182935]: 2026-01-22 00:00:31.223 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:31 compute-0 nova_compute[182935]: 2026-01-22 00:00:31.224 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:00:31 compute-0 nova_compute[182935]: 2026-01-22 00:00:31.378 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:32 compute-0 nova_compute[182935]: 2026-01-22 00:00:32.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:32 compute-0 nova_compute[182935]: 2026-01-22 00:00:32.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:34 compute-0 nova_compute[182935]: 2026-01-22 00:00:34.240 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:34 compute-0 nova_compute[182935]: 2026-01-22 00:00:34.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:36 compute-0 nova_compute[182935]: 2026-01-22 00:00:36.380 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.155 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "7de8f448-0555-42c9-81e7-142c8735b7ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.156 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.200 182939 DEBUG nova.compute.manager [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.391 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.392 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.401 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.401 182939 INFO nova.compute.claims [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.599 182939 DEBUG nova.compute.provider_tree [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.629 182939 DEBUG nova.scheduler.client.report [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.661 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.662 182939 DEBUG nova.compute.manager [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.758 182939 DEBUG nova.compute.manager [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.759 182939 DEBUG nova.network.neutron [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.829 182939 INFO nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:00:37 compute-0 nova_compute[182935]: 2026-01-22 00:00:37.864 182939 DEBUG nova.compute.manager [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.091 182939 DEBUG nova.compute.manager [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.093 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.093 182939 INFO nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Creating image(s)
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.094 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "/var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.095 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "/var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.095 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "/var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.116 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.202 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.204 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.205 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.223 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.292 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.293 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.326 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.328 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.328 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.384 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.386 182939 DEBUG nova.virt.disk.api [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Checking if we can resize image /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.387 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.416 182939 DEBUG nova.policy [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0cd8c717f19045fcafac6ddb87f30e2c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f4237454503a49a6a90b8deeb01dad23', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.446 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.447 182939 DEBUG nova.virt.disk.api [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Cannot resize image /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.448 182939 DEBUG nova.objects.instance [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lazy-loading 'migration_context' on Instance uuid 7de8f448-0555-42c9-81e7-142c8735b7ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.464 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.465 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Ensure instance console log exists: /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.466 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.467 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:38 compute-0 nova_compute[182935]: 2026-01-22 00:00:38.467 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:39 compute-0 nova_compute[182935]: 2026-01-22 00:00:39.244 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:39 compute-0 podman[223771]: 2026-01-22 00:00:39.737249326 +0000 UTC m=+0.086031432 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:00:39 compute-0 podman[223770]: 2026-01-22 00:00:39.791382159 +0000 UTC m=+0.136811205 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 22 00:00:41 compute-0 nova_compute[182935]: 2026-01-22 00:00:41.383 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:42 compute-0 sshd-session[223821]: Invalid user git from 188.166.69.60 port 41360
Jan 22 00:00:42 compute-0 sshd-session[223821]: Connection closed by invalid user git 188.166.69.60 port 41360 [preauth]
Jan 22 00:00:43 compute-0 nova_compute[182935]: 2026-01-22 00:00:43.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:00:44 compute-0 nova_compute[182935]: 2026-01-22 00:00:44.247 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:44 compute-0 nova_compute[182935]: 2026-01-22 00:00:44.838 182939 DEBUG nova.network.neutron [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Successfully created port: e6a7cce8-73ea-4021-ba99-a6d25adf5e69 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:00:46 compute-0 nova_compute[182935]: 2026-01-22 00:00:46.385 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:47 compute-0 podman[223823]: 2026-01-22 00:00:47.70077489 +0000 UTC m=+0.066388610 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:00:48 compute-0 nova_compute[182935]: 2026-01-22 00:00:48.652 182939 DEBUG nova.network.neutron [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Successfully updated port: e6a7cce8-73ea-4021-ba99-a6d25adf5e69 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:00:48 compute-0 nova_compute[182935]: 2026-01-22 00:00:48.679 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:00:48 compute-0 nova_compute[182935]: 2026-01-22 00:00:48.679 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquired lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:00:48 compute-0 nova_compute[182935]: 2026-01-22 00:00:48.680 182939 DEBUG nova.network.neutron [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:00:48 compute-0 nova_compute[182935]: 2026-01-22 00:00:48.971 182939 DEBUG nova.compute.manager [req-082e9b49-6cc1-4945-84d5-5bad586d256b req-d2274ce1-521b-458a-8706-1e8e3968fcb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received event network-changed-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:48 compute-0 nova_compute[182935]: 2026-01-22 00:00:48.971 182939 DEBUG nova.compute.manager [req-082e9b49-6cc1-4945-84d5-5bad586d256b req-d2274ce1-521b-458a-8706-1e8e3968fcb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Refreshing instance network info cache due to event network-changed-e6a7cce8-73ea-4021-ba99-a6d25adf5e69. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:00:48 compute-0 nova_compute[182935]: 2026-01-22 00:00:48.972 182939 DEBUG oslo_concurrency.lockutils [req-082e9b49-6cc1-4945-84d5-5bad586d256b req-d2274ce1-521b-458a-8706-1e8e3968fcb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:00:49 compute-0 nova_compute[182935]: 2026-01-22 00:00:49.074 182939 DEBUG nova.network.neutron [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:00:49 compute-0 nova_compute[182935]: 2026-01-22 00:00:49.251 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.389 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.665 182939 DEBUG nova.network.neutron [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updating instance_info_cache with network_info: [{"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.690 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Releasing lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.691 182939 DEBUG nova.compute.manager [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Instance network_info: |[{"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.692 182939 DEBUG oslo_concurrency.lockutils [req-082e9b49-6cc1-4945-84d5-5bad586d256b req-d2274ce1-521b-458a-8706-1e8e3968fcb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.692 182939 DEBUG nova.network.neutron [req-082e9b49-6cc1-4945-84d5-5bad586d256b req-d2274ce1-521b-458a-8706-1e8e3968fcb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Refreshing network info cache for port e6a7cce8-73ea-4021-ba99-a6d25adf5e69 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.696 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Start _get_guest_xml network_info=[{"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.704 182939 WARNING nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.713 182939 DEBUG nova.virt.libvirt.host [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.714 182939 DEBUG nova.virt.libvirt.host [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:00:51 compute-0 podman[223848]: 2026-01-22 00:00:51.713683451 +0000 UTC m=+0.083674846 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.722 182939 DEBUG nova.virt.libvirt.host [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.723 182939 DEBUG nova.virt.libvirt.host [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.724 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.725 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.726 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.726 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.726 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.727 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.727 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.728 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.728 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.729 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.729 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.730 182939 DEBUG nova.virt.hardware [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.736 182939 DEBUG nova.virt.libvirt.vif [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:00:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-426630729',display_name='tempest-AttachInterfacesUnderV243Test-server-426630729',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-426630729',id=82,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBClkjO2PipE/NzIMrXoJQUC/50ZpPPhY/ZWehwNZdobkecSlDQNqypIlTd1kwP4aPQ/TFbBbIK8DQe4xqInLO12K8Of3dJmEdbpzxSXVzfNZd6u/D6zsg34JOvXcUBHbiQ==',key_name='tempest-keypair-1337512338',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f4237454503a49a6a90b8deeb01dad23',ramdisk_id='',reservation_id='r-w7diqr50',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-1363841808',owner_user_name='tempest-AttachInterfacesUnderV243Test-1363841808-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:00:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0cd8c717f19045fcafac6ddb87f30e2c',uuid=7de8f448-0555-42c9-81e7-142c8735b7ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.736 182939 DEBUG nova.network.os_vif_util [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Converting VIF {"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.738 182939 DEBUG nova.network.os_vif_util [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:13:b4,bridge_name='br-int',has_traffic_filtering=True,id=e6a7cce8-73ea-4021-ba99-a6d25adf5e69,network=Network(6a985fad-fdc1-4c5b-a880-f267b92a4ef8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6a7cce8-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.739 182939 DEBUG nova.objects.instance [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7de8f448-0555-42c9-81e7-142c8735b7ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.769 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:00:51 compute-0 nova_compute[182935]:   <uuid>7de8f448-0555-42c9-81e7-142c8735b7ad</uuid>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   <name>instance-00000052</name>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <nova:name>tempest-AttachInterfacesUnderV243Test-server-426630729</nova:name>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:00:51</nova:creationTime>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:00:51 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:00:51 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:00:51 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:00:51 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:00:51 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:00:51 compute-0 nova_compute[182935]:         <nova:user uuid="0cd8c717f19045fcafac6ddb87f30e2c">tempest-AttachInterfacesUnderV243Test-1363841808-project-member</nova:user>
Jan 22 00:00:51 compute-0 nova_compute[182935]:         <nova:project uuid="f4237454503a49a6a90b8deeb01dad23">tempest-AttachInterfacesUnderV243Test-1363841808</nova:project>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:00:51 compute-0 nova_compute[182935]:         <nova:port uuid="e6a7cce8-73ea-4021-ba99-a6d25adf5e69">
Jan 22 00:00:51 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <system>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <entry name="serial">7de8f448-0555-42c9-81e7-142c8735b7ad</entry>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <entry name="uuid">7de8f448-0555-42c9-81e7-142c8735b7ad</entry>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     </system>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   <os>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   </os>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   <features>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   </features>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk.config"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:86:13:b4"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <target dev="tape6a7cce8-73"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/console.log" append="off"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <video>
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     </video>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:00:51 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:00:51 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:00:51 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:00:51 compute-0 nova_compute[182935]: </domain>
Jan 22 00:00:51 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.771 182939 DEBUG nova.compute.manager [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Preparing to wait for external event network-vif-plugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.772 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.772 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.773 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.774 182939 DEBUG nova.virt.libvirt.vif [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:00:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-426630729',display_name='tempest-AttachInterfacesUnderV243Test-server-426630729',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-426630729',id=82,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBClkjO2PipE/NzIMrXoJQUC/50ZpPPhY/ZWehwNZdobkecSlDQNqypIlTd1kwP4aPQ/TFbBbIK8DQe4xqInLO12K8Of3dJmEdbpzxSXVzfNZd6u/D6zsg34JOvXcUBHbiQ==',key_name='tempest-keypair-1337512338',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f4237454503a49a6a90b8deeb01dad23',ramdisk_id='',reservation_id='r-w7diqr50',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-1363841808',owner_user_name='tempest-AttachInterfacesUnderV243Test-1363841808-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:00:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0cd8c717f19045fcafac6ddb87f30e2c',uuid=7de8f448-0555-42c9-81e7-142c8735b7ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.775 182939 DEBUG nova.network.os_vif_util [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Converting VIF {"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.776 182939 DEBUG nova.network.os_vif_util [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:13:b4,bridge_name='br-int',has_traffic_filtering=True,id=e6a7cce8-73ea-4021-ba99-a6d25adf5e69,network=Network(6a985fad-fdc1-4c5b-a880-f267b92a4ef8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6a7cce8-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.776 182939 DEBUG os_vif [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:13:b4,bridge_name='br-int',has_traffic_filtering=True,id=e6a7cce8-73ea-4021-ba99-a6d25adf5e69,network=Network(6a985fad-fdc1-4c5b-a880-f267b92a4ef8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6a7cce8-73') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.777 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.778 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.779 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.786 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6a7cce8-73, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.787 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape6a7cce8-73, col_values=(('external_ids', {'iface-id': 'e6a7cce8-73ea-4021-ba99-a6d25adf5e69', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:13:b4', 'vm-uuid': '7de8f448-0555-42c9-81e7-142c8735b7ad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.789 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:51 compute-0 NetworkManager[55139]: <info>  [1769040051.7913] manager: (tape6a7cce8-73): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.793 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.801 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.802 182939 INFO os_vif [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:13:b4,bridge_name='br-int',has_traffic_filtering=True,id=e6a7cce8-73ea-4021-ba99-a6d25adf5e69,network=Network(6a985fad-fdc1-4c5b-a880-f267b92a4ef8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6a7cce8-73')
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.866 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.867 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.867 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] No VIF found with MAC fa:16:3e:86:13:b4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:00:51 compute-0 nova_compute[182935]: 2026-01-22 00:00:51.868 182939 INFO nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Using config drive
Jan 22 00:00:52 compute-0 nova_compute[182935]: 2026-01-22 00:00:52.934 182939 INFO nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Creating config drive at /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk.config
Jan 22 00:00:52 compute-0 nova_compute[182935]: 2026-01-22 00:00:52.939 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn2f7xv_y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.098 182939 DEBUG oslo_concurrency.processutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn2f7xv_y" returned: 0 in 0.158s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:00:53 compute-0 kernel: tape6a7cce8-73: entered promiscuous mode
Jan 22 00:00:53 compute-0 NetworkManager[55139]: <info>  [1769040053.2100] manager: (tape6a7cce8-73): new Tun device (/org/freedesktop/NetworkManager/Devices/135)
Jan 22 00:00:53 compute-0 ovn_controller[95047]: 2026-01-22T00:00:53Z|00303|binding|INFO|Claiming lport e6a7cce8-73ea-4021-ba99-a6d25adf5e69 for this chassis.
Jan 22 00:00:53 compute-0 ovn_controller[95047]: 2026-01-22T00:00:53Z|00304|binding|INFO|e6a7cce8-73ea-4021-ba99-a6d25adf5e69: Claiming fa:16:3e:86:13:b4 10.100.0.10
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.212 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.220 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.224 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.236 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:13:b4 10.100.0.10'], port_security=['fa:16:3e:86:13:b4 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7de8f448-0555-42c9-81e7-142c8735b7ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a985fad-fdc1-4c5b-a880-f267b92a4ef8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f4237454503a49a6a90b8deeb01dad23', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6a419fe9-5ed3-49bf-a3d7-e29fe63eb872', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e4b550d-2592-4fc9-9a0d-718798a79652, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=e6a7cce8-73ea-4021-ba99-a6d25adf5e69) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.239 104408 INFO neutron.agent.ovn.metadata.agent [-] Port e6a7cce8-73ea-4021-ba99-a6d25adf5e69 in datapath 6a985fad-fdc1-4c5b-a880-f267b92a4ef8 bound to our chassis
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.241 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a985fad-fdc1-4c5b-a880-f267b92a4ef8
Jan 22 00:00:53 compute-0 systemd-machined[154182]: New machine qemu-42-instance-00000052.
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.263 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7fc8d9-a223-4396-a1de-8958c3adc9d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.264 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a985fad-f1 in ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.268 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a985fad-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.268 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd976a3-ae8f-403d-8c8f-ecf26e1c19ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.270 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[05bc35c7-b956-460c-a4a2-93eaf9d0942f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.290 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca12db6-447c-4271-ab24-2552f7b87c8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.295 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:53 compute-0 ovn_controller[95047]: 2026-01-22T00:00:53Z|00305|binding|INFO|Setting lport e6a7cce8-73ea-4021-ba99-a6d25adf5e69 ovn-installed in OVS
Jan 22 00:00:53 compute-0 ovn_controller[95047]: 2026-01-22T00:00:53Z|00306|binding|INFO|Setting lport e6a7cce8-73ea-4021-ba99-a6d25adf5e69 up in Southbound
Jan 22 00:00:53 compute-0 systemd[1]: Started Virtual Machine qemu-42-instance-00000052.
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.301 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.317 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[927bc574-b99c-42c2-9a08-3093fb5b04cf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 systemd-udevd[223891]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:00:53 compute-0 NetworkManager[55139]: <info>  [1769040053.3588] device (tape6a7cce8-73): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:00:53 compute-0 NetworkManager[55139]: <info>  [1769040053.3613] device (tape6a7cce8-73): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.383 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[ede67e15-267a-48d2-b60f-b17fffc3a2a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.390 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[213cd187-4985-4c57-a413-af166c80cd3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 NetworkManager[55139]: <info>  [1769040053.3933] manager: (tap6a985fad-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/136)
Jan 22 00:00:53 compute-0 systemd-udevd[223897]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.455 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[226f0de7-2f0b-491b-b07d-586b420a7b2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.459 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7bccb39f-7e58-46c7-8aa5-ee065e320215]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 NetworkManager[55139]: <info>  [1769040053.5020] device (tap6a985fad-f0): carrier: link connected
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.510 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[38a093fe-29bf-4069-a828-9e4d5f45a8cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.538 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a238368e-c7ce-4ff4-97e2-ea74c9f4cd3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a985fad-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:62:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455624, 'reachable_time': 17321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223922, 'error': None, 'target': 'ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.561 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[04bdb73b-e471-4d65-8fdb-34ade13909ff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:62ea'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455624, 'tstamp': 455624}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223927, 'error': None, 'target': 'ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.585 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bb165f4c-f0a2-4045-aa90-656f9814f2b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a985fad-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:62:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455624, 'reachable_time': 17321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223928, 'error': None, 'target': 'ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.632 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[56dd120a-75cc-427c-ad03-c2b690c789f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.637 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040053.6370354, 7de8f448-0555-42c9-81e7-142c8735b7ad => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.638 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] VM Started (Lifecycle Event)
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.670 182939 DEBUG nova.compute.manager [req-e6c119c0-72ed-4947-9f12-05297dea856c req-aca5abad-71d3-44a6-b869-d17654cefc97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received event network-vif-plugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.671 182939 DEBUG oslo_concurrency.lockutils [req-e6c119c0-72ed-4947-9f12-05297dea856c req-aca5abad-71d3-44a6-b869-d17654cefc97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.671 182939 DEBUG oslo_concurrency.lockutils [req-e6c119c0-72ed-4947-9f12-05297dea856c req-aca5abad-71d3-44a6-b869-d17654cefc97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.671 182939 DEBUG oslo_concurrency.lockutils [req-e6c119c0-72ed-4947-9f12-05297dea856c req-aca5abad-71d3-44a6-b869-d17654cefc97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.671 182939 DEBUG nova.compute.manager [req-e6c119c0-72ed-4947-9f12-05297dea856c req-aca5abad-71d3-44a6-b869-d17654cefc97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Processing event network-vif-plugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.672 182939 DEBUG nova.compute.manager [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.678 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.685 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.691 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.695 182939 INFO nova.virt.libvirt.driver [-] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Instance spawned successfully.
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.697 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.729 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.730 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040053.6371698, 7de8f448-0555-42c9-81e7-142c8735b7ad => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.730 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] VM Paused (Lifecycle Event)
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.740 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.741 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.742 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.742 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.743 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.743 182939 DEBUG nova.virt.libvirt.driver [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.745 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fc23513f-356b-4a01-babe-f1b1765d26d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.747 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a985fad-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.748 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.749 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a985fad-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:53 compute-0 kernel: tap6a985fad-f0: entered promiscuous mode
Jan 22 00:00:53 compute-0 NetworkManager[55139]: <info>  [1769040053.7539] manager: (tap6a985fad-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.756 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a985fad-f0, col_values=(('external_ids', {'iface-id': 'e1ac08be-ab54-4ca7-b9c2-89769dc0457a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:00:53 compute-0 ovn_controller[95047]: 2026-01-22T00:00:53Z|00307|binding|INFO|Releasing lport e1ac08be-ab54-4ca7-b9c2-89769dc0457a from this chassis (sb_readonly=0)
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.759 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.760 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a985fad-fdc1-4c5b-a880-f267b92a4ef8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a985fad-fdc1-4c5b-a880-f267b92a4ef8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.762 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9027dbcb-a02b-4977-89da-b18455d7b01b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.763 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-6a985fad-fdc1-4c5b-a880-f267b92a4ef8
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/6a985fad-fdc1-4c5b-a880-f267b92a4ef8.pid.haproxy
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 6a985fad-fdc1-4c5b-a880-f267b92a4ef8
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:00:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:00:53.764 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8', 'env', 'PROCESS_TAG=haproxy-6a985fad-fdc1-4c5b-a880-f267b92a4ef8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a985fad-fdc1-4c5b-a880-f267b92a4ef8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.767 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.769 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.774 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040053.6777499, 7de8f448-0555-42c9-81e7-142c8735b7ad => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.775 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] VM Resumed (Lifecycle Event)
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.815 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.820 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.844 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.866 182939 INFO nova.compute.manager [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Took 15.77 seconds to spawn the instance on the hypervisor.
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.866 182939 DEBUG nova.compute.manager [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:00:53 compute-0 nova_compute[182935]: 2026-01-22 00:00:53.983 182939 INFO nova.compute.manager [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Took 16.67 seconds to build instance.
Jan 22 00:00:54 compute-0 nova_compute[182935]: 2026-01-22 00:00:54.015 182939 DEBUG oslo_concurrency.lockutils [None req-e81dca65-f133-4f4f-977e-feac713b9d6e 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:54 compute-0 podman[223961]: 2026-01-22 00:00:54.222635348 +0000 UTC m=+0.068515501 container create 0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:00:54 compute-0 systemd[1]: Started libpod-conmon-0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4.scope.
Jan 22 00:00:54 compute-0 podman[223961]: 2026-01-22 00:00:54.179309205 +0000 UTC m=+0.025189358 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:00:54 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:00:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65d4cc71c81674c71ee4fe1a4556ddf833f4cb533618c443009f7cca5cb6d987/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:00:54 compute-0 podman[223961]: 2026-01-22 00:00:54.325509714 +0000 UTC m=+0.171389887 container init 0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:00:54 compute-0 podman[223961]: 2026-01-22 00:00:54.331102278 +0000 UTC m=+0.176982421 container start 0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:00:54 compute-0 neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8[223976]: [NOTICE]   (223980) : New worker (223982) forked
Jan 22 00:00:54 compute-0 neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8[223976]: [NOTICE]   (223980) : Loading success.
Jan 22 00:00:54 compute-0 nova_compute[182935]: 2026-01-22 00:00:54.775 182939 DEBUG nova.network.neutron [req-082e9b49-6cc1-4945-84d5-5bad586d256b req-d2274ce1-521b-458a-8706-1e8e3968fcb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updated VIF entry in instance network info cache for port e6a7cce8-73ea-4021-ba99-a6d25adf5e69. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:00:54 compute-0 nova_compute[182935]: 2026-01-22 00:00:54.776 182939 DEBUG nova.network.neutron [req-082e9b49-6cc1-4945-84d5-5bad586d256b req-d2274ce1-521b-458a-8706-1e8e3968fcb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updating instance_info_cache with network_info: [{"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:00:55 compute-0 nova_compute[182935]: 2026-01-22 00:00:55.202 182939 DEBUG oslo_concurrency.lockutils [req-082e9b49-6cc1-4945-84d5-5bad586d256b req-d2274ce1-521b-458a-8706-1e8e3968fcb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:00:55 compute-0 nova_compute[182935]: 2026-01-22 00:00:55.996 182939 DEBUG nova.compute.manager [req-54a5d3a0-6cc4-4bb2-b4aa-a27af922ed83 req-c7f6c582-5c31-4b69-a8d6-4227b2c4b505 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received event network-vif-plugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:55 compute-0 nova_compute[182935]: 2026-01-22 00:00:55.997 182939 DEBUG oslo_concurrency.lockutils [req-54a5d3a0-6cc4-4bb2-b4aa-a27af922ed83 req-c7f6c582-5c31-4b69-a8d6-4227b2c4b505 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:00:55 compute-0 nova_compute[182935]: 2026-01-22 00:00:55.998 182939 DEBUG oslo_concurrency.lockutils [req-54a5d3a0-6cc4-4bb2-b4aa-a27af922ed83 req-c7f6c582-5c31-4b69-a8d6-4227b2c4b505 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:00:55 compute-0 nova_compute[182935]: 2026-01-22 00:00:55.998 182939 DEBUG oslo_concurrency.lockutils [req-54a5d3a0-6cc4-4bb2-b4aa-a27af922ed83 req-c7f6c582-5c31-4b69-a8d6-4227b2c4b505 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:00:55 compute-0 nova_compute[182935]: 2026-01-22 00:00:55.999 182939 DEBUG nova.compute.manager [req-54a5d3a0-6cc4-4bb2-b4aa-a27af922ed83 req-c7f6c582-5c31-4b69-a8d6-4227b2c4b505 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] No waiting events found dispatching network-vif-plugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:00:56 compute-0 nova_compute[182935]: 2026-01-22 00:00:56.000 182939 WARNING nova.compute.manager [req-54a5d3a0-6cc4-4bb2-b4aa-a27af922ed83 req-c7f6c582-5c31-4b69-a8d6-4227b2c4b505 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received unexpected event network-vif-plugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 for instance with vm_state active and task_state None.
Jan 22 00:00:56 compute-0 nova_compute[182935]: 2026-01-22 00:00:56.391 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:56 compute-0 nova_compute[182935]: 2026-01-22 00:00:56.790 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:57 compute-0 sshd-session[223991]: Received disconnect from 91.224.92.190 port 15556:11:  [preauth]
Jan 22 00:00:57 compute-0 sshd-session[223991]: Disconnected from authenticating user root 91.224.92.190 port 15556 [preauth]
Jan 22 00:00:57 compute-0 nova_compute[182935]: 2026-01-22 00:00:57.860 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:57 compute-0 NetworkManager[55139]: <info>  [1769040057.8659] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Jan 22 00:00:57 compute-0 NetworkManager[55139]: <info>  [1769040057.8677] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Jan 22 00:00:58 compute-0 nova_compute[182935]: 2026-01-22 00:00:58.020 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:58 compute-0 ovn_controller[95047]: 2026-01-22T00:00:58Z|00308|binding|INFO|Releasing lport e1ac08be-ab54-4ca7-b9c2-89769dc0457a from this chassis (sb_readonly=0)
Jan 22 00:00:58 compute-0 nova_compute[182935]: 2026-01-22 00:00:58.044 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:58 compute-0 nova_compute[182935]: 2026-01-22 00:00:58.836 182939 DEBUG nova.compute.manager [req-6f72c9e6-ae4d-431c-b664-86ba2550fa18 req-d0db0c16-94af-4e9d-afd1-90d6fd6abca3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received event network-changed-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:00:58 compute-0 nova_compute[182935]: 2026-01-22 00:00:58.838 182939 DEBUG nova.compute.manager [req-6f72c9e6-ae4d-431c-b664-86ba2550fa18 req-d0db0c16-94af-4e9d-afd1-90d6fd6abca3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Refreshing instance network info cache due to event network-changed-e6a7cce8-73ea-4021-ba99-a6d25adf5e69. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:00:58 compute-0 nova_compute[182935]: 2026-01-22 00:00:58.838 182939 DEBUG oslo_concurrency.lockutils [req-6f72c9e6-ae4d-431c-b664-86ba2550fa18 req-d0db0c16-94af-4e9d-afd1-90d6fd6abca3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:00:58 compute-0 nova_compute[182935]: 2026-01-22 00:00:58.839 182939 DEBUG oslo_concurrency.lockutils [req-6f72c9e6-ae4d-431c-b664-86ba2550fa18 req-d0db0c16-94af-4e9d-afd1-90d6fd6abca3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:00:58 compute-0 nova_compute[182935]: 2026-01-22 00:00:58.839 182939 DEBUG nova.network.neutron [req-6f72c9e6-ae4d-431c-b664-86ba2550fa18 req-d0db0c16-94af-4e9d-afd1-90d6fd6abca3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Refreshing network info cache for port e6a7cce8-73ea-4021-ba99-a6d25adf5e69 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:00:58 compute-0 nova_compute[182935]: 2026-01-22 00:00:58.957 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:00:59 compute-0 podman[223994]: 2026-01-22 00:00:59.753246554 +0000 UTC m=+0.110635074 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, distribution-scope=public, vcs-type=git, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 00:00:59 compute-0 podman[223995]: 2026-01-22 00:00:59.798933914 +0000 UTC m=+0.151194400 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute)
Jan 22 00:01:00 compute-0 nova_compute[182935]: 2026-01-22 00:01:00.521 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:01 compute-0 nova_compute[182935]: 2026-01-22 00:01:01.397 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:01 compute-0 CROND[224034]: (root) CMD (run-parts /etc/cron.hourly)
Jan 22 00:01:01 compute-0 run-parts[224037]: (/etc/cron.hourly) starting 0anacron
Jan 22 00:01:01 compute-0 anacron[224045]: Anacron started on 2026-01-22
Jan 22 00:01:01 compute-0 anacron[224045]: Job `cron.monthly' locked by another anacron - skipping
Jan 22 00:01:01 compute-0 anacron[224045]: Normal exit (0 jobs run)
Jan 22 00:01:01 compute-0 run-parts[224047]: (/etc/cron.hourly) finished 0anacron
Jan 22 00:01:01 compute-0 CROND[224033]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 22 00:01:01 compute-0 nova_compute[182935]: 2026-01-22 00:01:01.800 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:02 compute-0 nova_compute[182935]: 2026-01-22 00:01:02.370 182939 DEBUG nova.network.neutron [req-6f72c9e6-ae4d-431c-b664-86ba2550fa18 req-d0db0c16-94af-4e9d-afd1-90d6fd6abca3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updated VIF entry in instance network info cache for port e6a7cce8-73ea-4021-ba99-a6d25adf5e69. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:01:02 compute-0 nova_compute[182935]: 2026-01-22 00:01:02.370 182939 DEBUG nova.network.neutron [req-6f72c9e6-ae4d-431c-b664-86ba2550fa18 req-d0db0c16-94af-4e9d-afd1-90d6fd6abca3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updating instance_info_cache with network_info: [{"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:02 compute-0 nova_compute[182935]: 2026-01-22 00:01:02.404 182939 DEBUG oslo_concurrency.lockutils [req-6f72c9e6-ae4d-431c-b664-86ba2550fa18 req-d0db0c16-94af-4e9d-afd1-90d6fd6abca3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:03.194 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:03.197 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:03.198 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:05 compute-0 nova_compute[182935]: 2026-01-22 00:01:05.434 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:06 compute-0 nova_compute[182935]: 2026-01-22 00:01:06.401 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:06 compute-0 ovn_controller[95047]: 2026-01-22T00:01:06Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:86:13:b4 10.100.0.10
Jan 22 00:01:06 compute-0 ovn_controller[95047]: 2026-01-22T00:01:06Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:86:13:b4 10.100.0.10
Jan 22 00:01:06 compute-0 nova_compute[182935]: 2026-01-22 00:01:06.804 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:10 compute-0 podman[224065]: 2026-01-22 00:01:10.733494098 +0000 UTC m=+0.091362860 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:01:10 compute-0 podman[224064]: 2026-01-22 00:01:10.795775267 +0000 UTC m=+0.153602418 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:01:11 compute-0 nova_compute[182935]: 2026-01-22 00:01:11.405 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:11 compute-0 nova_compute[182935]: 2026-01-22 00:01:11.806 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:15 compute-0 ovn_controller[95047]: 2026-01-22T00:01:15Z|00309|binding|INFO|Releasing lport e1ac08be-ab54-4ca7-b9c2-89769dc0457a from this chassis (sb_readonly=0)
Jan 22 00:01:15 compute-0 nova_compute[182935]: 2026-01-22 00:01:15.439 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:16 compute-0 nova_compute[182935]: 2026-01-22 00:01:16.429 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:16 compute-0 nova_compute[182935]: 2026-01-22 00:01:16.809 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:18 compute-0 nova_compute[182935]: 2026-01-22 00:01:18.178 182939 DEBUG nova.objects.instance [None req-1a8db5a9-50c1-4692-a54b-86fc72a9164a 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lazy-loading 'flavor' on Instance uuid 7de8f448-0555-42c9-81e7-142c8735b7ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:18 compute-0 nova_compute[182935]: 2026-01-22 00:01:18.233 182939 DEBUG oslo_concurrency.lockutils [None req-1a8db5a9-50c1-4692-a54b-86fc72a9164a 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:18 compute-0 nova_compute[182935]: 2026-01-22 00:01:18.234 182939 DEBUG oslo_concurrency.lockutils [None req-1a8db5a9-50c1-4692-a54b-86fc72a9164a 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquired lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:18 compute-0 podman[224110]: 2026-01-22 00:01:18.705268159 +0000 UTC m=+0.075787425 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:01:19 compute-0 ovn_controller[95047]: 2026-01-22T00:01:19Z|00310|binding|INFO|Releasing lport e1ac08be-ab54-4ca7-b9c2-89769dc0457a from this chassis (sb_readonly=0)
Jan 22 00:01:19 compute-0 nova_compute[182935]: 2026-01-22 00:01:19.208 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-0 nova_compute[182935]: 2026-01-22 00:01:21.177 182939 DEBUG nova.network.neutron [None req-1a8db5a9-50c1-4692-a54b-86fc72a9164a 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:01:21 compute-0 nova_compute[182935]: 2026-01-22 00:01:21.323 182939 DEBUG nova.compute.manager [req-c4af6dc6-5c73-47cf-9fb0-87f8d3f7d0b5 req-d9c8b4cd-7d7e-4af7-99f6-428ba0620bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received event network-changed-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:21 compute-0 nova_compute[182935]: 2026-01-22 00:01:21.324 182939 DEBUG nova.compute.manager [req-c4af6dc6-5c73-47cf-9fb0-87f8d3f7d0b5 req-d9c8b4cd-7d7e-4af7-99f6-428ba0620bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Refreshing instance network info cache due to event network-changed-e6a7cce8-73ea-4021-ba99-a6d25adf5e69. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:01:21 compute-0 nova_compute[182935]: 2026-01-22 00:01:21.324 182939 DEBUG oslo_concurrency.lockutils [req-c4af6dc6-5c73-47cf-9fb0-87f8d3f7d0b5 req-d9c8b4cd-7d7e-4af7-99f6-428ba0620bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:21 compute-0 nova_compute[182935]: 2026-01-22 00:01:21.433 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:21 compute-0 nova_compute[182935]: 2026-01-22 00:01:21.812 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:22 compute-0 podman[224133]: 2026-01-22 00:01:22.68866847 +0000 UTC m=+0.060613181 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:01:23 compute-0 nova_compute[182935]: 2026-01-22 00:01:23.456 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:23.457 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:01:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:23.459 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:01:23 compute-0 nova_compute[182935]: 2026-01-22 00:01:23.512 182939 DEBUG nova.network.neutron [None req-1a8db5a9-50c1-4692-a54b-86fc72a9164a 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updating instance_info_cache with network_info: [{"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:23 compute-0 nova_compute[182935]: 2026-01-22 00:01:23.540 182939 DEBUG oslo_concurrency.lockutils [None req-1a8db5a9-50c1-4692-a54b-86fc72a9164a 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Releasing lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:23 compute-0 nova_compute[182935]: 2026-01-22 00:01:23.541 182939 DEBUG nova.compute.manager [None req-1a8db5a9-50c1-4692-a54b-86fc72a9164a 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Jan 22 00:01:23 compute-0 nova_compute[182935]: 2026-01-22 00:01:23.541 182939 DEBUG nova.compute.manager [None req-1a8db5a9-50c1-4692-a54b-86fc72a9164a 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] network_info to inject: |[{"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Jan 22 00:01:23 compute-0 nova_compute[182935]: 2026-01-22 00:01:23.546 182939 DEBUG oslo_concurrency.lockutils [req-c4af6dc6-5c73-47cf-9fb0-87f8d3f7d0b5 req-d9c8b4cd-7d7e-4af7-99f6-428ba0620bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:23 compute-0 nova_compute[182935]: 2026-01-22 00:01:23.547 182939 DEBUG nova.network.neutron [req-c4af6dc6-5c73-47cf-9fb0-87f8d3f7d0b5 req-d9c8b4cd-7d7e-4af7-99f6-428ba0620bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Refreshing network info cache for port e6a7cce8-73ea-4021-ba99-a6d25adf5e69 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:01:25 compute-0 nova_compute[182935]: 2026-01-22 00:01:25.393 182939 DEBUG nova.objects.instance [None req-95580eed-aa5c-43a1-9fda-bb5e54525f71 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lazy-loading 'flavor' on Instance uuid 7de8f448-0555-42c9-81e7-142c8735b7ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:25 compute-0 nova_compute[182935]: 2026-01-22 00:01:25.442 182939 DEBUG oslo_concurrency.lockutils [None req-95580eed-aa5c-43a1-9fda-bb5e54525f71 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:25 compute-0 sshd-session[224153]: Invalid user git from 188.166.69.60 port 36582
Jan 22 00:01:26 compute-0 nova_compute[182935]: 2026-01-22 00:01:26.032 182939 DEBUG nova.network.neutron [req-c4af6dc6-5c73-47cf-9fb0-87f8d3f7d0b5 req-d9c8b4cd-7d7e-4af7-99f6-428ba0620bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updated VIF entry in instance network info cache for port e6a7cce8-73ea-4021-ba99-a6d25adf5e69. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:01:26 compute-0 nova_compute[182935]: 2026-01-22 00:01:26.033 182939 DEBUG nova.network.neutron [req-c4af6dc6-5c73-47cf-9fb0-87f8d3f7d0b5 req-d9c8b4cd-7d7e-4af7-99f6-428ba0620bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updating instance_info_cache with network_info: [{"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:26 compute-0 nova_compute[182935]: 2026-01-22 00:01:26.056 182939 DEBUG oslo_concurrency.lockutils [req-c4af6dc6-5c73-47cf-9fb0-87f8d3f7d0b5 req-d9c8b4cd-7d7e-4af7-99f6-428ba0620bf9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:26 compute-0 nova_compute[182935]: 2026-01-22 00:01:26.057 182939 DEBUG oslo_concurrency.lockutils [None req-95580eed-aa5c-43a1-9fda-bb5e54525f71 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquired lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:26 compute-0 sshd-session[224153]: Connection closed by invalid user git 188.166.69.60 port 36582 [preauth]
Jan 22 00:01:26 compute-0 nova_compute[182935]: 2026-01-22 00:01:26.435 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:26 compute-0 nova_compute[182935]: 2026-01-22 00:01:26.815 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:27 compute-0 nova_compute[182935]: 2026-01-22 00:01:27.561 182939 DEBUG nova.network.neutron [None req-95580eed-aa5c-43a1-9fda-bb5e54525f71 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:01:27 compute-0 ovn_controller[95047]: 2026-01-22T00:01:27Z|00311|binding|INFO|Releasing lport e1ac08be-ab54-4ca7-b9c2-89769dc0457a from this chassis (sb_readonly=0)
Jan 22 00:01:27 compute-0 nova_compute[182935]: 2026-01-22 00:01:27.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:27 compute-0 nova_compute[182935]: 2026-01-22 00:01:27.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:01:27 compute-0 nova_compute[182935]: 2026-01-22 00:01:27.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:01:27 compute-0 nova_compute[182935]: 2026-01-22 00:01:27.810 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:28 compute-0 nova_compute[182935]: 2026-01-22 00:01:28.185 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:28 compute-0 nova_compute[182935]: 2026-01-22 00:01:28.971 182939 DEBUG nova.compute.manager [req-ce202295-fd84-4ca8-8529-c1f1cf6c6a73 req-6d4984a7-cffe-48da-ba8b-83807c61d3f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received event network-changed-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:28 compute-0 nova_compute[182935]: 2026-01-22 00:01:28.972 182939 DEBUG nova.compute.manager [req-ce202295-fd84-4ca8-8529-c1f1cf6c6a73 req-6d4984a7-cffe-48da-ba8b-83807c61d3f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Refreshing instance network info cache due to event network-changed-e6a7cce8-73ea-4021-ba99-a6d25adf5e69. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:01:28 compute-0 nova_compute[182935]: 2026-01-22 00:01:28.972 182939 DEBUG oslo_concurrency.lockutils [req-ce202295-fd84-4ca8-8529-c1f1cf6c6a73 req-6d4984a7-cffe-48da-ba8b-83807c61d3f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:01:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:30.461 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:30 compute-0 nova_compute[182935]: 2026-01-22 00:01:30.565 182939 DEBUG nova.network.neutron [None req-95580eed-aa5c-43a1-9fda-bb5e54525f71 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updating instance_info_cache with network_info: [{"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:30 compute-0 nova_compute[182935]: 2026-01-22 00:01:30.590 182939 DEBUG oslo_concurrency.lockutils [None req-95580eed-aa5c-43a1-9fda-bb5e54525f71 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Releasing lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:30 compute-0 nova_compute[182935]: 2026-01-22 00:01:30.590 182939 DEBUG nova.compute.manager [None req-95580eed-aa5c-43a1-9fda-bb5e54525f71 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Jan 22 00:01:30 compute-0 nova_compute[182935]: 2026-01-22 00:01:30.591 182939 DEBUG nova.compute.manager [None req-95580eed-aa5c-43a1-9fda-bb5e54525f71 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] network_info to inject: |[{"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Jan 22 00:01:30 compute-0 nova_compute[182935]: 2026-01-22 00:01:30.593 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:30 compute-0 nova_compute[182935]: 2026-01-22 00:01:30.593 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:01:30 compute-0 nova_compute[182935]: 2026-01-22 00:01:30.594 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7de8f448-0555-42c9-81e7-142c8735b7ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:30 compute-0 podman[224155]: 2026-01-22 00:01:30.730108858 +0000 UTC m=+0.095114440 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter)
Jan 22 00:01:30 compute-0 podman[224156]: 2026-01-22 00:01:30.743938021 +0000 UTC m=+0.094519287 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:01:31 compute-0 nova_compute[182935]: 2026-01-22 00:01:31.439 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:31 compute-0 nova_compute[182935]: 2026-01-22 00:01:31.818 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.041 182939 DEBUG oslo_concurrency.lockutils [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "7de8f448-0555-42c9-81e7-142c8735b7ad" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.042 182939 DEBUG oslo_concurrency.lockutils [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.043 182939 DEBUG oslo_concurrency.lockutils [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.043 182939 DEBUG oslo_concurrency.lockutils [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.044 182939 DEBUG oslo_concurrency.lockutils [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.060 182939 INFO nova.compute.manager [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Terminating instance
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.073 182939 DEBUG nova.compute.manager [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:01:32 compute-0 kernel: tape6a7cce8-73 (unregistering): left promiscuous mode
Jan 22 00:01:32 compute-0 NetworkManager[55139]: <info>  [1769040092.1112] device (tape6a7cce8-73): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.171 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:32 compute-0 ovn_controller[95047]: 2026-01-22T00:01:32Z|00312|binding|INFO|Releasing lport e6a7cce8-73ea-4021-ba99-a6d25adf5e69 from this chassis (sb_readonly=0)
Jan 22 00:01:32 compute-0 ovn_controller[95047]: 2026-01-22T00:01:32Z|00313|binding|INFO|Setting lport e6a7cce8-73ea-4021-ba99-a6d25adf5e69 down in Southbound
Jan 22 00:01:32 compute-0 ovn_controller[95047]: 2026-01-22T00:01:32Z|00314|binding|INFO|Removing iface tape6a7cce8-73 ovn-installed in OVS
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.177 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.188 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.189 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:13:b4 10.100.0.10'], port_security=['fa:16:3e:86:13:b4 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7de8f448-0555-42c9-81e7-142c8735b7ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a985fad-fdc1-4c5b-a880-f267b92a4ef8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f4237454503a49a6a90b8deeb01dad23', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6a419fe9-5ed3-49bf-a3d7-e29fe63eb872', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e4b550d-2592-4fc9-9a0d-718798a79652, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=e6a7cce8-73ea-4021-ba99-a6d25adf5e69) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.191 104408 INFO neutron.agent.ovn.metadata.agent [-] Port e6a7cce8-73ea-4021-ba99-a6d25adf5e69 in datapath 6a985fad-fdc1-4c5b-a880-f267b92a4ef8 unbound from our chassis
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.192 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a985fad-fdc1-4c5b-a880-f267b92a4ef8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.194 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[280eaeed-5f96-46a2-9261-ae944370f42b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.194 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8 namespace which is not needed anymore
Jan 22 00:01:32 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000052.scope: Deactivated successfully.
Jan 22 00:01:32 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000052.scope: Consumed 15.006s CPU time.
Jan 22 00:01:32 compute-0 systemd-machined[154182]: Machine qemu-42-instance-00000052 terminated.
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.312 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.319 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:32 compute-0 neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8[223976]: [NOTICE]   (223980) : haproxy version is 2.8.14-c23fe91
Jan 22 00:01:32 compute-0 neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8[223976]: [NOTICE]   (223980) : path to executable is /usr/sbin/haproxy
Jan 22 00:01:32 compute-0 neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8[223976]: [WARNING]  (223980) : Exiting Master process...
Jan 22 00:01:32 compute-0 neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8[223976]: [ALERT]    (223980) : Current worker (223982) exited with code 143 (Terminated)
Jan 22 00:01:32 compute-0 neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8[223976]: [WARNING]  (223980) : All workers exited. Exiting... (0)
Jan 22 00:01:32 compute-0 systemd[1]: libpod-0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4.scope: Deactivated successfully.
Jan 22 00:01:32 compute-0 podman[224223]: 2026-01-22 00:01:32.364142584 +0000 UTC m=+0.064798440 container died 0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.374 182939 INFO nova.virt.libvirt.driver [-] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Instance destroyed successfully.
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.375 182939 DEBUG nova.objects.instance [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lazy-loading 'resources' on Instance uuid 7de8f448-0555-42c9-81e7-142c8735b7ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.393 182939 DEBUG nova.virt.libvirt.vif [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:00:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-426630729',display_name='tempest-AttachInterfacesUnderV243Test-server-426630729',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-426630729',id=82,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBClkjO2PipE/NzIMrXoJQUC/50ZpPPhY/ZWehwNZdobkecSlDQNqypIlTd1kwP4aPQ/TFbBbIK8DQe4xqInLO12K8Of3dJmEdbpzxSXVzfNZd6u/D6zsg34JOvXcUBHbiQ==',key_name='tempest-keypair-1337512338',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:00:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f4237454503a49a6a90b8deeb01dad23',ramdisk_id='',reservation_id='r-w7diqr50',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesUnderV243Test-1363841808',owner_user_name='tempest-AttachInterfacesUnderV243Test-1363841808-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:01:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0cd8c717f19045fcafac6ddb87f30e2c',uuid=7de8f448-0555-42c9-81e7-142c8735b7ad,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.394 182939 DEBUG nova.network.os_vif_util [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Converting VIF {"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.395 182939 DEBUG nova.network.os_vif_util [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:13:b4,bridge_name='br-int',has_traffic_filtering=True,id=e6a7cce8-73ea-4021-ba99-a6d25adf5e69,network=Network(6a985fad-fdc1-4c5b-a880-f267b92a4ef8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6a7cce8-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.395 182939 DEBUG os_vif [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:13:b4,bridge_name='br-int',has_traffic_filtering=True,id=e6a7cce8-73ea-4021-ba99-a6d25adf5e69,network=Network(6a985fad-fdc1-4c5b-a880-f267b92a4ef8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6a7cce8-73') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.397 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.398 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6a7cce8-73, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.401 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.402 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.406 182939 INFO os_vif [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:13:b4,bridge_name='br-int',has_traffic_filtering=True,id=e6a7cce8-73ea-4021-ba99-a6d25adf5e69,network=Network(6a985fad-fdc1-4c5b-a880-f267b92a4ef8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6a7cce8-73')
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.407 182939 INFO nova.virt.libvirt.driver [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Deleting instance files /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad_del
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.408 182939 INFO nova.virt.libvirt.driver [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Deletion of /var/lib/nova/instances/7de8f448-0555-42c9-81e7-142c8735b7ad_del complete
Jan 22 00:01:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-65d4cc71c81674c71ee4fe1a4556ddf833f4cb533618c443009f7cca5cb6d987-merged.mount: Deactivated successfully.
Jan 22 00:01:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4-userdata-shm.mount: Deactivated successfully.
Jan 22 00:01:32 compute-0 podman[224223]: 2026-01-22 00:01:32.418292838 +0000 UTC m=+0.118948684 container cleanup 0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:01:32 compute-0 systemd[1]: libpod-conmon-0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4.scope: Deactivated successfully.
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.495 182939 INFO nova.compute.manager [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.496 182939 DEBUG oslo.service.loopingcall [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.496 182939 DEBUG nova.compute.manager [-] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.497 182939 DEBUG nova.network.neutron [-] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:01:32 compute-0 podman[224266]: 2026-01-22 00:01:32.505459026 +0000 UTC m=+0.057798132 container remove 0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.511 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d50d6491-f0b4-49e8-85cd-684125f73635]: (4, ('Thu Jan 22 12:01:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8 (0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4)\n0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4\nThu Jan 22 12:01:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8 (0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4)\n0cdff01ea145d8dd95a3da518dc87adf198a526d948d1b7a58ba00d6251644a4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.514 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fc8c1d5a-b706-4410-be8f-baf5baa5e0dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.515 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a985fad-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.518 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:32 compute-0 kernel: tap6a985fad-f0: left promiscuous mode
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.545 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.550 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4620ad46-5451-411c-a4aa-35ae82189e40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.570 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc33f1b-69b0-4eaf-9a4f-df36423bc055]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.572 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[877ed181-1f65-4f29-8edb-fdc14adde9b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.594 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1b00a0bc-10f8-4c8b-9250-450b9c522924]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455611, 'reachable_time': 43411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224281, 'error': None, 'target': 'ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.598 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a985fad-fdc1-4c5b-a880-f267b92a4ef8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:01:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:01:32.598 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[6d713c42-2f12-4fc5-8f09-8713e168ccf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:01:32 compute-0 systemd[1]: run-netns-ovnmeta\x2d6a985fad\x2dfdc1\x2d4c5b\x2da880\x2df267b92a4ef8.mount: Deactivated successfully.
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.996 182939 DEBUG nova.compute.manager [req-f46f7a64-28f2-4108-96f0-e4d27a210f09 req-47ad52ab-8b1f-4d7b-99df-30f9cfce264e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received event network-vif-unplugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.997 182939 DEBUG oslo_concurrency.lockutils [req-f46f7a64-28f2-4108-96f0-e4d27a210f09 req-47ad52ab-8b1f-4d7b-99df-30f9cfce264e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.998 182939 DEBUG oslo_concurrency.lockutils [req-f46f7a64-28f2-4108-96f0-e4d27a210f09 req-47ad52ab-8b1f-4d7b-99df-30f9cfce264e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.998 182939 DEBUG oslo_concurrency.lockutils [req-f46f7a64-28f2-4108-96f0-e4d27a210f09 req-47ad52ab-8b1f-4d7b-99df-30f9cfce264e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.999 182939 DEBUG nova.compute.manager [req-f46f7a64-28f2-4108-96f0-e4d27a210f09 req-47ad52ab-8b1f-4d7b-99df-30f9cfce264e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] No waiting events found dispatching network-vif-unplugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:01:32 compute-0 nova_compute[182935]: 2026-01-22 00:01:32.999 182939 DEBUG nova.compute.manager [req-f46f7a64-28f2-4108-96f0-e4d27a210f09 req-47ad52ab-8b1f-4d7b-99df-30f9cfce264e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received event network-vif-unplugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.197 182939 DEBUG nova.network.neutron [-] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.246 182939 INFO nova.compute.manager [-] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Took 1.75 seconds to deallocate network for instance.
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.371 182939 DEBUG nova.compute.manager [req-ea3f3fe8-2648-4334-b28e-1133c450151d req-887fc671-70f3-45fc-ab74-ff0376b5ace0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received event network-vif-deleted-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.385 182939 DEBUG oslo_concurrency.lockutils [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.385 182939 DEBUG oslo_concurrency.lockutils [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.474 182939 DEBUG nova.compute.provider_tree [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.493 182939 DEBUG nova.scheduler.client.report [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.537 182939 DEBUG oslo_concurrency.lockutils [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.581 182939 INFO nova.scheduler.client.report [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Deleted allocations for instance 7de8f448-0555-42c9-81e7-142c8735b7ad
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.723 182939 DEBUG oslo_concurrency.lockutils [None req-48c36cf1-e8b2-479b-9097-655d697a1d90 0cd8c717f19045fcafac6ddb87f30e2c f4237454503a49a6a90b8deeb01dad23 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.745 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updating instance_info_cache with network_info: [{"id": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "address": "fa:16:3e:86:13:b4", "network": {"id": "6a985fad-fdc1-4c5b-a880-f267b92a4ef8", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1110067781-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f4237454503a49a6a90b8deeb01dad23", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6a7cce8-73", "ovs_interfaceid": "e6a7cce8-73ea-4021-ba99-a6d25adf5e69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.802 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.802 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.803 182939 DEBUG oslo_concurrency.lockutils [req-ce202295-fd84-4ca8-8529-c1f1cf6c6a73 req-6d4984a7-cffe-48da-ba8b-83807c61d3f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.803 182939 DEBUG nova.network.neutron [req-ce202295-fd84-4ca8-8529-c1f1cf6c6a73 req-6d4984a7-cffe-48da-ba8b-83807c61d3f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Refreshing network info cache for port e6a7cce8-73ea-4021-ba99-a6d25adf5e69 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.804 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.804 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.805 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.805 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.805 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.805 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.805 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.827 182939 DEBUG nova.compute.utils [req-ce202295-fd84-4ca8-8529-c1f1cf6c6a73 req-6d4984a7-cffe-48da-ba8b-83807c61d3f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Can not refresh info_cache because instance was not found refresh_info_cache_for_instance /usr/lib/python3.9/site-packages/nova/compute/utils.py:1010
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.831 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.832 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.832 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:34 compute-0 nova_compute[182935]: 2026-01-22 00:01:34.832 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.039 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.041 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5689MB free_disk=73.20284652709961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.041 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.041 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.118 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.119 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.169 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.208 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.284 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.284 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.343 182939 DEBUG nova.compute.manager [req-259529a3-1680-4bce-9f64-ef9f7fd45234 req-431d8999-d2b6-4a6a-a5ee-882186fd7324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received event network-vif-plugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.344 182939 DEBUG oslo_concurrency.lockutils [req-259529a3-1680-4bce-9f64-ef9f7fd45234 req-431d8999-d2b6-4a6a-a5ee-882186fd7324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.344 182939 DEBUG oslo_concurrency.lockutils [req-259529a3-1680-4bce-9f64-ef9f7fd45234 req-431d8999-d2b6-4a6a-a5ee-882186fd7324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.344 182939 DEBUG oslo_concurrency.lockutils [req-259529a3-1680-4bce-9f64-ef9f7fd45234 req-431d8999-d2b6-4a6a-a5ee-882186fd7324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7de8f448-0555-42c9-81e7-142c8735b7ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.344 182939 DEBUG nova.compute.manager [req-259529a3-1680-4bce-9f64-ef9f7fd45234 req-431d8999-d2b6-4a6a-a5ee-882186fd7324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] No waiting events found dispatching network-vif-plugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.345 182939 WARNING nova.compute.manager [req-259529a3-1680-4bce-9f64-ef9f7fd45234 req-431d8999-d2b6-4a6a-a5ee-882186fd7324 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Received unexpected event network-vif-plugged-e6a7cce8-73ea-4021-ba99-a6d25adf5e69 for instance with vm_state deleted and task_state None.
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.533 182939 INFO nova.network.neutron [req-ce202295-fd84-4ca8-8529-c1f1cf6c6a73 req-6d4984a7-cffe-48da-ba8b-83807c61d3f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Port e6a7cce8-73ea-4021-ba99-a6d25adf5e69 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.533 182939 DEBUG nova.network.neutron [req-ce202295-fd84-4ca8-8529-c1f1cf6c6a73 req-6d4984a7-cffe-48da-ba8b-83807c61d3f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:01:35 compute-0 nova_compute[182935]: 2026-01-22 00:01:35.558 182939 DEBUG oslo_concurrency.lockutils [req-ce202295-fd84-4ca8-8529-c1f1cf6c6a73 req-6d4984a7-cffe-48da-ba8b-83807c61d3f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7de8f448-0555-42c9-81e7-142c8735b7ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:01:36 compute-0 nova_compute[182935]: 2026-01-22 00:01:36.440 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:37 compute-0 nova_compute[182935]: 2026-01-22 00:01:37.400 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:40 compute-0 nova_compute[182935]: 2026-01-22 00:01:40.240 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:40 compute-0 nova_compute[182935]: 2026-01-22 00:01:40.841 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:40 compute-0 nova_compute[182935]: 2026-01-22 00:01:40.982 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:41 compute-0 nova_compute[182935]: 2026-01-22 00:01:41.443 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:41 compute-0 podman[224286]: 2026-01-22 00:01:41.736662255 +0000 UTC m=+0.093504212 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:01:41 compute-0 podman[224285]: 2026-01-22 00:01:41.794888927 +0000 UTC m=+0.153541027 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 22 00:01:42 compute-0 nova_compute[182935]: 2026-01-22 00:01:42.276 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:42 compute-0 nova_compute[182935]: 2026-01-22 00:01:42.277 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:42 compute-0 nova_compute[182935]: 2026-01-22 00:01:42.401 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:44 compute-0 nova_compute[182935]: 2026-01-22 00:01:44.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:01:46 compute-0 nova_compute[182935]: 2026-01-22 00:01:46.445 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:47 compute-0 nova_compute[182935]: 2026-01-22 00:01:47.372 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040092.3709834, 7de8f448-0555-42c9-81e7-142c8735b7ad => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:01:47 compute-0 nova_compute[182935]: 2026-01-22 00:01:47.373 182939 INFO nova.compute.manager [-] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] VM Stopped (Lifecycle Event)
Jan 22 00:01:47 compute-0 nova_compute[182935]: 2026-01-22 00:01:47.402 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:47 compute-0 nova_compute[182935]: 2026-01-22 00:01:47.459 182939 DEBUG nova.compute.manager [None req-48ddd55c-cf15-4fc8-943d-2d98fcb923eb - - - - - -] [instance: 7de8f448-0555-42c9-81e7-142c8735b7ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:01:47 compute-0 nova_compute[182935]: 2026-01-22 00:01:47.861 182939 DEBUG nova.compute.manager [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 22 00:01:48 compute-0 nova_compute[182935]: 2026-01-22 00:01:48.181 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:48 compute-0 nova_compute[182935]: 2026-01-22 00:01:48.181 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:48 compute-0 nova_compute[182935]: 2026-01-22 00:01:48.314 182939 DEBUG nova.objects.instance [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:48 compute-0 nova_compute[182935]: 2026-01-22 00:01:48.428 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:01:48 compute-0 nova_compute[182935]: 2026-01-22 00:01:48.428 182939 INFO nova.compute.claims [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:01:48 compute-0 nova_compute[182935]: 2026-01-22 00:01:48.429 182939 DEBUG nova.objects.instance [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:48 compute-0 nova_compute[182935]: 2026-01-22 00:01:48.603 182939 DEBUG nova.objects.instance [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:01:48 compute-0 nova_compute[182935]: 2026-01-22 00:01:48.760 182939 INFO nova.compute.resource_tracker [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating resource usage from migration e2b58971-2cd8-417e-8990-aa80e80966b9
Jan 22 00:01:48 compute-0 nova_compute[182935]: 2026-01-22 00:01:48.760 182939 DEBUG nova.compute.resource_tracker [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Starting to track incoming migration e2b58971-2cd8-417e-8990-aa80e80966b9 with flavor ff01ccba-ad51-439f-9037-926190d6dc0f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 22 00:01:48 compute-0 nova_compute[182935]: 2026-01-22 00:01:48.879 182939 DEBUG nova.compute.provider_tree [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:01:48 compute-0 nova_compute[182935]: 2026-01-22 00:01:48.961 182939 DEBUG nova.scheduler.client.report [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:01:49 compute-0 nova_compute[182935]: 2026-01-22 00:01:49.226 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 1.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:49 compute-0 nova_compute[182935]: 2026-01-22 00:01:49.226 182939 INFO nova.compute.manager [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Migrating
Jan 22 00:01:49 compute-0 podman[224335]: 2026-01-22 00:01:49.677690137 +0000 UTC m=+0.055078598 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:01:51 compute-0 nova_compute[182935]: 2026-01-22 00:01:51.449 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:52 compute-0 nova_compute[182935]: 2026-01-22 00:01:52.433 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:53 compute-0 podman[224359]: 2026-01-22 00:01:53.715204029 +0000 UTC m=+0.069698018 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:01:54 compute-0 sshd-session[224378]: Accepted publickey for nova from 192.168.122.101 port 45528 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:01:54 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 00:01:54 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 00:01:54 compute-0 systemd-logind[784]: New session 49 of user nova.
Jan 22 00:01:54 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 00:01:54 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 22 00:01:54 compute-0 systemd[224382]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:01:54 compute-0 systemd[224382]: Queued start job for default target Main User Target.
Jan 22 00:01:54 compute-0 systemd[224382]: Created slice User Application Slice.
Jan 22 00:01:54 compute-0 systemd[224382]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:01:54 compute-0 systemd[224382]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 00:01:54 compute-0 systemd[224382]: Reached target Paths.
Jan 22 00:01:54 compute-0 systemd[224382]: Reached target Timers.
Jan 22 00:01:54 compute-0 systemd[224382]: Starting D-Bus User Message Bus Socket...
Jan 22 00:01:54 compute-0 systemd[224382]: Starting Create User's Volatile Files and Directories...
Jan 22 00:01:54 compute-0 systemd[224382]: Listening on D-Bus User Message Bus Socket.
Jan 22 00:01:54 compute-0 systemd[224382]: Reached target Sockets.
Jan 22 00:01:54 compute-0 systemd[224382]: Finished Create User's Volatile Files and Directories.
Jan 22 00:01:54 compute-0 systemd[224382]: Reached target Basic System.
Jan 22 00:01:54 compute-0 systemd[224382]: Reached target Main User Target.
Jan 22 00:01:54 compute-0 systemd[224382]: Startup finished in 163ms.
Jan 22 00:01:54 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 22 00:01:54 compute-0 systemd[1]: Started Session 49 of User nova.
Jan 22 00:01:54 compute-0 sshd-session[224378]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:01:54 compute-0 sshd-session[224397]: Received disconnect from 192.168.122.101 port 45528:11: disconnected by user
Jan 22 00:01:54 compute-0 sshd-session[224397]: Disconnected from user nova 192.168.122.101 port 45528
Jan 22 00:01:54 compute-0 sshd-session[224378]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:01:54 compute-0 systemd-logind[784]: Session 49 logged out. Waiting for processes to exit.
Jan 22 00:01:54 compute-0 systemd[1]: session-49.scope: Deactivated successfully.
Jan 22 00:01:54 compute-0 systemd-logind[784]: Removed session 49.
Jan 22 00:01:54 compute-0 sshd-session[224399]: Accepted publickey for nova from 192.168.122.101 port 45532 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:01:54 compute-0 systemd-logind[784]: New session 51 of user nova.
Jan 22 00:01:54 compute-0 systemd[1]: Started Session 51 of User nova.
Jan 22 00:01:54 compute-0 sshd-session[224399]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:01:54 compute-0 sshd-session[224402]: Received disconnect from 192.168.122.101 port 45532:11: disconnected by user
Jan 22 00:01:54 compute-0 sshd-session[224402]: Disconnected from user nova 192.168.122.101 port 45532
Jan 22 00:01:54 compute-0 sshd-session[224399]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:01:54 compute-0 systemd[1]: session-51.scope: Deactivated successfully.
Jan 22 00:01:54 compute-0 systemd-logind[784]: Session 51 logged out. Waiting for processes to exit.
Jan 22 00:01:54 compute-0 systemd-logind[784]: Removed session 51.
Jan 22 00:01:56 compute-0 nova_compute[182935]: 2026-01-22 00:01:56.449 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:57 compute-0 nova_compute[182935]: 2026-01-22 00:01:57.435 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:01:58 compute-0 sshd-session[224404]: Accepted publickey for nova from 192.168.122.101 port 45534 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:01:58 compute-0 systemd-logind[784]: New session 52 of user nova.
Jan 22 00:01:58 compute-0 systemd[1]: Started Session 52 of User nova.
Jan 22 00:01:58 compute-0 sshd-session[224404]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:01:58 compute-0 nova_compute[182935]: 2026-01-22 00:01:58.498 182939 DEBUG nova.compute.manager [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:01:58 compute-0 nova_compute[182935]: 2026-01-22 00:01:58.499 182939 DEBUG oslo_concurrency.lockutils [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:01:58 compute-0 nova_compute[182935]: 2026-01-22 00:01:58.499 182939 DEBUG oslo_concurrency.lockutils [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:01:58 compute-0 nova_compute[182935]: 2026-01-22 00:01:58.499 182939 DEBUG oslo_concurrency.lockutils [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:01:58 compute-0 nova_compute[182935]: 2026-01-22 00:01:58.499 182939 DEBUG nova.compute.manager [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:01:58 compute-0 nova_compute[182935]: 2026-01-22 00:01:58.500 182939 WARNING nova.compute.manager [req-e37f033f-6a7e-4df1-b130-b5812c0c2fed req-2b2198e2-0cde-4ba0-9efd-a18eb52d9700 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state active and task_state resize_migrating.
Jan 22 00:01:58 compute-0 sshd-session[224407]: Received disconnect from 192.168.122.101 port 45534:11: disconnected by user
Jan 22 00:01:58 compute-0 sshd-session[224407]: Disconnected from user nova 192.168.122.101 port 45534
Jan 22 00:01:58 compute-0 sshd-session[224404]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:01:58 compute-0 systemd[1]: session-52.scope: Deactivated successfully.
Jan 22 00:01:58 compute-0 systemd-logind[784]: Session 52 logged out. Waiting for processes to exit.
Jan 22 00:01:58 compute-0 systemd-logind[784]: Removed session 52.
Jan 22 00:01:58 compute-0 sshd-session[224409]: Accepted publickey for nova from 192.168.122.101 port 45538 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:01:58 compute-0 systemd-logind[784]: New session 53 of user nova.
Jan 22 00:01:58 compute-0 systemd[1]: Started Session 53 of User nova.
Jan 22 00:01:58 compute-0 sshd-session[224409]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:01:58 compute-0 sshd-session[224412]: Received disconnect from 192.168.122.101 port 45538:11: disconnected by user
Jan 22 00:01:58 compute-0 sshd-session[224412]: Disconnected from user nova 192.168.122.101 port 45538
Jan 22 00:01:58 compute-0 sshd-session[224409]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:01:58 compute-0 systemd[1]: session-53.scope: Deactivated successfully.
Jan 22 00:01:58 compute-0 systemd-logind[784]: Session 53 logged out. Waiting for processes to exit.
Jan 22 00:01:58 compute-0 systemd-logind[784]: Removed session 53.
Jan 22 00:01:59 compute-0 sshd-session[224414]: Accepted publickey for nova from 192.168.122.101 port 45544 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:01:59 compute-0 systemd-logind[784]: New session 54 of user nova.
Jan 22 00:01:59 compute-0 systemd[1]: Started Session 54 of User nova.
Jan 22 00:01:59 compute-0 sshd-session[224414]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:01:59 compute-0 sshd-session[224417]: Received disconnect from 192.168.122.101 port 45544:11: disconnected by user
Jan 22 00:01:59 compute-0 sshd-session[224417]: Disconnected from user nova 192.168.122.101 port 45544
Jan 22 00:01:59 compute-0 sshd-session[224414]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:01:59 compute-0 systemd[1]: session-54.scope: Deactivated successfully.
Jan 22 00:01:59 compute-0 systemd-logind[784]: Session 54 logged out. Waiting for processes to exit.
Jan 22 00:01:59 compute-0 systemd-logind[784]: Removed session 54.
Jan 22 00:02:00 compute-0 nova_compute[182935]: 2026-01-22 00:02:00.696 182939 DEBUG nova.compute.manager [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:00 compute-0 nova_compute[182935]: 2026-01-22 00:02:00.696 182939 DEBUG oslo_concurrency.lockutils [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:00 compute-0 nova_compute[182935]: 2026-01-22 00:02:00.696 182939 DEBUG oslo_concurrency.lockutils [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:00 compute-0 nova_compute[182935]: 2026-01-22 00:02:00.697 182939 DEBUG oslo_concurrency.lockutils [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:00 compute-0 nova_compute[182935]: 2026-01-22 00:02:00.697 182939 DEBUG nova.compute.manager [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:00 compute-0 nova_compute[182935]: 2026-01-22 00:02:00.697 182939 WARNING nova.compute.manager [req-46d530aa-55da-468c-a837-20f197261fa7 req-3a24c15b-d757-4c82-bfa4-ca1658b19637 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state active and task_state resize_migrated.
Jan 22 00:02:01 compute-0 nova_compute[182935]: 2026-01-22 00:02:01.187 182939 INFO nova.network.neutron [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating port 5965ccd1-7d75-4079-ade6-e1859a860162 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 22 00:02:01 compute-0 nova_compute[182935]: 2026-01-22 00:02:01.451 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:01 compute-0 podman[224419]: 2026-01-22 00:02:01.695836464 +0000 UTC m=+0.065212010 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter)
Jan 22 00:02:01 compute-0 podman[224420]: 2026-01-22 00:02:01.715862457 +0000 UTC m=+0.076631006 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:02:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:02.117 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:02:02 compute-0 nova_compute[182935]: 2026-01-22 00:02:02.118 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:02.119 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:02:02 compute-0 nova_compute[182935]: 2026-01-22 00:02:02.438 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:02 compute-0 nova_compute[182935]: 2026-01-22 00:02:02.921 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:02 compute-0 nova_compute[182935]: 2026-01-22 00:02:02.922 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:02 compute-0 nova_compute[182935]: 2026-01-22 00:02:02.922 182939 DEBUG nova.network.neutron [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:02:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:03.121 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:03.196 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:03.196 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:03.196 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:03 compute-0 nova_compute[182935]: 2026-01-22 00:02:03.659 182939 DEBUG nova.compute.manager [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:03 compute-0 nova_compute[182935]: 2026-01-22 00:02:03.659 182939 DEBUG nova.compute.manager [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing instance network info cache due to event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:02:03 compute-0 nova_compute[182935]: 2026-01-22 00:02:03.660 182939 DEBUG oslo_concurrency.lockutils [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:06 compute-0 nova_compute[182935]: 2026-01-22 00:02:06.453 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:06 compute-0 nova_compute[182935]: 2026-01-22 00:02:06.796 182939 DEBUG nova.network.neutron [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:06 compute-0 nova_compute[182935]: 2026-01-22 00:02:06.998 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:07 compute-0 nova_compute[182935]: 2026-01-22 00:02:07.001 182939 DEBUG oslo_concurrency.lockutils [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:07 compute-0 nova_compute[182935]: 2026-01-22 00:02:07.002 182939 DEBUG nova.network.neutron [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:02:07 compute-0 nova_compute[182935]: 2026-01-22 00:02:07.440 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.351 182939 DEBUG nova.network.neutron [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updated VIF entry in instance network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.352 182939 DEBUG nova.network.neutron [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.461 182939 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.463 182939 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.463 182939 INFO nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Creating image(s)
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.464 182939 DEBUG nova.objects.instance [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.507 182939 DEBUG oslo_concurrency.lockutils [req-14d3b801-7b18-4665-a6db-313da29827d8 req-d6f79cb1-b135-4e0b-a3d3-5960a59503de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.579 182939 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.646 182939 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.647 182939 DEBUG nova.virt.disk.api [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Checking if we can resize image /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.647 182939 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.706 182939 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.707 182939 DEBUG nova.virt.disk.api [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Cannot resize image /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.868 182939 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.869 182939 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Ensure instance console log exists: /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.870 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.870 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.871 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.874 182939 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Start _get_guest_xml network_info=[{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c9:be:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.881 182939 WARNING nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.888 182939 DEBUG nova.virt.libvirt.host [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.888 182939 DEBUG nova.virt.libvirt.host [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.893 182939 DEBUG nova.virt.libvirt.host [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.893 182939 DEBUG nova.virt.libvirt.host [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.895 182939 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.895 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ff01ccba-ad51-439f-9037-926190d6dc0f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.895 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.896 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.896 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.896 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.896 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.897 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.897 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.897 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.897 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.897 182939 DEBUG nova.virt.hardware [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:02:08 compute-0 nova_compute[182935]: 2026-01-22 00:02:08.898 182939 DEBUG nova.objects.instance [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:10 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 00:02:10 compute-0 systemd[224382]: Activating special unit Exit the Session...
Jan 22 00:02:10 compute-0 systemd[224382]: Stopped target Main User Target.
Jan 22 00:02:10 compute-0 systemd[224382]: Stopped target Basic System.
Jan 22 00:02:10 compute-0 systemd[224382]: Stopped target Paths.
Jan 22 00:02:10 compute-0 systemd[224382]: Stopped target Sockets.
Jan 22 00:02:10 compute-0 systemd[224382]: Stopped target Timers.
Jan 22 00:02:10 compute-0 systemd[224382]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:02:10 compute-0 systemd[224382]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 00:02:10 compute-0 systemd[224382]: Closed D-Bus User Message Bus Socket.
Jan 22 00:02:10 compute-0 systemd[224382]: Stopped Create User's Volatile Files and Directories.
Jan 22 00:02:10 compute-0 systemd[224382]: Removed slice User Application Slice.
Jan 22 00:02:10 compute-0 sshd-session[224466]: Invalid user git from 188.166.69.60 port 60324
Jan 22 00:02:10 compute-0 systemd[224382]: Reached target Shutdown.
Jan 22 00:02:10 compute-0 systemd[224382]: Finished Exit the Session.
Jan 22 00:02:10 compute-0 systemd[224382]: Reached target Exit the Session.
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.727 182939 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:10 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 00:02:10 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 00:02:10 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 00:02:10 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 00:02:10 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 00:02:10 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 00:02:10 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.792 182939 DEBUG oslo_concurrency.processutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.794 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.795 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.796 182939 DEBUG oslo_concurrency.lockutils [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.797 182939 DEBUG nova.virt.libvirt.vif [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-803720403',display_name='tempest-ServerActionsTestJSON-server-803720403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-803720403',id=84,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:01:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-zph0mh3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:02:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c9:be:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.798 182939 DEBUG nova.network.os_vif_util [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c9:be:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.799 182939 DEBUG nova.network.os_vif_util [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.801 182939 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:02:10 compute-0 nova_compute[182935]:   <uuid>8009ab5e-1bf8-4d17-8ff2-b62b25eeff20</uuid>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   <name>instance-00000054</name>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   <memory>196608</memory>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerActionsTestJSON-server-803720403</nova:name>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:02:08</nova:creationTime>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <nova:flavor name="m1.micro">
Jan 22 00:02:10 compute-0 nova_compute[182935]:         <nova:memory>192</nova:memory>
Jan 22 00:02:10 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:02:10 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:02:10 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:02:10 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:02:10 compute-0 nova_compute[182935]:         <nova:user uuid="3e78a70a1d284a9d932d4a53b872df39">tempest-ServerActionsTestJSON-78742637-project-member</nova:user>
Jan 22 00:02:10 compute-0 nova_compute[182935]:         <nova:project uuid="cccb624dbe6d4401a89e9cd254f91828">tempest-ServerActionsTestJSON-78742637</nova:project>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:02:10 compute-0 nova_compute[182935]:         <nova:port uuid="5965ccd1-7d75-4079-ade6-e1859a860162">
Jan 22 00:02:10 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <system>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <entry name="serial">8009ab5e-1bf8-4d17-8ff2-b62b25eeff20</entry>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <entry name="uuid">8009ab5e-1bf8-4d17-8ff2-b62b25eeff20</entry>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     </system>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   <os>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   </os>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   <features>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   </features>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/disk.config"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:c9:be:ae"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <target dev="tap5965ccd1-7d"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/console.log" append="off"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <video>
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     </video>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:02:10 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:02:10 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:02:10 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:02:10 compute-0 nova_compute[182935]: </domain>
Jan 22 00:02:10 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.803 182939 DEBUG nova.virt.libvirt.vif [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-803720403',display_name='tempest-ServerActionsTestJSON-server-803720403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-803720403',id=84,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:01:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-zph0mh3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:02:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c9:be:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.803 182939 DEBUG nova.network.os_vif_util [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-10713966-network", "vif_mac": "fa:16:3e:c9:be:ae"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.803 182939 DEBUG nova.network.os_vif_util [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.804 182939 DEBUG os_vif [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.805 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.805 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.806 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.808 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.809 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5965ccd1-7d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.809 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5965ccd1-7d, col_values=(('external_ids', {'iface-id': '5965ccd1-7d75-4079-ade6-e1859a860162', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:be:ae', 'vm-uuid': '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.811 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:10 compute-0 NetworkManager[55139]: <info>  [1769040130.8122] manager: (tap5965ccd1-7d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.814 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.821 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:10 compute-0 nova_compute[182935]: 2026-01-22 00:02:10.821 182939 INFO os_vif [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d')
Jan 22 00:02:10 compute-0 sshd-session[224466]: Connection closed by invalid user git 188.166.69.60 port 60324 [preauth]
Jan 22 00:02:11 compute-0 nova_compute[182935]: 2026-01-22 00:02:11.455 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:11 compute-0 nova_compute[182935]: 2026-01-22 00:02:11.863 182939 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:02:11 compute-0 nova_compute[182935]: 2026-01-22 00:02:11.864 182939 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:02:11 compute-0 nova_compute[182935]: 2026-01-22 00:02:11.864 182939 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] No VIF found with MAC fa:16:3e:c9:be:ae, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:02:11 compute-0 nova_compute[182935]: 2026-01-22 00:02:11.865 182939 INFO nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Using config drive
Jan 22 00:02:11 compute-0 kernel: tap5965ccd1-7d: entered promiscuous mode
Jan 22 00:02:11 compute-0 NetworkManager[55139]: <info>  [1769040131.9539] manager: (tap5965ccd1-7d): new Tun device (/org/freedesktop/NetworkManager/Devices/141)
Jan 22 00:02:11 compute-0 nova_compute[182935]: 2026-01-22 00:02:11.958 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:11 compute-0 ovn_controller[95047]: 2026-01-22T00:02:11Z|00315|binding|INFO|Claiming lport 5965ccd1-7d75-4079-ade6-e1859a860162 for this chassis.
Jan 22 00:02:11 compute-0 ovn_controller[95047]: 2026-01-22T00:02:11Z|00316|binding|INFO|5965ccd1-7d75-4079-ade6-e1859a860162: Claiming fa:16:3e:c9:be:ae 10.100.0.4
Jan 22 00:02:11 compute-0 nova_compute[182935]: 2026-01-22 00:02:11.965 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:11 compute-0 nova_compute[182935]: 2026-01-22 00:02:11.972 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:11 compute-0 nova_compute[182935]: 2026-01-22 00:02:11.982 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:11 compute-0 NetworkManager[55139]: <info>  [1769040131.9832] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Jan 22 00:02:11 compute-0 NetworkManager[55139]: <info>  [1769040131.9839] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Jan 22 00:02:11 compute-0 systemd-udevd[224508]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:02:12 compute-0 NetworkManager[55139]: <info>  [1769040132.0142] device (tap5965ccd1-7d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:02:12 compute-0 NetworkManager[55139]: <info>  [1769040132.0160] device (tap5965ccd1-7d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:02:12 compute-0 systemd-machined[154182]: New machine qemu-43-instance-00000054.
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.030 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:be:ae 10.100.0.4'], port_security=['fa:16:3e:c9:be:ae 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=5965ccd1-7d75-4079-ade6-e1859a860162) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.031 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 5965ccd1-7d75-4079-ade6-e1859a860162 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 bound to our chassis
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.032 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:02:12 compute-0 podman[224483]: 2026-01-22 00:02:12.033699394 +0000 UTC m=+0.080353806 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:02:12 compute-0 systemd[1]: Started Virtual Machine qemu-43-instance-00000054.
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.048 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[99a96574-ee40-4842-9dc4-1ef612b03697]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.050 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap19c3e0c8-51 in ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.055 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap19c3e0c8-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.056 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ed568099-f364-45af-9b0f-bb94b084402c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.058 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1f3035e9-c367-43df-a9fc-81f33a6974ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.078 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[11b8af6e-63f4-4900-a654-0f3b40090613]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.109 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[41bf1e49-5666-4e4a-9b01-380c54cd9329]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.145 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8bba1bbe-c2ea-4102-917d-68faf0d3b525]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 NetworkManager[55139]: <info>  [1769040132.1600] manager: (tap19c3e0c8-50): new Veth device (/org/freedesktop/NetworkManager/Devices/144)
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.159 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3f6d8df9-2baf-45c2-9aa8-892dd7b41da2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 systemd-udevd[224525]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.199 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:12 compute-0 podman[224482]: 2026-01-22 00:02:12.214034744 +0000 UTC m=+0.265617544 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 00:02:12 compute-0 ovn_controller[95047]: 2026-01-22T00:02:12Z|00317|binding|INFO|Setting lport 5965ccd1-7d75-4079-ade6-e1859a860162 ovn-installed in OVS
Jan 22 00:02:12 compute-0 ovn_controller[95047]: 2026-01-22T00:02:12Z|00318|binding|INFO|Setting lport 5965ccd1-7d75-4079-ade6-e1859a860162 up in Southbound
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.216 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.230 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2ef26c-03b7-4a42-be6f-5a29205db268]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.233 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[e60dccff-c6aa-42e1-a979-d9a25163f5f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 NetworkManager[55139]: <info>  [1769040132.2634] device (tap19c3e0c8-50): carrier: link connected
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.270 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ba9c8a-c9cd-43cb-a022-9a1160329319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.289 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d944b6ac-7412-4499-8398-b55c42e5a3e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463500, 'reachable_time': 44883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224572, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.308 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[40565d77-149c-42fe-beeb-10aa6f663550]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3ab0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463500, 'tstamp': 463500}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224573, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.327 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c6948136-7557-44b5-a00e-2adf1ce8e118]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap19c3e0c8-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:3a:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463500, 'reachable_time': 44883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224574, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.359 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[aee02ac0-65eb-48c3-bd92-269b4c90e987]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.422 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[274e1c12-a996-4240-b798-b4d55f3e9a80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.427 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.427 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.428 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19c3e0c8-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:12 compute-0 kernel: tap19c3e0c8-50: entered promiscuous mode
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.430 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:12 compute-0 NetworkManager[55139]: <info>  [1769040132.4308] manager: (tap19c3e0c8-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.433 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.433 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap19c3e0c8-50, col_values=(('external_ids', {'iface-id': '1b7e9589-a667-4684-99c2-2699b19c29bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.434 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:12 compute-0 ovn_controller[95047]: 2026-01-22T00:02:12Z|00319|binding|INFO|Releasing lport 1b7e9589-a667-4684-99c2-2699b19c29bb from this chassis (sb_readonly=0)
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.446 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.447 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.448 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b16e7bbc-1854-4cc8-ba8a-438fe6740515]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.449 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.pid.haproxy
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 19c3e0c8-5563-479c-995a-ab38d8b8c7f7
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:02:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:12.450 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'env', 'PROCESS_TAG=haproxy-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/19c3e0c8-5563-479c-995a-ab38d8b8c7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.512 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040132.5121038, 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.515 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] VM Resumed (Lifecycle Event)
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.517 182939 DEBUG nova.compute.manager [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.521 182939 INFO nova.virt.libvirt.driver [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance running successfully.
Jan 22 00:02:12 compute-0 virtqemud[182477]: argument unsupported: QEMU guest agent is not configured
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.524 182939 DEBUG nova.virt.libvirt.guest [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.524 182939 DEBUG nova.virt.libvirt.driver [None req-82b81f5c-f48d-4084-8608-de7c023cc37c 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 22 00:02:12 compute-0 podman[224611]: 2026-01-22 00:02:12.868417758 +0000 UTC m=+0.074042084 container create 1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.875 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:12 compute-0 nova_compute[182935]: 2026-01-22 00:02:12.883 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:02:12 compute-0 podman[224611]: 2026-01-22 00:02:12.825435003 +0000 UTC m=+0.031059379 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:02:12 compute-0 systemd[1]: Started libpod-conmon-1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2.scope.
Jan 22 00:02:12 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:02:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d31bf62c8004bcc6e8a713419ccbddfd57eb871effef8578e2778dd7c8bda6f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:02:12 compute-0 podman[224611]: 2026-01-22 00:02:12.996920301 +0000 UTC m=+0.202544697 container init 1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:02:13 compute-0 podman[224611]: 2026-01-22 00:02:13.005982199 +0000 UTC m=+0.211606525 container start 1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 00:02:13 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224626]: [NOTICE]   (224630) : New worker (224633) forked
Jan 22 00:02:13 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224626]: [NOTICE]   (224630) : Loading success.
Jan 22 00:02:13 compute-0 nova_compute[182935]: 2026-01-22 00:02:13.108 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 00:02:13 compute-0 nova_compute[182935]: 2026-01-22 00:02:13.110 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040132.5132387, 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:13 compute-0 nova_compute[182935]: 2026-01-22 00:02:13.110 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] VM Started (Lifecycle Event)
Jan 22 00:02:13 compute-0 nova_compute[182935]: 2026-01-22 00:02:13.935 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:13 compute-0 nova_compute[182935]: 2026-01-22 00:02:13.941 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:02:15 compute-0 nova_compute[182935]: 2026-01-22 00:02:15.129 182939 DEBUG nova.compute.manager [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:15 compute-0 nova_compute[182935]: 2026-01-22 00:02:15.130 182939 DEBUG oslo_concurrency.lockutils [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:15 compute-0 nova_compute[182935]: 2026-01-22 00:02:15.131 182939 DEBUG oslo_concurrency.lockutils [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:15 compute-0 nova_compute[182935]: 2026-01-22 00:02:15.131 182939 DEBUG oslo_concurrency.lockutils [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:15 compute-0 nova_compute[182935]: 2026-01-22 00:02:15.131 182939 DEBUG nova.compute.manager [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:15 compute-0 nova_compute[182935]: 2026-01-22 00:02:15.132 182939 WARNING nova.compute.manager [req-f636f8b4-4d9f-4782-af5b-5b18a23d8251 req-97f2580b-8171-4774-bc0c-f823d5563665 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state resized and task_state None.
Jan 22 00:02:15 compute-0 nova_compute[182935]: 2026-01-22 00:02:15.852 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:16 compute-0 nova_compute[182935]: 2026-01-22 00:02:16.458 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:16 compute-0 nova_compute[182935]: 2026-01-22 00:02:16.513 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "e216ca9d-2882-457b-955e-b7a7cd7213d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:16 compute-0 nova_compute[182935]: 2026-01-22 00:02:16.514 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:16 compute-0 nova_compute[182935]: 2026-01-22 00:02:16.734 182939 DEBUG nova.compute.manager [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:02:17 compute-0 nova_compute[182935]: 2026-01-22 00:02:17.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.043 182939 DEBUG nova.compute.manager [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.044 182939 DEBUG oslo_concurrency.lockutils [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.045 182939 DEBUG oslo_concurrency.lockutils [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.046 182939 DEBUG oslo_concurrency.lockutils [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.047 182939 DEBUG nova.compute.manager [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.047 182939 WARNING nova.compute.manager [req-7def03ce-9877-4921-b35c-d1af811766c0 req-1c19c4c8-f7a1-4e84-a36e-4bf6049572ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.676 182939 DEBUG nova.network.neutron [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Port 5965ccd1-7d75-4079-ade6-e1859a860162 binding to destination host compute-0.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.677 182939 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.677 182939 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.678 182939 DEBUG nova.network.neutron [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.817 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.818 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.828 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:02:18 compute-0 nova_compute[182935]: 2026-01-22 00:02:18.828 182939 INFO nova.compute.claims [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:02:19 compute-0 nova_compute[182935]: 2026-01-22 00:02:19.609 182939 DEBUG nova.compute.provider_tree [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:02:19 compute-0 nova_compute[182935]: 2026-01-22 00:02:19.699 182939 DEBUG nova.scheduler.client.report [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:02:20 compute-0 nova_compute[182935]: 2026-01-22 00:02:20.149 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:20 compute-0 nova_compute[182935]: 2026-01-22 00:02:20.150 182939 DEBUG nova.compute.manager [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:02:20 compute-0 podman[224642]: 2026-01-22 00:02:20.715238981 +0000 UTC m=+0.077050566 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:02:20 compute-0 nova_compute[182935]: 2026-01-22 00:02:20.856 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.042 182939 DEBUG nova.network.neutron [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.080 182939 DEBUG nova.compute.manager [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.081 182939 DEBUG nova.network.neutron [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.346 182939 DEBUG nova.policy [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cc334793f1e0484084ad779dd9ef0596', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1b63a2653b604354979ee32dbb6cd6c6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.390 182939 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.392 182939 INFO nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.456 182939 DEBUG nova.virt.libvirt.driver [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Creating tmpfile /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20/tmp_5h3yba5 to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.508 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.593 182939 DEBUG nova.compute.manager [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:02:21 compute-0 kernel: tap5965ccd1-7d (unregistering): left promiscuous mode
Jan 22 00:02:21 compute-0 NetworkManager[55139]: <info>  [1769040141.8240] device (tap5965ccd1-7d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.844 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:21 compute-0 ovn_controller[95047]: 2026-01-22T00:02:21Z|00320|binding|INFO|Releasing lport 5965ccd1-7d75-4079-ade6-e1859a860162 from this chassis (sb_readonly=0)
Jan 22 00:02:21 compute-0 ovn_controller[95047]: 2026-01-22T00:02:21Z|00321|binding|INFO|Setting lport 5965ccd1-7d75-4079-ade6-e1859a860162 down in Southbound
Jan 22 00:02:21 compute-0 ovn_controller[95047]: 2026-01-22T00:02:21Z|00322|binding|INFO|Removing iface tap5965ccd1-7d ovn-installed in OVS
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.849 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:21 compute-0 nova_compute[182935]: 2026-01-22 00:02:21.882 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:21 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000054.scope: Deactivated successfully.
Jan 22 00:02:21 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000054.scope: Consumed 9.812s CPU time.
Jan 22 00:02:21 compute-0 systemd-machined[154182]: Machine qemu-43-instance-00000054 terminated.
Jan 22 00:02:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:21.942 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:be:ae 10.100.0.4'], port_security=['fa:16:3e:c9:be:ae 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cccb624dbe6d4401a89e9cd254f91828', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6d59a7e5-ecca-4ec2-a40e-386acabc1d66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.240', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cb5ae5b-fb9e-4b4d-8960-35191db09308, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=5965ccd1-7d75-4079-ade6-e1859a860162) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:02:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:21.943 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 5965ccd1-7d75-4079-ade6-e1859a860162 in datapath 19c3e0c8-5563-479c-995a-ab38d8b8c7f7 unbound from our chassis
Jan 22 00:02:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:21.945 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 19c3e0c8-5563-479c-995a-ab38d8b8c7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:02:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:21.947 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[45e754d1-8688-418f-8e91-03ad44e0e2c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:21.948 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 namespace which is not needed anymore
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.105 182939 INFO nova.virt.libvirt.driver [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Instance destroyed successfully.
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.106 182939 DEBUG nova.objects.instance [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'resources' on Instance uuid 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:22 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224626]: [NOTICE]   (224630) : haproxy version is 2.8.14-c23fe91
Jan 22 00:02:22 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224626]: [NOTICE]   (224630) : path to executable is /usr/sbin/haproxy
Jan 22 00:02:22 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224626]: [WARNING]  (224630) : Exiting Master process...
Jan 22 00:02:22 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224626]: [WARNING]  (224630) : Exiting Master process...
Jan 22 00:02:22 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224626]: [ALERT]    (224630) : Current worker (224633) exited with code 143 (Terminated)
Jan 22 00:02:22 compute-0 neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7[224626]: [WARNING]  (224630) : All workers exited. Exiting... (0)
Jan 22 00:02:22 compute-0 systemd[1]: libpod-1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2.scope: Deactivated successfully.
Jan 22 00:02:22 compute-0 podman[224695]: 2026-01-22 00:02:22.137015927 +0000 UTC m=+0.061452360 container died 1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 00:02:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2-userdata-shm.mount: Deactivated successfully.
Jan 22 00:02:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-1d31bf62c8004bcc6e8a713419ccbddfd57eb871effef8578e2778dd7c8bda6f-merged.mount: Deactivated successfully.
Jan 22 00:02:22 compute-0 podman[224695]: 2026-01-22 00:02:22.185832292 +0000 UTC m=+0.110268775 container cleanup 1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:02:22 compute-0 systemd[1]: libpod-conmon-1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2.scope: Deactivated successfully.
Jan 22 00:02:22 compute-0 podman[224731]: 2026-01-22 00:02:22.267273233 +0000 UTC m=+0.047323240 container remove 1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:02:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:22.274 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5f72faa4-9c3b-499e-8320-e44e5b843834]: (4, ('Thu Jan 22 12:02:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2)\n1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2\nThu Jan 22 12:02:22 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 (1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2)\n1b0193a4f02d12384fd4a5a493d0cec866c1dcaf7dc3376f227d7b4758bc27c2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:22.277 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e6f35b14-bd45-4129-8e2c-c0063fcd93bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:22.278 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19c3e0c8-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.281 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:22 compute-0 kernel: tap19c3e0c8-50: left promiscuous mode
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.297 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:22.301 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ca3886-4fbb-4c47-9bcd-79043736ac20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:22.320 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e24eb989-1150-45c9-8875-b3567a56f603]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:22.322 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c6b82116-34a8-410f-93c7-d8e4d27feb0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:22.342 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[52d25df2-aef0-42e7-9a1a-482e1fe596f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463487, 'reachable_time': 21528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224750, 'error': None, 'target': 'ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:22.346 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-19c3e0c8-5563-479c-995a-ab38d8b8c7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:02:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:22.346 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[ca488ac1-060c-4cca-a145-edeca9564d5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:22 compute-0 systemd[1]: run-netns-ovnmeta\x2d19c3e0c8\x2d5563\x2d479c\x2d995a\x2dab38d8b8c7f7.mount: Deactivated successfully.
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.354 182939 DEBUG nova.virt.libvirt.vif [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-803720403',display_name='tempest-ServerActionsTestJSON-server-803720403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-803720403',id=84,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM2ugiUux7DYMlN8dY8gue1BzsfXbOKOqdPq/gJUxFgjYtiZRKn0Il7yH7vkt/FF0n0nQ57uKZ7FjQwDvGcLpEHkhrK3RTLhPWsztjfiNHjhjKK0S86T4k3kzP0rpeoh4Q==',key_name='tempest-keypair-452781070',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:02:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cccb624dbe6d4401a89e9cd254f91828',ramdisk_id='',reservation_id='r-zph0mh3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-78742637',owner_user_name='tempest-ServerActionsTestJSON-78742637-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:02:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e78a70a1d284a9d932d4a53b872df39',uuid=8009ab5e-1bf8-4d17-8ff2-b62b25eeff20,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.355 182939 DEBUG nova.network.os_vif_util [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converting VIF {"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.356 182939 DEBUG nova.network.os_vif_util [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.356 182939 DEBUG os_vif [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.358 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.359 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5965ccd1-7d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.362 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.365 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.371 182939 INFO os_vif [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:be:ae,bridge_name='br-int',has_traffic_filtering=True,id=5965ccd1-7d75-4079-ade6-e1859a860162,network=Network(19c3e0c8-5563-479c-995a-ab38d8b8c7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5965ccd1-7d')
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.372 182939 INFO nova.virt.libvirt.driver [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Deleting instance files /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_del
Jan 22 00:02:22 compute-0 nova_compute[182935]: 2026-01-22 00:02:22.383 182939 INFO nova.virt.libvirt.driver [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Deletion of /var/lib/nova/instances/8009ab5e-1bf8-4d17-8ff2-b62b25eeff20_del complete
Jan 22 00:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:23.315 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}56c7ea009b801d9698f0c834e3db9692471ec425d445fb9ddff9e80c5e7b5f2e" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 22 00:02:23 compute-0 nova_compute[182935]: 2026-01-22 00:02:23.338 182939 DEBUG nova.network.neutron [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Successfully created port: 637a179d-ce7a-481e-ba08-5430bd76b13b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:02:23 compute-0 nova_compute[182935]: 2026-01-22 00:02:23.897 182939 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:23 compute-0 nova_compute[182935]: 2026-01-22 00:02:23.897 182939 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.032 182939 DEBUG nova.compute.manager [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.032 182939 DEBUG oslo_concurrency.lockutils [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.033 182939 DEBUG oslo_concurrency.lockutils [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.033 182939 DEBUG oslo_concurrency.lockutils [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.033 182939 DEBUG nova.compute.manager [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.034 182939 WARNING nova.compute.manager [req-17c74439-1b42-48d2-85dd-63db0c790ad9 req-4939c9bc-337a-49e1-92cc-ea86f35154a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-unplugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.110 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Thu, 22 Jan 2026 00:02:23 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-7d3354da-d608-4467-bd5a-7fca79d73a04 x-openstack-request-id: req-7d3354da-d608-4467-bd5a-7fca79d73a04 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.110 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "c3389c03-89c4-4ff5-9e03-1a99d41713d4", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c3389c03-89c4-4ff5-9e03-1a99d41713d4"}]}, {"id": "ff01ccba-ad51-439f-9037-926190d6dc0f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.110 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-7d3354da-d608-4467-bd5a-7fca79d73a04 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.113 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}56c7ea009b801d9698f0c834e3db9692471ec425d445fb9ddff9e80c5e7b5f2e" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.436 182939 DEBUG nova.objects.instance [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lazy-loading 'migration_context' on Instance uuid 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.477 182939 DEBUG nova.compute.manager [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.480 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.481 182939 INFO nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Creating image(s)
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.482 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "/var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.482 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "/var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.483 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "/var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.498 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 496 Content-Type: application/json Date: Thu, 22 Jan 2026 00:02:24 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-e8ca1522-c5f3-4dc9-a367-fa4fb148adb5 x-openstack-request-id: req-e8ca1522-c5f3-4dc9-a367-fa4fb148adb5 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.498 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "ff01ccba-ad51-439f-9037-926190d6dc0f", "name": "m1.micro", "ram": 192, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.499 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/ff01ccba-ad51-439f-9037-926190d6dc0f used request id req-e8ca1522-c5f3-4dc9-a367-fa4fb148adb5 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20' (instance-00000054)
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.502 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager [-] Unable to discover resources: Domain not found: no domain with matching uuid '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20' (instance-00000054): libvirt.libvirtError: Domain not found: no domain with matching uuid '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20' (instance-00000054)
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager Traceback (most recent call last):
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager   File "/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py", line 604, in discover
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager     discovered = discoverer.discover(self, param)
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager   File "/usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py", line 119, in discover
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager     return self.discover_libvirt_polling(manager, param=None)
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager     return self(f, *args, **kw)
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager     do = self.iter(retry_state=retry_state)
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager     return fut.result()
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager     return self.__get_result()
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager     raise self._exception
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager     result = fn(*args, **kwargs)
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager   File "/usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py", line 189, in discover_libvirt_polling
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager     dom_state = domain.state()[0]
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager   File "/usr/lib64/python3.9/site-packages/libvirt.py", line 3266, in state
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager     raise libvirtError('virDomainGetState() failed')
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager libvirt.libvirtError: Domain not found: no domain with matching uuid '8009ab5e-1bf8-4d17-8ff2-b62b25eeff20' (instance-00000054)
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.502 12 ERROR ceilometer.polling.manager 
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.508 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.509 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.510 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.511 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.511 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.511 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.511 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:02:24.511 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.563 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.564 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.564 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.575 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.653 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.654 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:24 compute-0 podman[224755]: 2026-01-22 00:02:24.688258413 +0000 UTC m=+0.060760094 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.720 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk 1073741824" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.722 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.723 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.791 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.793 182939 DEBUG nova.virt.disk.api [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Checking if we can resize image /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.793 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.857 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.859 182939 DEBUG nova.virt.disk.api [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Cannot resize image /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:02:24 compute-0 nova_compute[182935]: 2026-01-22 00:02:24.860 182939 DEBUG nova.objects.instance [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lazy-loading 'migration_context' on Instance uuid e216ca9d-2882-457b-955e-b7a7cd7213d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:25 compute-0 nova_compute[182935]: 2026-01-22 00:02:25.034 182939 DEBUG nova.compute.provider_tree [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:02:25 compute-0 nova_compute[182935]: 2026-01-22 00:02:25.144 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:02:25 compute-0 nova_compute[182935]: 2026-01-22 00:02:25.144 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Ensure instance console log exists: /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:02:25 compute-0 nova_compute[182935]: 2026-01-22 00:02:25.145 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:25 compute-0 nova_compute[182935]: 2026-01-22 00:02:25.145 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:25 compute-0 nova_compute[182935]: 2026-01-22 00:02:25.145 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:25 compute-0 nova_compute[182935]: 2026-01-22 00:02:25.375 182939 DEBUG nova.scheduler.client.report [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:02:26 compute-0 nova_compute[182935]: 2026-01-22 00:02:26.036 182939 DEBUG oslo_concurrency.lockutils [None req-9197a828-795b-4e73-bc6e-c4fcf95c17d1 3e78a70a1d284a9d932d4a53b872df39 cccb624dbe6d4401a89e9cd254f91828 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 2.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:26 compute-0 nova_compute[182935]: 2026-01-22 00:02:26.511 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:27 compute-0 nova_compute[182935]: 2026-01-22 00:02:27.363 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:27 compute-0 nova_compute[182935]: 2026-01-22 00:02:27.937 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:27 compute-0 nova_compute[182935]: 2026-01-22 00:02:27.937 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:02:27 compute-0 nova_compute[182935]: 2026-01-22 00:02:27.938 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:02:28 compute-0 nova_compute[182935]: 2026-01-22 00:02:28.209 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 00:02:28 compute-0 nova_compute[182935]: 2026-01-22 00:02:28.209 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.465 182939 DEBUG nova.compute.manager [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.466 182939 DEBUG oslo_concurrency.lockutils [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.466 182939 DEBUG oslo_concurrency.lockutils [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.466 182939 DEBUG oslo_concurrency.lockutils [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.466 182939 DEBUG nova.compute.manager [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.467 182939 WARNING nova.compute.manager [req-c9cfe757-df6a-4d3f-b073-7cea67ba766c req-4471aeac-152f-439f-bd40-986f5418cc66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.637 182939 DEBUG nova.network.neutron [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Successfully updated port: 637a179d-ce7a-481e-ba08-5430bd76b13b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.782 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.782 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquired lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.782 182939 DEBUG nova.network.neutron [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.949 182939 DEBUG nova.compute.manager [req-af970e60-5657-4bf5-b27f-3318b86bb453 req-e4b74120-2a7a-4f2d-b845-d8384eafb94c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received event network-changed-637a179d-ce7a-481e-ba08-5430bd76b13b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.949 182939 DEBUG nova.compute.manager [req-af970e60-5657-4bf5-b27f-3318b86bb453 req-e4b74120-2a7a-4f2d-b845-d8384eafb94c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Refreshing instance network info cache due to event network-changed-637a179d-ce7a-481e-ba08-5430bd76b13b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:02:29 compute-0 nova_compute[182935]: 2026-01-22 00:02:29.950 182939 DEBUG oslo_concurrency.lockutils [req-af970e60-5657-4bf5-b27f-3318b86bb453 req-e4b74120-2a7a-4f2d-b845-d8384eafb94c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:30 compute-0 nova_compute[182935]: 2026-01-22 00:02:30.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:30 compute-0 nova_compute[182935]: 2026-01-22 00:02:30.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:02:31 compute-0 nova_compute[182935]: 2026-01-22 00:02:31.515 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:31 compute-0 nova_compute[182935]: 2026-01-22 00:02:31.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.075 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.076 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.076 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.077 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:02:32 compute-0 podman[224786]: 2026-01-22 00:02:32.239677745 +0000 UTC m=+0.081863721 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, config_id=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:02:32 compute-0 podman[224787]: 2026-01-22 00:02:32.267517375 +0000 UTC m=+0.096977285 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.364 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.390 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.392 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5636MB free_disk=73.20215606689453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.392 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.393 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.412 182939 DEBUG nova.network.neutron [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.584 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance e216ca9d-2882-457b-955e-b7a7cd7213d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.585 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.585 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.634 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:02:32 compute-0 nova_compute[182935]: 2026-01-22 00:02:32.979 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:02:33 compute-0 nova_compute[182935]: 2026-01-22 00:02:33.445 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:02:33 compute-0 nova_compute[182935]: 2026-01-22 00:02:33.446 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:33 compute-0 nova_compute[182935]: 2026-01-22 00:02:33.763 182939 DEBUG nova.network.neutron [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Updating instance_info_cache with network_info: [{"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:33 compute-0 nova_compute[182935]: 2026-01-22 00:02:33.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:33 compute-0 nova_compute[182935]: 2026-01-22 00:02:33.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:33 compute-0 nova_compute[182935]: 2026-01-22 00:02:33.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.442 182939 DEBUG nova.compute.manager [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.442 182939 DEBUG nova.compute.manager [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing instance network info cache due to event network-changed-5965ccd1-7d75-4079-ade6-e1859a860162. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.443 182939 DEBUG oslo_concurrency.lockutils [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.443 182939 DEBUG oslo_concurrency.lockutils [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.444 182939 DEBUG nova.network.neutron [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Refreshing network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.447 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Releasing lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.447 182939 DEBUG nova.compute.manager [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Instance network_info: |[{"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.448 182939 DEBUG oslo_concurrency.lockutils [req-af970e60-5657-4bf5-b27f-3318b86bb453 req-e4b74120-2a7a-4f2d-b845-d8384eafb94c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.449 182939 DEBUG nova.network.neutron [req-af970e60-5657-4bf5-b27f-3318b86bb453 req-e4b74120-2a7a-4f2d-b845-d8384eafb94c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Refreshing network info cache for port 637a179d-ce7a-481e-ba08-5430bd76b13b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.453 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Start _get_guest_xml network_info=[{"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.461 182939 WARNING nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.467 182939 DEBUG nova.virt.libvirt.host [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.468 182939 DEBUG nova.virt.libvirt.host [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.476 182939 DEBUG nova.virt.libvirt.host [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.477 182939 DEBUG nova.virt.libvirt.host [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.479 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.479 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.480 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.481 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.481 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.482 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.482 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.482 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.483 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.483 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.484 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.484 182939 DEBUG nova.virt.hardware [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.490 182939 DEBUG nova.virt.libvirt.vif [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:02:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-803124597',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-803124597',id=87,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1b63a2653b604354979ee32dbb6cd6c6',ramdisk_id='',reservation_id='r-m6owog58',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1840931641',owner_user_name='tempest-AttachInterfacesV270Test-1840931641-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:02:21Z,user_data=None,user_id='cc334793f1e0484084ad779dd9ef0596',uuid=e216ca9d-2882-457b-955e-b7a7cd7213d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.491 182939 DEBUG nova.network.os_vif_util [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converting VIF {"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.492 182939 DEBUG nova.network.os_vif_util [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:3b:f8,bridge_name='br-int',has_traffic_filtering=True,id=637a179d-ce7a-481e-ba08-5430bd76b13b,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637a179d-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.493 182939 DEBUG nova.objects.instance [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lazy-loading 'pci_devices' on Instance uuid e216ca9d-2882-457b-955e-b7a7cd7213d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.837 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:02:34 compute-0 nova_compute[182935]:   <uuid>e216ca9d-2882-457b-955e-b7a7cd7213d2</uuid>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   <name>instance-00000057</name>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <nova:name>tempest-AttachInterfacesV270Test-server-803124597</nova:name>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:02:34</nova:creationTime>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:02:34 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:02:34 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:02:34 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:02:34 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:02:34 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:02:34 compute-0 nova_compute[182935]:         <nova:user uuid="cc334793f1e0484084ad779dd9ef0596">tempest-AttachInterfacesV270Test-1840931641-project-member</nova:user>
Jan 22 00:02:34 compute-0 nova_compute[182935]:         <nova:project uuid="1b63a2653b604354979ee32dbb6cd6c6">tempest-AttachInterfacesV270Test-1840931641</nova:project>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:02:34 compute-0 nova_compute[182935]:         <nova:port uuid="637a179d-ce7a-481e-ba08-5430bd76b13b">
Jan 22 00:02:34 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <system>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <entry name="serial">e216ca9d-2882-457b-955e-b7a7cd7213d2</entry>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <entry name="uuid">e216ca9d-2882-457b-955e-b7a7cd7213d2</entry>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     </system>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   <os>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   </os>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   <features>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   </features>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk.config"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:11:3b:f8"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <target dev="tap637a179d-ce"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/console.log" append="off"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <video>
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     </video>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:02:34 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:02:34 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:02:34 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:02:34 compute-0 nova_compute[182935]: </domain>
Jan 22 00:02:34 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.839 182939 DEBUG nova.compute.manager [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Preparing to wait for external event network-vif-plugged-637a179d-ce7a-481e-ba08-5430bd76b13b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.840 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.840 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.840 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.842 182939 DEBUG nova.virt.libvirt.vif [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:02:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-803124597',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-803124597',id=87,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1b63a2653b604354979ee32dbb6cd6c6',ramdisk_id='',reservation_id='r-m6owog58',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1840931641',owner_user_name='tempest-AttachInterfacesV270Test-1840931641-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:02:21Z,user_data=None,user_id='cc334793f1e0484084ad779dd9ef0596',uuid=e216ca9d-2882-457b-955e-b7a7cd7213d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.842 182939 DEBUG nova.network.os_vif_util [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converting VIF {"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.844 182939 DEBUG nova.network.os_vif_util [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:3b:f8,bridge_name='br-int',has_traffic_filtering=True,id=637a179d-ce7a-481e-ba08-5430bd76b13b,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637a179d-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.844 182939 DEBUG os_vif [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:3b:f8,bridge_name='br-int',has_traffic_filtering=True,id=637a179d-ce7a-481e-ba08-5430bd76b13b,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637a179d-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.845 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.846 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.846 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.851 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.852 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap637a179d-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.853 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap637a179d-ce, col_values=(('external_ids', {'iface-id': '637a179d-ce7a-481e-ba08-5430bd76b13b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:3b:f8', 'vm-uuid': 'e216ca9d-2882-457b-955e-b7a7cd7213d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.855 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.856 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:02:34 compute-0 NetworkManager[55139]: <info>  [1769040154.8571] manager: (tap637a179d-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.863 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.864 182939 INFO os_vif [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:3b:f8,bridge_name='br-int',has_traffic_filtering=True,id=637a179d-ce7a-481e-ba08-5430bd76b13b,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637a179d-ce')
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.936 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:34 compute-0 nova_compute[182935]: 2026-01-22 00:02:34.937 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:35 compute-0 nova_compute[182935]: 2026-01-22 00:02:35.537 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:02:35 compute-0 nova_compute[182935]: 2026-01-22 00:02:35.537 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:02:35 compute-0 nova_compute[182935]: 2026-01-22 00:02:35.538 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] No VIF found with MAC fa:16:3e:11:3b:f8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:02:35 compute-0 nova_compute[182935]: 2026-01-22 00:02:35.538 182939 INFO nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Using config drive
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.518 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.529 182939 INFO nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Creating config drive at /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk.config
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.536 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsegfd2j1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.667 182939 DEBUG oslo_concurrency.processutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsegfd2j1" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.704 182939 DEBUG nova.network.neutron [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updated VIF entry in instance network info cache for port 5965ccd1-7d75-4079-ade6-e1859a860162. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.706 182939 DEBUG nova.network.neutron [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Updating instance_info_cache with network_info: [{"id": "5965ccd1-7d75-4079-ade6-e1859a860162", "address": "fa:16:3e:c9:be:ae", "network": {"id": "19c3e0c8-5563-479c-995a-ab38d8b8c7f7", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-10713966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cccb624dbe6d4401a89e9cd254f91828", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5965ccd1-7d", "ovs_interfaceid": "5965ccd1-7d75-4079-ade6-e1859a860162", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:36 compute-0 kernel: tap637a179d-ce: entered promiscuous mode
Jan 22 00:02:36 compute-0 ovn_controller[95047]: 2026-01-22T00:02:36Z|00323|binding|INFO|Claiming lport 637a179d-ce7a-481e-ba08-5430bd76b13b for this chassis.
Jan 22 00:02:36 compute-0 ovn_controller[95047]: 2026-01-22T00:02:36Z|00324|binding|INFO|637a179d-ce7a-481e-ba08-5430bd76b13b: Claiming fa:16:3e:11:3b:f8 10.100.0.7
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.766 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:36 compute-0 NetworkManager[55139]: <info>  [1769040156.7683] manager: (tap637a179d-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/147)
Jan 22 00:02:36 compute-0 ovn_controller[95047]: 2026-01-22T00:02:36Z|00325|binding|INFO|Setting lport 637a179d-ce7a-481e-ba08-5430bd76b13b ovn-installed in OVS
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.777 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.779 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.783 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:36 compute-0 systemd-machined[154182]: New machine qemu-44-instance-00000057.
Jan 22 00:02:36 compute-0 systemd-udevd[224848]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:02:36 compute-0 systemd[1]: Started Virtual Machine qemu-44-instance-00000057.
Jan 22 00:02:36 compute-0 NetworkManager[55139]: <info>  [1769040156.8413] device (tap637a179d-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:02:36 compute-0 NetworkManager[55139]: <info>  [1769040156.8427] device (tap637a179d-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:02:36 compute-0 ovn_controller[95047]: 2026-01-22T00:02:36Z|00326|binding|INFO|Setting lport 637a179d-ce7a-481e-ba08-5430bd76b13b up in Southbound
Jan 22 00:02:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:36.926 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:3b:f8 10.100.0.7'], port_security=['fa:16:3e:11:3b:f8 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e216ca9d-2882-457b-955e-b7a7cd7213d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e6baf5c-6213-46a4-a17e-5ee389013511', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1b63a2653b604354979ee32dbb6cd6c6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '574903b5-96b8-417f-a4ae-c6746a9fa810', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e01e65b5-3b0b-4206-b1f8-00c526cd5319, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=637a179d-ce7a-481e-ba08-5430bd76b13b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:02:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:36.927 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 637a179d-ce7a-481e-ba08-5430bd76b13b in datapath 2e6baf5c-6213-46a4-a17e-5ee389013511 bound to our chassis
Jan 22 00:02:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:36.929 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2e6baf5c-6213-46a4-a17e-5ee389013511
Jan 22 00:02:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:36.949 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[26f82f09-baf8-4642-ad89-051d1b2d9ee5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:36.950 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2e6baf5c-61 in ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:02:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:36.954 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2e6baf5c-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:02:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:36.954 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[61c0083d-f72a-419d-b58d-d3029adb1591]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:36.955 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2136d910-4e92-4662-8522-80735e165a58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:36.976 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[446b86e6-2d96-4553-b541-ebfc2dae9397]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:36 compute-0 nova_compute[182935]: 2026-01-22 00:02:36.997 182939 DEBUG oslo_concurrency.lockutils [req-542bb107-f9f0-448b-ac24-30b9094c209f req-161299a5-cd6a-42d0-83f0-157538585df4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-8009ab5e-1bf8-4d17-8ff2-b62b25eeff20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.008 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[32d144b1-4bab-43ff-928c-e3777db2bece]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.059 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[1a70c988-b979-417d-9f34-b240c259248a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.069 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[13094e5e-9026-424e-bda6-8d3bc8096efb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 NetworkManager[55139]: <info>  [1769040157.0703] manager: (tap2e6baf5c-60): new Veth device (/org/freedesktop/NetworkManager/Devices/148)
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.104 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040142.1022651, 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.106 182939 INFO nova.compute.manager [-] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] VM Stopped (Lifecycle Event)
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.128 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[ea7bfe29-04b7-4fcc-94f2-c6c773653e1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.135 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[da26db30-87fc-41ec-90a4-c1ffb7fe2f7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 NetworkManager[55139]: <info>  [1769040157.1719] device (tap2e6baf5c-60): carrier: link connected
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.182 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[72033ae9-5482-40fa-9398-dfe7fa34a928]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.214 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[16617f86-348f-477c-8e8b-3cf1a914bdb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2e6baf5c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:ea:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 101], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465991, 'reachable_time': 20295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224881, 'error': None, 'target': 'ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.244 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e966d944-8a17-4da6-bcd9-1a4e6b3fa6d0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2c:ea27'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 465991, 'tstamp': 465991}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224882, 'error': None, 'target': 'ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.277 182939 DEBUG nova.compute.manager [None req-b63afb11-b8b1-430c-b064-6ea59f7dcbd9 - - - - - -] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.279 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6f15d9d2-214f-4038-a156-08c930d1fce5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2e6baf5c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:ea:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 101], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465991, 'reachable_time': 20295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224883, 'error': None, 'target': 'ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.331 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4353de-7507-4b6f-9079-85a8dce3186d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.446 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[64768867-fb37-4d59-a705-350994c714e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.448 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e6baf5c-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.448 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.449 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e6baf5c-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.451 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-0 NetworkManager[55139]: <info>  [1769040157.4528] manager: (tap2e6baf5c-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Jan 22 00:02:37 compute-0 kernel: tap2e6baf5c-60: entered promiscuous mode
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.456 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.459 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2e6baf5c-60, col_values=(('external_ids', {'iface-id': 'eba1ad2c-eba9-413b-8318-773455c1e2ff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.461 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-0 ovn_controller[95047]: 2026-01-22T00:02:37Z|00327|binding|INFO|Releasing lport eba1ad2c-eba9-413b-8318-773455c1e2ff from this chassis (sb_readonly=0)
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.479 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040157.4787307, e216ca9d-2882-457b-955e-b7a7cd7213d2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.479 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] VM Started (Lifecycle Event)
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.489 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.490 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2e6baf5c-6213-46a4-a17e-5ee389013511.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2e6baf5c-6213-46a4-a17e-5ee389013511.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.492 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0a4035ab-ef44-4a6a-a1a2-670bf9c61adf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.493 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-2e6baf5c-6213-46a4-a17e-5ee389013511
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/2e6baf5c-6213-46a4-a17e-5ee389013511.pid.haproxy
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 2e6baf5c-6213-46a4-a17e-5ee389013511
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:02:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:37.495 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511', 'env', 'PROCESS_TAG=haproxy-2e6baf5c-6213-46a4-a17e-5ee389013511', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2e6baf5c-6213-46a4-a17e-5ee389013511.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.560 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.565 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040157.4788342, e216ca9d-2882-457b-955e-b7a7cd7213d2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.566 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] VM Paused (Lifecycle Event)
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.663 182939 DEBUG nova.network.neutron [req-af970e60-5657-4bf5-b27f-3318b86bb453 req-e4b74120-2a7a-4f2d-b845-d8384eafb94c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Updated VIF entry in instance network info cache for port 637a179d-ce7a-481e-ba08-5430bd76b13b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:02:37 compute-0 nova_compute[182935]: 2026-01-22 00:02:37.664 182939 DEBUG nova.network.neutron [req-af970e60-5657-4bf5-b27f-3318b86bb453 req-e4b74120-2a7a-4f2d-b845-d8384eafb94c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Updating instance_info_cache with network_info: [{"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:37 compute-0 podman[224920]: 2026-01-22 00:02:37.980989204 +0000 UTC m=+0.083413079 container create 454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:02:38 compute-0 podman[224920]: 2026-01-22 00:02:37.939133266 +0000 UTC m=+0.041557121 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:02:38 compute-0 systemd[1]: Started libpod-conmon-454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673.scope.
Jan 22 00:02:38 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.078 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.080 182939 DEBUG oslo_concurrency.lockutils [req-af970e60-5657-4bf5-b27f-3318b86bb453 req-e4b74120-2a7a-4f2d-b845-d8384eafb94c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48a9b2bf2b3e64a2a0622c5492d22212a875a9f82e6627f326e3d0bfe8498871/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.086 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:02:38 compute-0 podman[224920]: 2026-01-22 00:02:38.103028842 +0000 UTC m=+0.205452727 container init 454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 00:02:38 compute-0 podman[224920]: 2026-01-22 00:02:38.113639217 +0000 UTC m=+0.216063062 container start 454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:02:38 compute-0 neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511[224936]: [NOTICE]   (224940) : New worker (224942) forked
Jan 22 00:02:38 compute-0 neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511[224936]: [NOTICE]   (224940) : Loading success.
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.256 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.537 182939 DEBUG nova.compute.manager [req-9df57746-f212-4e3f-b989-28e1c7c2e17e req-7b58daa0-98f4-413e-85ba-43ed49a9e5a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received event network-vif-plugged-637a179d-ce7a-481e-ba08-5430bd76b13b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.538 182939 DEBUG oslo_concurrency.lockutils [req-9df57746-f212-4e3f-b989-28e1c7c2e17e req-7b58daa0-98f4-413e-85ba-43ed49a9e5a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.538 182939 DEBUG oslo_concurrency.lockutils [req-9df57746-f212-4e3f-b989-28e1c7c2e17e req-7b58daa0-98f4-413e-85ba-43ed49a9e5a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.539 182939 DEBUG oslo_concurrency.lockutils [req-9df57746-f212-4e3f-b989-28e1c7c2e17e req-7b58daa0-98f4-413e-85ba-43ed49a9e5a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.539 182939 DEBUG nova.compute.manager [req-9df57746-f212-4e3f-b989-28e1c7c2e17e req-7b58daa0-98f4-413e-85ba-43ed49a9e5a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Processing event network-vif-plugged-637a179d-ce7a-481e-ba08-5430bd76b13b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.541 182939 DEBUG nova.compute.manager [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.545 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040158.5456133, e216ca9d-2882-457b-955e-b7a7cd7213d2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.546 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] VM Resumed (Lifecycle Event)
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.550 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.556 182939 INFO nova.virt.libvirt.driver [-] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Instance spawned successfully.
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.557 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.788 182939 DEBUG nova.compute.manager [req-4af6ca45-6295-44fe-a037-1fe6d0d10854 req-c7ae5120-3e11-43cf-875a-a4ab4e4af83a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.789 182939 DEBUG oslo_concurrency.lockutils [req-4af6ca45-6295-44fe-a037-1fe6d0d10854 req-c7ae5120-3e11-43cf-875a-a4ab4e4af83a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.790 182939 DEBUG oslo_concurrency.lockutils [req-4af6ca45-6295-44fe-a037-1fe6d0d10854 req-c7ae5120-3e11-43cf-875a-a4ab4e4af83a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.790 182939 DEBUG oslo_concurrency.lockutils [req-4af6ca45-6295-44fe-a037-1fe6d0d10854 req-c7ae5120-3e11-43cf-875a-a4ab4e4af83a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "8009ab5e-1bf8-4d17-8ff2-b62b25eeff20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.791 182939 DEBUG nova.compute.manager [req-4af6ca45-6295-44fe-a037-1fe6d0d10854 req-c7ae5120-3e11-43cf-875a-a4ab4e4af83a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] No waiting events found dispatching network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.791 182939 WARNING nova.compute.manager [req-4af6ca45-6295-44fe-a037-1fe6d0d10854 req-c7ae5120-3e11-43cf-875a-a4ab4e4af83a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 8009ab5e-1bf8-4d17-8ff2-b62b25eeff20] Received unexpected event network-vif-plugged-5965ccd1-7d75-4079-ade6-e1859a860162 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.813 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.819 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.820 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.820 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.821 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.822 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.823 182939 DEBUG nova.virt.libvirt.driver [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:02:38 compute-0 nova_compute[182935]: 2026-01-22 00:02:38.832 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:02:39 compute-0 nova_compute[182935]: 2026-01-22 00:02:39.248 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:02:39 compute-0 nova_compute[182935]: 2026-01-22 00:02:39.824 182939 INFO nova.compute.manager [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Took 15.35 seconds to spawn the instance on the hypervisor.
Jan 22 00:02:39 compute-0 nova_compute[182935]: 2026-01-22 00:02:39.825 182939 DEBUG nova.compute.manager [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:02:39 compute-0 nova_compute[182935]: 2026-01-22 00:02:39.907 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:40.653 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:02:40 compute-0 nova_compute[182935]: 2026-01-22 00:02:40.653 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:40.656 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:02:41 compute-0 nova_compute[182935]: 2026-01-22 00:02:41.521 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:41 compute-0 nova_compute[182935]: 2026-01-22 00:02:41.813 182939 DEBUG nova.compute.manager [req-c5c591e9-682b-48e4-9395-4ad2ea815bf9 req-3ec34b17-86ac-427d-a55c-cdd2f42a8c91 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received event network-vif-plugged-637a179d-ce7a-481e-ba08-5430bd76b13b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:41 compute-0 nova_compute[182935]: 2026-01-22 00:02:41.814 182939 DEBUG oslo_concurrency.lockutils [req-c5c591e9-682b-48e4-9395-4ad2ea815bf9 req-3ec34b17-86ac-427d-a55c-cdd2f42a8c91 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:41 compute-0 nova_compute[182935]: 2026-01-22 00:02:41.815 182939 DEBUG oslo_concurrency.lockutils [req-c5c591e9-682b-48e4-9395-4ad2ea815bf9 req-3ec34b17-86ac-427d-a55c-cdd2f42a8c91 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:41 compute-0 nova_compute[182935]: 2026-01-22 00:02:41.816 182939 DEBUG oslo_concurrency.lockutils [req-c5c591e9-682b-48e4-9395-4ad2ea815bf9 req-3ec34b17-86ac-427d-a55c-cdd2f42a8c91 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:41 compute-0 nova_compute[182935]: 2026-01-22 00:02:41.816 182939 DEBUG nova.compute.manager [req-c5c591e9-682b-48e4-9395-4ad2ea815bf9 req-3ec34b17-86ac-427d-a55c-cdd2f42a8c91 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] No waiting events found dispatching network-vif-plugged-637a179d-ce7a-481e-ba08-5430bd76b13b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:41 compute-0 nova_compute[182935]: 2026-01-22 00:02:41.817 182939 WARNING nova.compute.manager [req-c5c591e9-682b-48e4-9395-4ad2ea815bf9 req-3ec34b17-86ac-427d-a55c-cdd2f42a8c91 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received unexpected event network-vif-plugged-637a179d-ce7a-481e-ba08-5430bd76b13b for instance with vm_state active and task_state None.
Jan 22 00:02:42 compute-0 ovn_controller[95047]: 2026-01-22T00:02:42Z|00328|binding|INFO|Releasing lport eba1ad2c-eba9-413b-8318-773455c1e2ff from this chassis (sb_readonly=0)
Jan 22 00:02:42 compute-0 nova_compute[182935]: 2026-01-22 00:02:42.462 182939 INFO nova.compute.manager [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Took 24.87 seconds to build instance.
Jan 22 00:02:42 compute-0 nova_compute[182935]: 2026-01-22 00:02:42.475 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:42 compute-0 ovn_controller[95047]: 2026-01-22T00:02:42Z|00329|binding|INFO|Releasing lport eba1ad2c-eba9-413b-8318-773455c1e2ff from this chassis (sb_readonly=0)
Jan 22 00:02:42 compute-0 nova_compute[182935]: 2026-01-22 00:02:42.671 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:42 compute-0 podman[224954]: 2026-01-22 00:02:42.752834435 +0000 UTC m=+0.082425796 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:02:42 compute-0 podman[224952]: 2026-01-22 00:02:42.804284653 +0000 UTC m=+0.140210397 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:02:42 compute-0 nova_compute[182935]: 2026-01-22 00:02:42.869 182939 DEBUG oslo_concurrency.lockutils [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "interface-e216ca9d-2882-457b-955e-b7a7cd7213d2-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:42 compute-0 nova_compute[182935]: 2026-01-22 00:02:42.870 182939 DEBUG oslo_concurrency.lockutils [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "interface-e216ca9d-2882-457b-955e-b7a7cd7213d2-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:42 compute-0 nova_compute[182935]: 2026-01-22 00:02:42.870 182939 DEBUG nova.objects.instance [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lazy-loading 'flavor' on Instance uuid e216ca9d-2882-457b-955e-b7a7cd7213d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:43 compute-0 nova_compute[182935]: 2026-01-22 00:02:43.142 182939 DEBUG oslo_concurrency.lockutils [None req-d2926bee-0fd4-42c1-bdc4-f13b2b1c2724 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:43 compute-0 nova_compute[182935]: 2026-01-22 00:02:43.576 182939 DEBUG nova.objects.instance [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lazy-loading 'pci_requests' on Instance uuid e216ca9d-2882-457b-955e-b7a7cd7213d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:02:43 compute-0 nova_compute[182935]: 2026-01-22 00:02:43.919 182939 DEBUG nova.network.neutron [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:02:44 compute-0 nova_compute[182935]: 2026-01-22 00:02:44.909 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:45.659 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:46 compute-0 nova_compute[182935]: 2026-01-22 00:02:46.524 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:46 compute-0 nova_compute[182935]: 2026-01-22 00:02:46.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:47 compute-0 nova_compute[182935]: 2026-01-22 00:02:47.022 182939 DEBUG nova.policy [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cc334793f1e0484084ad779dd9ef0596', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1b63a2653b604354979ee32dbb6cd6c6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:02:48 compute-0 nova_compute[182935]: 2026-01-22 00:02:48.624 182939 DEBUG nova.network.neutron [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Successfully created port: ae98e2be-52c6-44c0-a8f4-4c279bedab2a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:02:49 compute-0 nova_compute[182935]: 2026-01-22 00:02:49.914 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:50 compute-0 nova_compute[182935]: 2026-01-22 00:02:50.150 182939 DEBUG nova.network.neutron [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Successfully updated port: ae98e2be-52c6-44c0-a8f4-4c279bedab2a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:02:50 compute-0 nova_compute[182935]: 2026-01-22 00:02:50.169 182939 DEBUG oslo_concurrency.lockutils [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:50 compute-0 nova_compute[182935]: 2026-01-22 00:02:50.169 182939 DEBUG oslo_concurrency.lockutils [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquired lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:50 compute-0 nova_compute[182935]: 2026-01-22 00:02:50.170 182939 DEBUG nova.network.neutron [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:02:50 compute-0 nova_compute[182935]: 2026-01-22 00:02:50.411 182939 DEBUG nova.compute.manager [req-e37490a0-a035-45d7-87e8-2c9bc85ed321 req-03a9e3af-ab9c-4ca6-9610-daa6dc1cf86f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received event network-changed-ae98e2be-52c6-44c0-a8f4-4c279bedab2a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:50 compute-0 nova_compute[182935]: 2026-01-22 00:02:50.412 182939 DEBUG nova.compute.manager [req-e37490a0-a035-45d7-87e8-2c9bc85ed321 req-03a9e3af-ab9c-4ca6-9610-daa6dc1cf86f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Refreshing instance network info cache due to event network-changed-ae98e2be-52c6-44c0-a8f4-4c279bedab2a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:02:50 compute-0 nova_compute[182935]: 2026-01-22 00:02:50.412 182939 DEBUG oslo_concurrency.lockutils [req-e37490a0-a035-45d7-87e8-2c9bc85ed321 req-03a9e3af-ab9c-4ca6-9610-daa6dc1cf86f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:02:50 compute-0 nova_compute[182935]: 2026-01-22 00:02:50.643 182939 WARNING nova.network.neutron [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] 2e6baf5c-6213-46a4-a17e-5ee389013511 already exists in list: networks containing: ['2e6baf5c-6213-46a4-a17e-5ee389013511']. ignoring it
Jan 22 00:02:50 compute-0 sshd-session[225015]: Invalid user git from 188.166.69.60 port 50150
Jan 22 00:02:51 compute-0 sshd-session[225015]: Connection closed by invalid user git 188.166.69.60 port 50150 [preauth]
Jan 22 00:02:51 compute-0 podman[225017]: 2026-01-22 00:02:51.049487376 +0000 UTC m=+0.077477165 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:02:51 compute-0 nova_compute[182935]: 2026-01-22 00:02:51.527 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:52 compute-0 ovn_controller[95047]: 2026-01-22T00:02:52Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:11:3b:f8 10.100.0.7
Jan 22 00:02:52 compute-0 ovn_controller[95047]: 2026-01-22T00:02:52Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:11:3b:f8 10.100.0.7
Jan 22 00:02:52 compute-0 nova_compute[182935]: 2026-01-22 00:02:52.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:02:52 compute-0 nova_compute[182935]: 2026-01-22 00:02:52.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:02:52 compute-0 nova_compute[182935]: 2026-01-22 00:02:52.910 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.179 182939 DEBUG nova.network.neutron [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Updating instance_info_cache with network_info: [{"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "address": "fa:16:3e:38:7a:e0", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae98e2be-52", "ovs_interfaceid": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.421 182939 DEBUG oslo_concurrency.lockutils [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Releasing lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.423 182939 DEBUG oslo_concurrency.lockutils [req-e37490a0-a035-45d7-87e8-2c9bc85ed321 req-03a9e3af-ab9c-4ca6-9610-daa6dc1cf86f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.423 182939 DEBUG nova.network.neutron [req-e37490a0-a035-45d7-87e8-2c9bc85ed321 req-03a9e3af-ab9c-4ca6-9610-daa6dc1cf86f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Refreshing network info cache for port ae98e2be-52c6-44c0-a8f4-4c279bedab2a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.426 182939 DEBUG nova.virt.libvirt.vif [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:02:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-803124597',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-803124597',id=87,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:02:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1b63a2653b604354979ee32dbb6cd6c6',ramdisk_id='',reservation_id='r-m6owog58',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-1840931641',owner_user_name='tempest-AttachInterfacesV270Test-1840931641-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:02:40Z,user_data=None,user_id='cc334793f1e0484084ad779dd9ef0596',uuid=e216ca9d-2882-457b-955e-b7a7cd7213d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "address": "fa:16:3e:38:7a:e0", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae98e2be-52", "ovs_interfaceid": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.427 182939 DEBUG nova.network.os_vif_util [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converting VIF {"id": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "address": "fa:16:3e:38:7a:e0", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae98e2be-52", "ovs_interfaceid": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.428 182939 DEBUG nova.network.os_vif_util [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:7a:e0,bridge_name='br-int',has_traffic_filtering=True,id=ae98e2be-52c6-44c0-a8f4-4c279bedab2a,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae98e2be-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.428 182939 DEBUG os_vif [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:7a:e0,bridge_name='br-int',has_traffic_filtering=True,id=ae98e2be-52c6-44c0-a8f4-4c279bedab2a,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae98e2be-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.429 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.429 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.430 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.434 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.435 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae98e2be-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.435 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapae98e2be-52, col_values=(('external_ids', {'iface-id': 'ae98e2be-52c6-44c0-a8f4-4c279bedab2a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:7a:e0', 'vm-uuid': 'e216ca9d-2882-457b-955e-b7a7cd7213d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.437 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.440 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:02:53 compute-0 NetworkManager[55139]: <info>  [1769040173.4403] manager: (tapae98e2be-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.449 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.451 182939 INFO os_vif [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:7a:e0,bridge_name='br-int',has_traffic_filtering=True,id=ae98e2be-52c6-44c0-a8f4-4c279bedab2a,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae98e2be-52')
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.452 182939 DEBUG nova.virt.libvirt.vif [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:02:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-803124597',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-803124597',id=87,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:02:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1b63a2653b604354979ee32dbb6cd6c6',ramdisk_id='',reservation_id='r-m6owog58',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-AttachInterfacesV270Test-1840931641',owner_user_name='tempest-AttachInterfacesV270Test-1840931641-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:02:40Z,user_data=None,user_id='cc334793f1e0484084ad779dd9ef0596',uuid=e216ca9d-2882-457b-955e-b7a7cd7213d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "address": "fa:16:3e:38:7a:e0", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae98e2be-52", "ovs_interfaceid": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.453 182939 DEBUG nova.network.os_vif_util [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converting VIF {"id": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "address": "fa:16:3e:38:7a:e0", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae98e2be-52", "ovs_interfaceid": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.454 182939 DEBUG nova.network.os_vif_util [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:7a:e0,bridge_name='br-int',has_traffic_filtering=True,id=ae98e2be-52c6-44c0-a8f4-4c279bedab2a,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae98e2be-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.457 182939 DEBUG nova.virt.libvirt.guest [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] attach device xml: <interface type="ethernet">
Jan 22 00:02:53 compute-0 nova_compute[182935]:   <mac address="fa:16:3e:38:7a:e0"/>
Jan 22 00:02:53 compute-0 nova_compute[182935]:   <model type="virtio"/>
Jan 22 00:02:53 compute-0 nova_compute[182935]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:02:53 compute-0 nova_compute[182935]:   <mtu size="1442"/>
Jan 22 00:02:53 compute-0 nova_compute[182935]:   <target dev="tapae98e2be-52"/>
Jan 22 00:02:53 compute-0 nova_compute[182935]: </interface>
Jan 22 00:02:53 compute-0 nova_compute[182935]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 22 00:02:53 compute-0 kernel: tapae98e2be-52: entered promiscuous mode
Jan 22 00:02:53 compute-0 NetworkManager[55139]: <info>  [1769040173.4789] manager: (tapae98e2be-52): new Tun device (/org/freedesktop/NetworkManager/Devices/151)
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.479 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:53 compute-0 ovn_controller[95047]: 2026-01-22T00:02:53Z|00330|binding|INFO|Claiming lport ae98e2be-52c6-44c0-a8f4-4c279bedab2a for this chassis.
Jan 22 00:02:53 compute-0 ovn_controller[95047]: 2026-01-22T00:02:53Z|00331|binding|INFO|ae98e2be-52c6-44c0-a8f4-4c279bedab2a: Claiming fa:16:3e:38:7a:e0 10.100.0.8
Jan 22 00:02:53 compute-0 ovn_controller[95047]: 2026-01-22T00:02:53Z|00332|binding|INFO|Setting lport ae98e2be-52c6-44c0-a8f4-4c279bedab2a ovn-installed in OVS
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.511 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:53 compute-0 systemd-udevd[225046]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.519 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:53 compute-0 NetworkManager[55139]: <info>  [1769040173.5372] device (tapae98e2be-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:02:53 compute-0 NetworkManager[55139]: <info>  [1769040173.5395] device (tapae98e2be-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:02:53 compute-0 ovn_controller[95047]: 2026-01-22T00:02:53Z|00333|binding|INFO|Setting lport ae98e2be-52c6-44c0-a8f4-4c279bedab2a up in Southbound
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.754 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:7a:e0 10.100.0.8'], port_security=['fa:16:3e:38:7a:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e216ca9d-2882-457b-955e-b7a7cd7213d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e6baf5c-6213-46a4-a17e-5ee389013511', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1b63a2653b604354979ee32dbb6cd6c6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '574903b5-96b8-417f-a4ae-c6746a9fa810', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e01e65b5-3b0b-4206-b1f8-00c526cd5319, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=ae98e2be-52c6-44c0-a8f4-4c279bedab2a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.756 104408 INFO neutron.agent.ovn.metadata.agent [-] Port ae98e2be-52c6-44c0-a8f4-4c279bedab2a in datapath 2e6baf5c-6213-46a4-a17e-5ee389013511 bound to our chassis
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.757 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2e6baf5c-6213-46a4-a17e-5ee389013511
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.788 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[02186c2e-7c0b-4c5f-b03e-5480e5faacb4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.843 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f01380dc-8130-4b15-bafd-48fc56b962d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.849 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[22479fed-697a-4c45-80f0-93d306b4f5c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.896 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[df35d0d6-09e1-47cb-b2b1-1a4f34610efc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.917 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[13ec67dd-9b9e-4d57-b168-8a64d1c30a3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2e6baf5c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:ea:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 101], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465991, 'reachable_time': 20295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225055, 'error': None, 'target': 'ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.935 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b71fa4c6-d769-478d-96aa-ec6daef842dd]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2e6baf5c-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466012, 'tstamp': 466012}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225056, 'error': None, 'target': 'ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2e6baf5c-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466017, 'tstamp': 466017}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225056, 'error': None, 'target': 'ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.937 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e6baf5c-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:53 compute-0 nova_compute[182935]: 2026-01-22 00:02:53.939 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.940 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e6baf5c-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.940 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.941 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2e6baf5c-60, col_values=(('external_ids', {'iface-id': 'eba1ad2c-eba9-413b-8318-773455c1e2ff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:02:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:02:53.941 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:02:54 compute-0 nova_compute[182935]: 2026-01-22 00:02:54.260 182939 DEBUG nova.virt.libvirt.driver [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:02:54 compute-0 nova_compute[182935]: 2026-01-22 00:02:54.261 182939 DEBUG nova.virt.libvirt.driver [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:02:54 compute-0 nova_compute[182935]: 2026-01-22 00:02:54.262 182939 DEBUG nova.virt.libvirt.driver [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] No VIF found with MAC fa:16:3e:11:3b:f8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:02:54 compute-0 nova_compute[182935]: 2026-01-22 00:02:54.262 182939 DEBUG nova.virt.libvirt.driver [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] No VIF found with MAC fa:16:3e:38:7a:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:02:54 compute-0 nova_compute[182935]: 2026-01-22 00:02:54.802 182939 DEBUG nova.virt.libvirt.guest [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:02:54 compute-0 nova_compute[182935]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:02:54 compute-0 nova_compute[182935]:   <nova:name>tempest-AttachInterfacesV270Test-server-803124597</nova:name>
Jan 22 00:02:54 compute-0 nova_compute[182935]:   <nova:creationTime>2026-01-22 00:02:54</nova:creationTime>
Jan 22 00:02:54 compute-0 nova_compute[182935]:   <nova:flavor name="m1.nano">
Jan 22 00:02:54 compute-0 nova_compute[182935]:     <nova:memory>128</nova:memory>
Jan 22 00:02:54 compute-0 nova_compute[182935]:     <nova:disk>1</nova:disk>
Jan 22 00:02:54 compute-0 nova_compute[182935]:     <nova:swap>0</nova:swap>
Jan 22 00:02:54 compute-0 nova_compute[182935]:     <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:02:54 compute-0 nova_compute[182935]:     <nova:vcpus>1</nova:vcpus>
Jan 22 00:02:54 compute-0 nova_compute[182935]:   </nova:flavor>
Jan 22 00:02:54 compute-0 nova_compute[182935]:   <nova:owner>
Jan 22 00:02:54 compute-0 nova_compute[182935]:     <nova:user uuid="cc334793f1e0484084ad779dd9ef0596">tempest-AttachInterfacesV270Test-1840931641-project-member</nova:user>
Jan 22 00:02:54 compute-0 nova_compute[182935]:     <nova:project uuid="1b63a2653b604354979ee32dbb6cd6c6">tempest-AttachInterfacesV270Test-1840931641</nova:project>
Jan 22 00:02:54 compute-0 nova_compute[182935]:   </nova:owner>
Jan 22 00:02:54 compute-0 nova_compute[182935]:   <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:02:54 compute-0 nova_compute[182935]:   <nova:ports>
Jan 22 00:02:54 compute-0 nova_compute[182935]:     <nova:port uuid="637a179d-ce7a-481e-ba08-5430bd76b13b">
Jan 22 00:02:54 compute-0 nova_compute[182935]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 00:02:54 compute-0 nova_compute[182935]:     </nova:port>
Jan 22 00:02:54 compute-0 nova_compute[182935]:     <nova:port uuid="ae98e2be-52c6-44c0-a8f4-4c279bedab2a">
Jan 22 00:02:54 compute-0 nova_compute[182935]:       <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 00:02:54 compute-0 nova_compute[182935]:     </nova:port>
Jan 22 00:02:54 compute-0 nova_compute[182935]:   </nova:ports>
Jan 22 00:02:54 compute-0 nova_compute[182935]: </nova:instance>
Jan 22 00:02:54 compute-0 nova_compute[182935]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 22 00:02:54 compute-0 ovn_controller[95047]: 2026-01-22T00:02:54Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:7a:e0 10.100.0.8
Jan 22 00:02:54 compute-0 ovn_controller[95047]: 2026-01-22T00:02:54Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:7a:e0 10.100.0.8
Jan 22 00:02:55 compute-0 nova_compute[182935]: 2026-01-22 00:02:55.204 182939 DEBUG oslo_concurrency.lockutils [None req-8df209fb-3293-490f-b26b-4a95e92ab689 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "interface-e216ca9d-2882-457b-955e-b7a7cd7213d2-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 12.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:55 compute-0 nova_compute[182935]: 2026-01-22 00:02:55.250 182939 DEBUG nova.network.neutron [req-e37490a0-a035-45d7-87e8-2c9bc85ed321 req-03a9e3af-ab9c-4ca6-9610-daa6dc1cf86f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Updated VIF entry in instance network info cache for port ae98e2be-52c6-44c0-a8f4-4c279bedab2a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:02:55 compute-0 nova_compute[182935]: 2026-01-22 00:02:55.251 182939 DEBUG nova.network.neutron [req-e37490a0-a035-45d7-87e8-2c9bc85ed321 req-03a9e3af-ab9c-4ca6-9610-daa6dc1cf86f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Updating instance_info_cache with network_info: [{"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "address": "fa:16:3e:38:7a:e0", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae98e2be-52", "ovs_interfaceid": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:02:55 compute-0 nova_compute[182935]: 2026-01-22 00:02:55.444 182939 DEBUG oslo_concurrency.lockutils [req-e37490a0-a035-45d7-87e8-2c9bc85ed321 req-03a9e3af-ab9c-4ca6-9610-daa6dc1cf86f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e216ca9d-2882-457b-955e-b7a7cd7213d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:02:55 compute-0 podman[225057]: 2026-01-22 00:02:55.735940301 +0000 UTC m=+0.094291240 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 00:02:56 compute-0 nova_compute[182935]: 2026-01-22 00:02:56.483 182939 DEBUG nova.compute.manager [req-ebc2ce18-5fa3-492f-b435-f57ddeea683e req-919ed9f6-8316-4792-921c-0e72c3b31cfb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received event network-vif-plugged-ae98e2be-52c6-44c0-a8f4-4c279bedab2a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:56 compute-0 nova_compute[182935]: 2026-01-22 00:02:56.484 182939 DEBUG oslo_concurrency.lockutils [req-ebc2ce18-5fa3-492f-b435-f57ddeea683e req-919ed9f6-8316-4792-921c-0e72c3b31cfb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:56 compute-0 nova_compute[182935]: 2026-01-22 00:02:56.484 182939 DEBUG oslo_concurrency.lockutils [req-ebc2ce18-5fa3-492f-b435-f57ddeea683e req-919ed9f6-8316-4792-921c-0e72c3b31cfb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:56 compute-0 nova_compute[182935]: 2026-01-22 00:02:56.484 182939 DEBUG oslo_concurrency.lockutils [req-ebc2ce18-5fa3-492f-b435-f57ddeea683e req-919ed9f6-8316-4792-921c-0e72c3b31cfb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:56 compute-0 nova_compute[182935]: 2026-01-22 00:02:56.485 182939 DEBUG nova.compute.manager [req-ebc2ce18-5fa3-492f-b435-f57ddeea683e req-919ed9f6-8316-4792-921c-0e72c3b31cfb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] No waiting events found dispatching network-vif-plugged-ae98e2be-52c6-44c0-a8f4-4c279bedab2a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:56 compute-0 nova_compute[182935]: 2026-01-22 00:02:56.485 182939 WARNING nova.compute.manager [req-ebc2ce18-5fa3-492f-b435-f57ddeea683e req-919ed9f6-8316-4792-921c-0e72c3b31cfb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received unexpected event network-vif-plugged-ae98e2be-52c6-44c0-a8f4-4c279bedab2a for instance with vm_state active and task_state None.
Jan 22 00:02:56 compute-0 nova_compute[182935]: 2026-01-22 00:02:56.530 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:58 compute-0 nova_compute[182935]: 2026-01-22 00:02:58.439 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:02:58 compute-0 nova_compute[182935]: 2026-01-22 00:02:58.911 182939 DEBUG nova.compute.manager [req-1da54d6e-7473-48ad-ac2e-3633e6938dde req-015667eb-5761-4240-b9c4-f45824ebdf7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received event network-vif-plugged-ae98e2be-52c6-44c0-a8f4-4c279bedab2a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:02:58 compute-0 nova_compute[182935]: 2026-01-22 00:02:58.912 182939 DEBUG oslo_concurrency.lockutils [req-1da54d6e-7473-48ad-ac2e-3633e6938dde req-015667eb-5761-4240-b9c4-f45824ebdf7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:58 compute-0 nova_compute[182935]: 2026-01-22 00:02:58.912 182939 DEBUG oslo_concurrency.lockutils [req-1da54d6e-7473-48ad-ac2e-3633e6938dde req-015667eb-5761-4240-b9c4-f45824ebdf7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:58 compute-0 nova_compute[182935]: 2026-01-22 00:02:58.913 182939 DEBUG oslo_concurrency.lockutils [req-1da54d6e-7473-48ad-ac2e-3633e6938dde req-015667eb-5761-4240-b9c4-f45824ebdf7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:58 compute-0 nova_compute[182935]: 2026-01-22 00:02:58.914 182939 DEBUG nova.compute.manager [req-1da54d6e-7473-48ad-ac2e-3633e6938dde req-015667eb-5761-4240-b9c4-f45824ebdf7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] No waiting events found dispatching network-vif-plugged-ae98e2be-52c6-44c0-a8f4-4c279bedab2a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:02:58 compute-0 nova_compute[182935]: 2026-01-22 00:02:58.914 182939 WARNING nova.compute.manager [req-1da54d6e-7473-48ad-ac2e-3633e6938dde req-015667eb-5761-4240-b9c4-f45824ebdf7d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received unexpected event network-vif-plugged-ae98e2be-52c6-44c0-a8f4-4c279bedab2a for instance with vm_state active and task_state deleting.
Jan 22 00:02:59 compute-0 nova_compute[182935]: 2026-01-22 00:02:59.035 182939 DEBUG oslo_concurrency.lockutils [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "e216ca9d-2882-457b-955e-b7a7cd7213d2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:59 compute-0 nova_compute[182935]: 2026-01-22 00:02:59.036 182939 DEBUG oslo_concurrency.lockutils [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:59 compute-0 nova_compute[182935]: 2026-01-22 00:02:59.037 182939 DEBUG oslo_concurrency.lockutils [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:02:59 compute-0 nova_compute[182935]: 2026-01-22 00:02:59.037 182939 DEBUG oslo_concurrency.lockutils [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:02:59 compute-0 nova_compute[182935]: 2026-01-22 00:02:59.037 182939 DEBUG oslo_concurrency.lockutils [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:02:59 compute-0 nova_compute[182935]: 2026-01-22 00:02:59.632 182939 INFO nova.compute.manager [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Terminating instance
Jan 22 00:03:00 compute-0 nova_compute[182935]: 2026-01-22 00:03:00.812 182939 DEBUG nova.compute.manager [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:03:00 compute-0 kernel: tap637a179d-ce (unregistering): left promiscuous mode
Jan 22 00:03:00 compute-0 NetworkManager[55139]: <info>  [1769040180.8415] device (tap637a179d-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:03:00 compute-0 nova_compute[182935]: 2026-01-22 00:03:00.855 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:00 compute-0 ovn_controller[95047]: 2026-01-22T00:03:00Z|00334|binding|INFO|Releasing lport 637a179d-ce7a-481e-ba08-5430bd76b13b from this chassis (sb_readonly=0)
Jan 22 00:03:00 compute-0 ovn_controller[95047]: 2026-01-22T00:03:00Z|00335|binding|INFO|Setting lport 637a179d-ce7a-481e-ba08-5430bd76b13b down in Southbound
Jan 22 00:03:00 compute-0 ovn_controller[95047]: 2026-01-22T00:03:00Z|00336|binding|INFO|Removing iface tap637a179d-ce ovn-installed in OVS
Jan 22 00:03:00 compute-0 nova_compute[182935]: 2026-01-22 00:03:00.858 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:00 compute-0 kernel: tapae98e2be-52 (unregistering): left promiscuous mode
Jan 22 00:03:00 compute-0 NetworkManager[55139]: <info>  [1769040180.8844] device (tapae98e2be-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:03:00 compute-0 nova_compute[182935]: 2026-01-22 00:03:00.887 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:00 compute-0 ovn_controller[95047]: 2026-01-22T00:03:00Z|00337|binding|INFO|Releasing lport ae98e2be-52c6-44c0-a8f4-4c279bedab2a from this chassis (sb_readonly=1)
Jan 22 00:03:00 compute-0 ovn_controller[95047]: 2026-01-22T00:03:00Z|00338|binding|INFO|Removing iface tapae98e2be-52 ovn-installed in OVS
Jan 22 00:03:00 compute-0 ovn_controller[95047]: 2026-01-22T00:03:00Z|00339|if_status|INFO|Dropped 2 log messages in last 885 seconds (most recently, 885 seconds ago) due to excessive rate
Jan 22 00:03:00 compute-0 ovn_controller[95047]: 2026-01-22T00:03:00Z|00340|if_status|INFO|Not setting lport ae98e2be-52c6-44c0-a8f4-4c279bedab2a down as sb is readonly
Jan 22 00:03:00 compute-0 nova_compute[182935]: 2026-01-22 00:03:00.902 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:00 compute-0 nova_compute[182935]: 2026-01-22 00:03:00.979 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:00 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000057.scope: Deactivated successfully.
Jan 22 00:03:01 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000057.scope: Consumed 14.519s CPU time.
Jan 22 00:03:01 compute-0 systemd-machined[154182]: Machine qemu-44-instance-00000057 terminated.
Jan 22 00:03:01 compute-0 NetworkManager[55139]: <info>  [1769040181.0544] manager: (tapae98e2be-52): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Jan 22 00:03:01 compute-0 ovn_controller[95047]: 2026-01-22T00:03:01Z|00341|binding|INFO|Setting lport ae98e2be-52c6-44c0-a8f4-4c279bedab2a down in Southbound
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.091 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:3b:f8 10.100.0.7'], port_security=['fa:16:3e:11:3b:f8 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e216ca9d-2882-457b-955e-b7a7cd7213d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e6baf5c-6213-46a4-a17e-5ee389013511', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1b63a2653b604354979ee32dbb6cd6c6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '574903b5-96b8-417f-a4ae-c6746a9fa810', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e01e65b5-3b0b-4206-b1f8-00c526cd5319, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=637a179d-ce7a-481e-ba08-5430bd76b13b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.092 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 637a179d-ce7a-481e-ba08-5430bd76b13b in datapath 2e6baf5c-6213-46a4-a17e-5ee389013511 unbound from our chassis
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.094 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2e6baf5c-6213-46a4-a17e-5ee389013511
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.100 182939 INFO nova.virt.libvirt.driver [-] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Instance destroyed successfully.
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.101 182939 DEBUG nova.objects.instance [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lazy-loading 'resources' on Instance uuid e216ca9d-2882-457b-955e-b7a7cd7213d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.112 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b037b7c8-b92a-4682-ada2-a273c1f9ae8c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.152 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[fa55190a-2918-49b5-9e81-1ffbeb18c2cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.160 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[781d8f65-1c1d-42e5-bd99-3f54d0a81c89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.210 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7d4fc794-9166-4c7a-b884-332895b52046]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.237 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a79da9-bb83-4737-a2e6-f16e4b729d73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2e6baf5c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:ea:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 101], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465991, 'reachable_time': 20295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225122, 'error': None, 'target': 'ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.267 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6b4b1135-e48a-45d2-bfd5-e3d0567ac80d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2e6baf5c-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466012, 'tstamp': 466012}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225123, 'error': None, 'target': 'ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2e6baf5c-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466017, 'tstamp': 466017}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225123, 'error': None, 'target': 'ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.270 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e6baf5c-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.272 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.284 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.285 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e6baf5c-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.285 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.286 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2e6baf5c-60, col_values=(('external_ids', {'iface-id': 'eba1ad2c-eba9-413b-8318-773455c1e2ff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.286 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.532 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.706 182939 DEBUG nova.virt.libvirt.vif [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:02:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-803124597',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-803124597',id=87,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:02:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1b63a2653b604354979ee32dbb6cd6c6',ramdisk_id='',reservation_id='r-m6owog58',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project
_name='tempest-AttachInterfacesV270Test-1840931641',owner_user_name='tempest-AttachInterfacesV270Test-1840931641-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:02:40Z,user_data=None,user_id='cc334793f1e0484084ad779dd9ef0596',uuid=e216ca9d-2882-457b-955e-b7a7cd7213d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.707 182939 DEBUG nova.network.os_vif_util [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converting VIF {"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.708 182939 DEBUG nova.network.os_vif_util [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:3b:f8,bridge_name='br-int',has_traffic_filtering=True,id=637a179d-ce7a-481e-ba08-5430bd76b13b,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637a179d-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.708 182939 DEBUG os_vif [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:3b:f8,bridge_name='br-int',has_traffic_filtering=True,id=637a179d-ce7a-481e-ba08-5430bd76b13b,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637a179d-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.710 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.710 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap637a179d-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.712 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.715 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.720 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.723 182939 INFO os_vif [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:3b:f8,bridge_name='br-int',has_traffic_filtering=True,id=637a179d-ce7a-481e-ba08-5430bd76b13b,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap637a179d-ce')
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.724 182939 DEBUG nova.virt.libvirt.vif [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:02:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-803124597',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-803124597',id=87,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:02:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1b63a2653b604354979ee32dbb6cd6c6',ramdisk_id='',reservation_id='r-m6owog58',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project
_name='tempest-AttachInterfacesV270Test-1840931641',owner_user_name='tempest-AttachInterfacesV270Test-1840931641-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:02:40Z,user_data=None,user_id='cc334793f1e0484084ad779dd9ef0596',uuid=e216ca9d-2882-457b-955e-b7a7cd7213d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "address": "fa:16:3e:38:7a:e0", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae98e2be-52", "ovs_interfaceid": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.724 182939 DEBUG nova.network.os_vif_util [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converting VIF {"id": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "address": "fa:16:3e:38:7a:e0", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae98e2be-52", "ovs_interfaceid": "ae98e2be-52c6-44c0-a8f4-4c279bedab2a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.725 182939 DEBUG nova.network.os_vif_util [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:7a:e0,bridge_name='br-int',has_traffic_filtering=True,id=ae98e2be-52c6-44c0-a8f4-4c279bedab2a,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae98e2be-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.725 182939 DEBUG os_vif [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:7a:e0,bridge_name='br-int',has_traffic_filtering=True,id=ae98e2be-52c6-44c0-a8f4-4c279bedab2a,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae98e2be-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.727 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.727 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae98e2be-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.729 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.731 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.733 182939 INFO os_vif [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:7a:e0,bridge_name='br-int',has_traffic_filtering=True,id=ae98e2be-52c6-44c0-a8f4-4c279bedab2a,network=Network(2e6baf5c-6213-46a4-a17e-5ee389013511),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae98e2be-52')
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.734 182939 INFO nova.virt.libvirt.driver [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Deleting instance files /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2_del
Jan 22 00:03:01 compute-0 nova_compute[182935]: 2026-01-22 00:03:01.735 182939 INFO nova.virt.libvirt.driver [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Deletion of /var/lib/nova/instances/e216ca9d-2882-457b-955e-b7a7cd7213d2_del complete
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.838 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:7a:e0 10.100.0.8'], port_security=['fa:16:3e:38:7a:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e216ca9d-2882-457b-955e-b7a7cd7213d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e6baf5c-6213-46a4-a17e-5ee389013511', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1b63a2653b604354979ee32dbb6cd6c6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '574903b5-96b8-417f-a4ae-c6746a9fa810', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e01e65b5-3b0b-4206-b1f8-00c526cd5319, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=ae98e2be-52c6-44c0-a8f4-4c279bedab2a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.840 104408 INFO neutron.agent.ovn.metadata.agent [-] Port ae98e2be-52c6-44c0-a8f4-4c279bedab2a in datapath 2e6baf5c-6213-46a4-a17e-5ee389013511 unbound from our chassis
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.842 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2e6baf5c-6213-46a4-a17e-5ee389013511, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.843 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[edcb923c-3ecc-441b-9450-c139ae6fd198]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:01.844 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511 namespace which is not needed anymore
Jan 22 00:03:02 compute-0 neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511[224936]: [NOTICE]   (224940) : haproxy version is 2.8.14-c23fe91
Jan 22 00:03:02 compute-0 neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511[224936]: [NOTICE]   (224940) : path to executable is /usr/sbin/haproxy
Jan 22 00:03:02 compute-0 neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511[224936]: [WARNING]  (224940) : Exiting Master process...
Jan 22 00:03:02 compute-0 neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511[224936]: [ALERT]    (224940) : Current worker (224942) exited with code 143 (Terminated)
Jan 22 00:03:02 compute-0 neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511[224936]: [WARNING]  (224940) : All workers exited. Exiting... (0)
Jan 22 00:03:02 compute-0 systemd[1]: libpod-454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673.scope: Deactivated successfully.
Jan 22 00:03:02 compute-0 conmon[224936]: conmon 454b35e585661cd0e0c1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673.scope/container/memory.events
Jan 22 00:03:02 compute-0 podman[225145]: 2026-01-22 00:03:02.064582944 +0000 UTC m=+0.064087748 container died 454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:03:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673-userdata-shm.mount: Deactivated successfully.
Jan 22 00:03:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-48a9b2bf2b3e64a2a0622c5492d22212a875a9f82e6627f326e3d0bfe8498871-merged.mount: Deactivated successfully.
Jan 22 00:03:02 compute-0 podman[225145]: 2026-01-22 00:03:02.113832568 +0000 UTC m=+0.113337372 container cleanup 454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:03:02 compute-0 systemd[1]: libpod-conmon-454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673.scope: Deactivated successfully.
Jan 22 00:03:02 compute-0 podman[225174]: 2026-01-22 00:03:02.210023445 +0000 UTC m=+0.066099108 container remove 454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:03:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:02.217 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[36941f92-6262-4a9a-a660-94aa23ed62c8]: (4, ('Thu Jan 22 12:03:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511 (454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673)\n454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673\nThu Jan 22 12:03:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511 (454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673)\n454b35e585661cd0e0c16f584b80d53d4269fe2459d84f05e2d81a78086cd673\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:02.221 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e610b3d2-22be-4cc3-8e28-9261cba8858e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:02.223 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e6baf5c-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:03:02 compute-0 nova_compute[182935]: 2026-01-22 00:03:02.247 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:02 compute-0 kernel: tap2e6baf5c-60: left promiscuous mode
Jan 22 00:03:02 compute-0 nova_compute[182935]: 2026-01-22 00:03:02.272 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:02 compute-0 nova_compute[182935]: 2026-01-22 00:03:02.274 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:02.278 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd3f218-9c39-4cba-b739-3e0d449e88d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:02.296 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f85ef199-2b0d-49b9-93f9-a7452827a928]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:02.298 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[10f7dab2-4119-45be-a970-c66f689b314a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:02.319 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4ccfdb38-8c26-4e6e-80f8-e51997861c93]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 465978, 'reachable_time': 21851, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225193, 'error': None, 'target': 'ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:02.324 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2e6baf5c-6213-46a4-a17e-5ee389013511 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:03:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:02.324 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[2671eac0-3b12-46fa-a269-aeb9df981dce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:03:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d2e6baf5c\x2d6213\x2d46a4\x2da17e\x2d5ee389013511.mount: Deactivated successfully.
Jan 22 00:03:02 compute-0 podman[225191]: 2026-01-22 00:03:02.396716423 +0000 UTC m=+0.073172154 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 22 00:03:02 compute-0 podman[225189]: 2026-01-22 00:03:02.422925378 +0000 UTC m=+0.112189724 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 00:03:02 compute-0 nova_compute[182935]: 2026-01-22 00:03:02.995 182939 INFO nova.compute.manager [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Took 2.18 seconds to destroy the instance on the hypervisor.
Jan 22 00:03:02 compute-0 nova_compute[182935]: 2026-01-22 00:03:02.996 182939 DEBUG oslo.service.loopingcall [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:03:02 compute-0 nova_compute[182935]: 2026-01-22 00:03:02.997 182939 DEBUG nova.compute.manager [-] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:03:02 compute-0 nova_compute[182935]: 2026-01-22 00:03:02.997 182939 DEBUG nova.network.neutron [-] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:03:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:03.197 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:03.198 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:03.198 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:04 compute-0 nova_compute[182935]: 2026-01-22 00:03:04.368 182939 DEBUG nova.compute.manager [req-fe5aeb85-d329-436c-b81c-50867f7fc176 req-f4a825e8-8d52-4d07-89b3-0534d6f5131e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received event network-vif-unplugged-637a179d-ce7a-481e-ba08-5430bd76b13b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:03:04 compute-0 nova_compute[182935]: 2026-01-22 00:03:04.369 182939 DEBUG oslo_concurrency.lockutils [req-fe5aeb85-d329-436c-b81c-50867f7fc176 req-f4a825e8-8d52-4d07-89b3-0534d6f5131e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:04 compute-0 nova_compute[182935]: 2026-01-22 00:03:04.369 182939 DEBUG oslo_concurrency.lockutils [req-fe5aeb85-d329-436c-b81c-50867f7fc176 req-f4a825e8-8d52-4d07-89b3-0534d6f5131e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:04 compute-0 nova_compute[182935]: 2026-01-22 00:03:04.370 182939 DEBUG oslo_concurrency.lockutils [req-fe5aeb85-d329-436c-b81c-50867f7fc176 req-f4a825e8-8d52-4d07-89b3-0534d6f5131e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:04 compute-0 nova_compute[182935]: 2026-01-22 00:03:04.370 182939 DEBUG nova.compute.manager [req-fe5aeb85-d329-436c-b81c-50867f7fc176 req-f4a825e8-8d52-4d07-89b3-0534d6f5131e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] No waiting events found dispatching network-vif-unplugged-637a179d-ce7a-481e-ba08-5430bd76b13b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:03:04 compute-0 nova_compute[182935]: 2026-01-22 00:03:04.370 182939 DEBUG nova.compute.manager [req-fe5aeb85-d329-436c-b81c-50867f7fc176 req-f4a825e8-8d52-4d07-89b3-0534d6f5131e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received event network-vif-unplugged-637a179d-ce7a-481e-ba08-5430bd76b13b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:03:05 compute-0 nova_compute[182935]: 2026-01-22 00:03:05.879 182939 DEBUG nova.compute.manager [req-d838449b-cf57-450b-9bc0-e21169a915ee req-96bfeb11-f47c-4de3-88e5-1bce76da94a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received event network-vif-deleted-ae98e2be-52c6-44c0-a8f4-4c279bedab2a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:03:05 compute-0 nova_compute[182935]: 2026-01-22 00:03:05.879 182939 INFO nova.compute.manager [req-d838449b-cf57-450b-9bc0-e21169a915ee req-96bfeb11-f47c-4de3-88e5-1bce76da94a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Neutron deleted interface ae98e2be-52c6-44c0-a8f4-4c279bedab2a; detaching it from the instance and deleting it from the info cache
Jan 22 00:03:05 compute-0 nova_compute[182935]: 2026-01-22 00:03:05.880 182939 DEBUG nova.network.neutron [req-d838449b-cf57-450b-9bc0-e21169a915ee req-96bfeb11-f47c-4de3-88e5-1bce76da94a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Updating instance_info_cache with network_info: [{"id": "637a179d-ce7a-481e-ba08-5430bd76b13b", "address": "fa:16:3e:11:3b:f8", "network": {"id": "2e6baf5c-6213-46a4-a17e-5ee389013511", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1897775396-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1b63a2653b604354979ee32dbb6cd6c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap637a179d-ce", "ovs_interfaceid": "637a179d-ce7a-481e-ba08-5430bd76b13b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:03:06 compute-0 nova_compute[182935]: 2026-01-22 00:03:06.186 182939 DEBUG nova.compute.manager [req-d838449b-cf57-450b-9bc0-e21169a915ee req-96bfeb11-f47c-4de3-88e5-1bce76da94a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Detach interface failed, port_id=ae98e2be-52c6-44c0-a8f4-4c279bedab2a, reason: Instance e216ca9d-2882-457b-955e-b7a7cd7213d2 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:03:06 compute-0 nova_compute[182935]: 2026-01-22 00:03:06.534 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:06 compute-0 nova_compute[182935]: 2026-01-22 00:03:06.729 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:06 compute-0 nova_compute[182935]: 2026-01-22 00:03:06.740 182939 DEBUG nova.compute.manager [req-e8a4c945-ff4d-4c0d-bfff-1f611e9bb498 req-0ddea156-b69e-4c95-b6f0-2726e713d607 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received event network-vif-plugged-637a179d-ce7a-481e-ba08-5430bd76b13b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:03:06 compute-0 nova_compute[182935]: 2026-01-22 00:03:06.740 182939 DEBUG oslo_concurrency.lockutils [req-e8a4c945-ff4d-4c0d-bfff-1f611e9bb498 req-0ddea156-b69e-4c95-b6f0-2726e713d607 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:06 compute-0 nova_compute[182935]: 2026-01-22 00:03:06.741 182939 DEBUG oslo_concurrency.lockutils [req-e8a4c945-ff4d-4c0d-bfff-1f611e9bb498 req-0ddea156-b69e-4c95-b6f0-2726e713d607 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:06 compute-0 nova_compute[182935]: 2026-01-22 00:03:06.741 182939 DEBUG oslo_concurrency.lockutils [req-e8a4c945-ff4d-4c0d-bfff-1f611e9bb498 req-0ddea156-b69e-4c95-b6f0-2726e713d607 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:06 compute-0 nova_compute[182935]: 2026-01-22 00:03:06.741 182939 DEBUG nova.compute.manager [req-e8a4c945-ff4d-4c0d-bfff-1f611e9bb498 req-0ddea156-b69e-4c95-b6f0-2726e713d607 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] No waiting events found dispatching network-vif-plugged-637a179d-ce7a-481e-ba08-5430bd76b13b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:03:06 compute-0 nova_compute[182935]: 2026-01-22 00:03:06.741 182939 WARNING nova.compute.manager [req-e8a4c945-ff4d-4c0d-bfff-1f611e9bb498 req-0ddea156-b69e-4c95-b6f0-2726e713d607 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received unexpected event network-vif-plugged-637a179d-ce7a-481e-ba08-5430bd76b13b for instance with vm_state active and task_state deleting.
Jan 22 00:03:07 compute-0 nova_compute[182935]: 2026-01-22 00:03:07.078 182939 DEBUG nova.network.neutron [-] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:03:07 compute-0 nova_compute[182935]: 2026-01-22 00:03:07.335 182939 INFO nova.compute.manager [-] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Took 4.34 seconds to deallocate network for instance.
Jan 22 00:03:08 compute-0 nova_compute[182935]: 2026-01-22 00:03:08.019 182939 DEBUG nova.compute.manager [req-9fb7da07-76e8-4fca-bd74-9df31e154793 req-43e60db3-66dd-4fb7-af2e-8712d8e97a24 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Received event network-vif-deleted-637a179d-ce7a-481e-ba08-5430bd76b13b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:03:08 compute-0 nova_compute[182935]: 2026-01-22 00:03:08.069 182939 DEBUG oslo_concurrency.lockutils [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:08 compute-0 nova_compute[182935]: 2026-01-22 00:03:08.070 182939 DEBUG oslo_concurrency.lockutils [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:08 compute-0 nova_compute[182935]: 2026-01-22 00:03:08.373 182939 DEBUG nova.compute.provider_tree [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:03:08 compute-0 nova_compute[182935]: 2026-01-22 00:03:08.479 182939 DEBUG nova.scheduler.client.report [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:03:08 compute-0 nova_compute[182935]: 2026-01-22 00:03:08.660 182939 DEBUG oslo_concurrency.lockutils [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:08 compute-0 nova_compute[182935]: 2026-01-22 00:03:08.763 182939 INFO nova.scheduler.client.report [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Deleted allocations for instance e216ca9d-2882-457b-955e-b7a7cd7213d2
Jan 22 00:03:09 compute-0 nova_compute[182935]: 2026-01-22 00:03:09.450 182939 DEBUG oslo_concurrency.lockutils [None req-277c8f8e-de42-40cd-ad74-6fa7e39a7dc6 cc334793f1e0484084ad779dd9ef0596 1b63a2653b604354979ee32dbb6cd6c6 - - default default] Lock "e216ca9d-2882-457b-955e-b7a7cd7213d2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.414s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:11 compute-0 nova_compute[182935]: 2026-01-22 00:03:11.537 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:11 compute-0 nova_compute[182935]: 2026-01-22 00:03:11.732 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:13 compute-0 podman[225235]: 2026-01-22 00:03:13.677200164 +0000 UTC m=+0.049632372 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:03:13 compute-0 podman[225234]: 2026-01-22 00:03:13.719495417 +0000 UTC m=+0.092957891 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:03:15 compute-0 nova_compute[182935]: 2026-01-22 00:03:15.316 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:16 compute-0 nova_compute[182935]: 2026-01-22 00:03:16.099 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040181.096761, e216ca9d-2882-457b-955e-b7a7cd7213d2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:03:16 compute-0 nova_compute[182935]: 2026-01-22 00:03:16.099 182939 INFO nova.compute.manager [-] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] VM Stopped (Lifecycle Event)
Jan 22 00:03:16 compute-0 nova_compute[182935]: 2026-01-22 00:03:16.218 182939 DEBUG nova.compute.manager [None req-956cb428-7f7f-4e07-8019-adeec9f36d8d - - - - - -] [instance: e216ca9d-2882-457b-955e-b7a7cd7213d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:03:16 compute-0 nova_compute[182935]: 2026-01-22 00:03:16.539 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:16 compute-0 nova_compute[182935]: 2026-01-22 00:03:16.769 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:19.278 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:03:19 compute-0 nova_compute[182935]: 2026-01-22 00:03:19.278 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:19.280 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:03:21 compute-0 nova_compute[182935]: 2026-01-22 00:03:21.543 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:21 compute-0 podman[225282]: 2026-01-22 00:03:21.6835197 +0000 UTC m=+0.056319408 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:03:21 compute-0 nova_compute[182935]: 2026-01-22 00:03:21.772 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:26 compute-0 nova_compute[182935]: 2026-01-22 00:03:26.545 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:26 compute-0 podman[225306]: 2026-01-22 00:03:26.718484513 +0000 UTC m=+0.082727909 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 22 00:03:26 compute-0 nova_compute[182935]: 2026-01-22 00:03:26.774 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:28 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:28.283 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:03:28 compute-0 nova_compute[182935]: 2026-01-22 00:03:28.912 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:28 compute-0 nova_compute[182935]: 2026-01-22 00:03:28.913 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:03:28 compute-0 nova_compute[182935]: 2026-01-22 00:03:28.913 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:03:30 compute-0 nova_compute[182935]: 2026-01-22 00:03:30.523 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:03:31 compute-0 nova_compute[182935]: 2026-01-22 00:03:31.548 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:31 compute-0 nova_compute[182935]: 2026-01-22 00:03:31.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:31 compute-0 nova_compute[182935]: 2026-01-22 00:03:31.814 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:32 compute-0 podman[225327]: 2026-01-22 00:03:32.675693222 +0000 UTC m=+0.053927078 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Jan 22 00:03:32 compute-0 podman[225328]: 2026-01-22 00:03:32.703104338 +0000 UTC m=+0.079105489 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:03:32 compute-0 nova_compute[182935]: 2026-01-22 00:03:32.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:32 compute-0 nova_compute[182935]: 2026-01-22 00:03:32.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:03:32 compute-0 nova_compute[182935]: 2026-01-22 00:03:32.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:34 compute-0 sshd-session[225369]: Invalid user git from 188.166.69.60 port 41066
Jan 22 00:03:34 compute-0 sshd-session[225369]: Connection closed by invalid user git 188.166.69.60 port 41066 [preauth]
Jan 22 00:03:36 compute-0 nova_compute[182935]: 2026-01-22 00:03:36.550 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:36 compute-0 nova_compute[182935]: 2026-01-22 00:03:36.581 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:36 compute-0 nova_compute[182935]: 2026-01-22 00:03:36.582 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:36 compute-0 nova_compute[182935]: 2026-01-22 00:03:36.582 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:36 compute-0 nova_compute[182935]: 2026-01-22 00:03:36.583 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:03:36 compute-0 nova_compute[182935]: 2026-01-22 00:03:36.788 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:03:36 compute-0 nova_compute[182935]: 2026-01-22 00:03:36.790 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5714MB free_disk=73.20233917236328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:03:36 compute-0 nova_compute[182935]: 2026-01-22 00:03:36.790 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:36 compute-0 nova_compute[182935]: 2026-01-22 00:03:36.791 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:36 compute-0 nova_compute[182935]: 2026-01-22 00:03:36.815 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:41 compute-0 nova_compute[182935]: 2026-01-22 00:03:41.554 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:41 compute-0 nova_compute[182935]: 2026-01-22 00:03:41.817 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:44 compute-0 nova_compute[182935]: 2026-01-22 00:03:44.364 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:03:44 compute-0 nova_compute[182935]: 2026-01-22 00:03:44.364 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:03:44 compute-0 nova_compute[182935]: 2026-01-22 00:03:44.451 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:03:44 compute-0 podman[225374]: 2026-01-22 00:03:44.699881457 +0000 UTC m=+0.068413865 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:03:44 compute-0 podman[225373]: 2026-01-22 00:03:44.763061593 +0000 UTC m=+0.124857785 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:03:46 compute-0 nova_compute[182935]: 2026-01-22 00:03:46.556 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:46 compute-0 nova_compute[182935]: 2026-01-22 00:03:46.819 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:47 compute-0 nova_compute[182935]: 2026-01-22 00:03:47.285 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:03:47 compute-0 nova_compute[182935]: 2026-01-22 00:03:47.767 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:03:47 compute-0 nova_compute[182935]: 2026-01-22 00:03:47.768 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 10.977s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:48 compute-0 nova_compute[182935]: 2026-01-22 00:03:48.768 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:48 compute-0 nova_compute[182935]: 2026-01-22 00:03:48.769 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:48 compute-0 ovn_controller[95047]: 2026-01-22T00:03:48Z|00342|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 00:03:51 compute-0 nova_compute[182935]: 2026-01-22 00:03:51.559 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:51 compute-0 nova_compute[182935]: 2026-01-22 00:03:51.855 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:51 compute-0 nova_compute[182935]: 2026-01-22 00:03:51.972 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:51 compute-0 nova_compute[182935]: 2026-01-22 00:03:51.973 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:51 compute-0 nova_compute[182935]: 2026-01-22 00:03:51.973 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:51 compute-0 nova_compute[182935]: 2026-01-22 00:03:51.973 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:03:52 compute-0 podman[225422]: 2026-01-22 00:03:52.690013124 +0000 UTC m=+0.064551930 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:03:56 compute-0 nova_compute[182935]: 2026-01-22 00:03:56.453 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Acquiring lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:56 compute-0 nova_compute[182935]: 2026-01-22 00:03:56.453 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:56 compute-0 nova_compute[182935]: 2026-01-22 00:03:56.533 182939 DEBUG nova.compute.manager [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:03:56 compute-0 nova_compute[182935]: 2026-01-22 00:03:56.559 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:56 compute-0 nova_compute[182935]: 2026-01-22 00:03:56.747 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:56 compute-0 nova_compute[182935]: 2026-01-22 00:03:56.748 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:56 compute-0 nova_compute[182935]: 2026-01-22 00:03:56.756 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:03:56 compute-0 nova_compute[182935]: 2026-01-22 00:03:56.757 182939 INFO nova.compute.claims [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:03:56 compute-0 nova_compute[182935]: 2026-01-22 00:03:56.857 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:56.986 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:03:56 compute-0 nova_compute[182935]: 2026-01-22 00:03:56.987 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:03:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:56.987 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.475 182939 DEBUG nova.compute.provider_tree [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.498 182939 DEBUG nova.scheduler.client.report [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.537 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.789s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.538 182939 DEBUG nova.compute.manager [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.663 182939 DEBUG nova.compute.manager [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.663 182939 DEBUG nova.network.neutron [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:03:57 compute-0 podman[225446]: 2026-01-22 00:03:57.684829168 +0000 UTC m=+0.056584813 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.696 182939 INFO nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.715 182939 DEBUG nova.compute.manager [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.994 182939 DEBUG nova.compute.manager [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.996 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.996 182939 INFO nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Creating image(s)
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.997 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Acquiring lock "/var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.997 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "/var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:57 compute-0 nova_compute[182935]: 2026-01-22 00:03:57.998 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "/var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.010 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.057 182939 DEBUG nova.policy [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aef36a0345de4d4ba834c68026792663', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd015b8945f149928b5915953637c812', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.073 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.074 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.074 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.085 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.148 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.149 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.184 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.185 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.185 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.242 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.243 182939 DEBUG nova.virt.disk.api [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Checking if we can resize image /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.243 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.304 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.305 182939 DEBUG nova.virt.disk.api [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Cannot resize image /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.305 182939 DEBUG nova.objects.instance [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lazy-loading 'migration_context' on Instance uuid d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.329 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.330 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Ensure instance console log exists: /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.330 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.331 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:03:58 compute-0 nova_compute[182935]: 2026-01-22 00:03:58.331 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:03:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:03:59.988 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:00 compute-0 nova_compute[182935]: 2026-01-22 00:04:00.361 182939 DEBUG nova.network.neutron [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Successfully created port: 2e340635-02cb-45e6-ac92-5fe5fe003dec _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:04:01 compute-0 nova_compute[182935]: 2026-01-22 00:04:01.561 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:01 compute-0 nova_compute[182935]: 2026-01-22 00:04:01.859 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:02 compute-0 nova_compute[182935]: 2026-01-22 00:04:02.105 182939 DEBUG nova.network.neutron [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Successfully updated port: 2e340635-02cb-45e6-ac92-5fe5fe003dec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:04:02 compute-0 nova_compute[182935]: 2026-01-22 00:04:02.166 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Acquiring lock "refresh_cache-d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:04:02 compute-0 nova_compute[182935]: 2026-01-22 00:04:02.166 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Acquired lock "refresh_cache-d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:04:02 compute-0 nova_compute[182935]: 2026-01-22 00:04:02.167 182939 DEBUG nova.network.neutron [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:04:02 compute-0 nova_compute[182935]: 2026-01-22 00:04:02.430 182939 DEBUG nova.network.neutron [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:04:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:03.198 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:03.199 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:03.199 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:03 compute-0 podman[225480]: 2026-01-22 00:04:03.686670199 +0000 UTC m=+0.059293062 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Jan 22 00:04:03 compute-0 podman[225481]: 2026-01-22 00:04:03.687839698 +0000 UTC m=+0.057630751 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.450 182939 DEBUG nova.network.neutron [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Updating instance_info_cache with network_info: [{"id": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "address": "fa:16:3e:b8:1e:10", "network": {"id": "f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-2018629590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd015b8945f149928b5915953637c812", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e340635-02", "ovs_interfaceid": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.460 182939 DEBUG nova.compute.manager [req-c9c0fc63-b815-4efd-a775-e1456983f326 req-9ede2760-8425-4148-8263-be5f6826c535 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Received event network-changed-2e340635-02cb-45e6-ac92-5fe5fe003dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.460 182939 DEBUG nova.compute.manager [req-c9c0fc63-b815-4efd-a775-e1456983f326 req-9ede2760-8425-4148-8263-be5f6826c535 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Refreshing instance network info cache due to event network-changed-2e340635-02cb-45e6-ac92-5fe5fe003dec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.461 182939 DEBUG oslo_concurrency.lockutils [req-c9c0fc63-b815-4efd-a775-e1456983f326 req-9ede2760-8425-4148-8263-be5f6826c535 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.483 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Releasing lock "refresh_cache-d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.484 182939 DEBUG nova.compute.manager [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Instance network_info: |[{"id": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "address": "fa:16:3e:b8:1e:10", "network": {"id": "f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-2018629590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd015b8945f149928b5915953637c812", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e340635-02", "ovs_interfaceid": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.484 182939 DEBUG oslo_concurrency.lockutils [req-c9c0fc63-b815-4efd-a775-e1456983f326 req-9ede2760-8425-4148-8263-be5f6826c535 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.484 182939 DEBUG nova.network.neutron [req-c9c0fc63-b815-4efd-a775-e1456983f326 req-9ede2760-8425-4148-8263-be5f6826c535 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Refreshing network info cache for port 2e340635-02cb-45e6-ac92-5fe5fe003dec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.486 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Start _get_guest_xml network_info=[{"id": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "address": "fa:16:3e:b8:1e:10", "network": {"id": "f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-2018629590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd015b8945f149928b5915953637c812", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e340635-02", "ovs_interfaceid": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.491 182939 WARNING nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.503 182939 DEBUG nova.virt.libvirt.host [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.503 182939 DEBUG nova.virt.libvirt.host [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.507 182939 DEBUG nova.virt.libvirt.host [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.508 182939 DEBUG nova.virt.libvirt.host [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.509 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.509 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.510 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.510 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.510 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.510 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.510 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.511 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.511 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.511 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.511 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.512 182939 DEBUG nova.virt.hardware [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.515 182939 DEBUG nova.virt.libvirt.vif [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:03:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-1279263803',display_name='tempest-NoVNCConsoleTestJSON-server-1279263803',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-1279263803',id=89,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd015b8945f149928b5915953637c812',ramdisk_id='',reservation_id='r-9i3br92f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-55272866',owner_user_name='tempest-NoVNCConsoleTestJSON-55272866-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:03:57Z,user_data=None,user_id='aef36a0345de4d4ba834c68026792663',uuid=d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "address": "fa:16:3e:b8:1e:10", "network": {"id": "f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-2018629590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd015b8945f149928b5915953637c812", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e340635-02", "ovs_interfaceid": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.515 182939 DEBUG nova.network.os_vif_util [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Converting VIF {"id": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "address": "fa:16:3e:b8:1e:10", "network": {"id": "f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-2018629590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd015b8945f149928b5915953637c812", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e340635-02", "ovs_interfaceid": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.516 182939 DEBUG nova.network.os_vif_util [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=2e340635-02cb-45e6-ac92-5fe5fe003dec,network=Network(f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e340635-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.517 182939 DEBUG nova.objects.instance [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lazy-loading 'pci_devices' on Instance uuid d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.531 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:04:04 compute-0 nova_compute[182935]:   <uuid>d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d</uuid>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   <name>instance-00000059</name>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <nova:name>tempest-NoVNCConsoleTestJSON-server-1279263803</nova:name>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:04:04</nova:creationTime>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:04:04 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:04:04 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:04:04 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:04:04 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:04:04 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:04:04 compute-0 nova_compute[182935]:         <nova:user uuid="aef36a0345de4d4ba834c68026792663">tempest-NoVNCConsoleTestJSON-55272866-project-member</nova:user>
Jan 22 00:04:04 compute-0 nova_compute[182935]:         <nova:project uuid="fd015b8945f149928b5915953637c812">tempest-NoVNCConsoleTestJSON-55272866</nova:project>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:04:04 compute-0 nova_compute[182935]:         <nova:port uuid="2e340635-02cb-45e6-ac92-5fe5fe003dec">
Jan 22 00:04:04 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <system>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <entry name="serial">d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d</entry>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <entry name="uuid">d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d</entry>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     </system>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   <os>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   </os>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   <features>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   </features>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk.config"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:b8:1e:10"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <target dev="tap2e340635-02"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/console.log" append="off"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <video>
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     </video>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:04:04 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:04:04 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:04:04 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:04:04 compute-0 nova_compute[182935]: </domain>
Jan 22 00:04:04 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.532 182939 DEBUG nova.compute.manager [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Preparing to wait for external event network-vif-plugged-2e340635-02cb-45e6-ac92-5fe5fe003dec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.532 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Acquiring lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.533 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.533 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.534 182939 DEBUG nova.virt.libvirt.vif [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:03:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-1279263803',display_name='tempest-NoVNCConsoleTestJSON-server-1279263803',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-1279263803',id=89,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd015b8945f149928b5915953637c812',ramdisk_id='',reservation_id='r-9i3br92f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-55272866',owner_user_name='tempest-NoVNCConsoleTestJSON-55272866-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:03:57Z,user_data=None,user_id='aef36a0345de4d4ba834c68026792663',uuid=d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "address": "fa:16:3e:b8:1e:10", "network": {"id": "f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-2018629590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd015b8945f149928b5915953637c812", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e340635-02", "ovs_interfaceid": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.534 182939 DEBUG nova.network.os_vif_util [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Converting VIF {"id": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "address": "fa:16:3e:b8:1e:10", "network": {"id": "f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-2018629590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd015b8945f149928b5915953637c812", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e340635-02", "ovs_interfaceid": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.534 182939 DEBUG nova.network.os_vif_util [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=2e340635-02cb-45e6-ac92-5fe5fe003dec,network=Network(f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e340635-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.535 182939 DEBUG os_vif [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=2e340635-02cb-45e6-ac92-5fe5fe003dec,network=Network(f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e340635-02') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.536 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.536 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.539 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.540 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e340635-02, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.540 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2e340635-02, col_values=(('external_ids', {'iface-id': '2e340635-02cb-45e6-ac92-5fe5fe003dec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:1e:10', 'vm-uuid': 'd0c6ae12-84cc-4e3e-8dfc-a10e2f15143d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.541 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:04 compute-0 NetworkManager[55139]: <info>  [1769040244.5426] manager: (tap2e340635-02): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.544 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.548 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.549 182939 INFO os_vif [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=2e340635-02cb-45e6-ac92-5fe5fe003dec,network=Network(f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e340635-02')
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.610 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.611 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.611 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] No VIF found with MAC fa:16:3e:b8:1e:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:04:04 compute-0 nova_compute[182935]: 2026-01-22 00:04:04.612 182939 INFO nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Using config drive
Jan 22 00:04:05 compute-0 nova_compute[182935]: 2026-01-22 00:04:05.430 182939 INFO nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Creating config drive at /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk.config
Jan 22 00:04:05 compute-0 nova_compute[182935]: 2026-01-22 00:04:05.435 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps6xaxgi4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:05 compute-0 nova_compute[182935]: 2026-01-22 00:04:05.564 182939 DEBUG oslo_concurrency.processutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps6xaxgi4" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:05 compute-0 kernel: tap2e340635-02: entered promiscuous mode
Jan 22 00:04:05 compute-0 NetworkManager[55139]: <info>  [1769040245.6322] manager: (tap2e340635-02): new Tun device (/org/freedesktop/NetworkManager/Devices/154)
Jan 22 00:04:05 compute-0 systemd-udevd[225539]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:04:05 compute-0 ovn_controller[95047]: 2026-01-22T00:04:05Z|00343|binding|INFO|Claiming lport 2e340635-02cb-45e6-ac92-5fe5fe003dec for this chassis.
Jan 22 00:04:05 compute-0 ovn_controller[95047]: 2026-01-22T00:04:05Z|00344|binding|INFO|2e340635-02cb-45e6-ac92-5fe5fe003dec: Claiming fa:16:3e:b8:1e:10 10.100.0.10
Jan 22 00:04:05 compute-0 nova_compute[182935]: 2026-01-22 00:04:05.805 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:05 compute-0 nova_compute[182935]: 2026-01-22 00:04:05.816 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.838 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:1e:10 10.100.0.10'], port_security=['fa:16:3e:b8:1e:10 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd0c6ae12-84cc-4e3e-8dfc-a10e2f15143d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd015b8945f149928b5915953637c812', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bded0d92-ff50-4771-a532-f277eae957fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ca4e4d1-e168-45ec-b34c-1f4d14412611, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=2e340635-02cb-45e6-ac92-5fe5fe003dec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.839 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 2e340635-02cb-45e6-ac92-5fe5fe003dec in datapath f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2 bound to our chassis
Jan 22 00:04:05 compute-0 NetworkManager[55139]: <info>  [1769040245.8404] device (tap2e340635-02): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:04:05 compute-0 NetworkManager[55139]: <info>  [1769040245.8410] device (tap2e340635-02): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.842 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.857 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7d797fa5-c0df-45bc-bbe2-ed6e25758f3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.858 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf3c701f5-41 in ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:04:05 compute-0 systemd-machined[154182]: New machine qemu-45-instance-00000059.
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.862 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf3c701f5-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.862 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc6bd02-a587-4e45-b59c-120a6850c51d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.863 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9c481461-4862-491e-a58c-517d564c5751]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.875 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4d1958-5689-4175-b3db-10634f04a3b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:05 compute-0 ovn_controller[95047]: 2026-01-22T00:04:05Z|00345|binding|INFO|Setting lport 2e340635-02cb-45e6-ac92-5fe5fe003dec ovn-installed in OVS
Jan 22 00:04:05 compute-0 ovn_controller[95047]: 2026-01-22T00:04:05Z|00346|binding|INFO|Setting lport 2e340635-02cb-45e6-ac92-5fe5fe003dec up in Southbound
Jan 22 00:04:05 compute-0 nova_compute[182935]: 2026-01-22 00:04:05.885 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:05 compute-0 systemd[1]: Started Virtual Machine qemu-45-instance-00000059.
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.901 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b656b2a6-190e-4d66-a77e-a6382f59f13f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.936 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7859cc00-667e-4547-8150-3efb33bc5517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.942 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[10697457-8e96-46ff-95b1-e6ca70aa63d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:05 compute-0 NetworkManager[55139]: <info>  [1769040245.9431] manager: (tapf3c701f5-40): new Veth device (/org/freedesktop/NetworkManager/Devices/155)
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.976 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[e0639bf7-f409-4362-b343-31fbf02735b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:05.980 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[cdde867a-c460-465b-bdb5-77080619d1af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:06 compute-0 NetworkManager[55139]: <info>  [1769040246.0035] device (tapf3c701f5-40): carrier: link connected
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.011 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[4c40ca7c-55b5-4bcb-ae47-131eb75c8073]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.030 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d990982a-ed2f-41a3-ae96-44099b338f26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf3c701f5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:88:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474874, 'reachable_time': 39302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225575, 'error': None, 'target': 'ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.048 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c1865f-a266-4710-8cd5-4d3c1c406963]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9f:8802'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474874, 'tstamp': 474874}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225576, 'error': None, 'target': 'ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.066 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5f7af9-534c-4789-8053-7e36d94fe042]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf3c701f5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:88:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 106], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474874, 'reachable_time': 39302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225577, 'error': None, 'target': 'ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.101 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[337009b0-2b36-4291-bbe1-a7b8b0c0e906]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.165 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4db876d6-fac1-49be-aa3f-a7ba68bdf034]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.167 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3c701f5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.167 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.167 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3c701f5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:06 compute-0 nova_compute[182935]: 2026-01-22 00:04:06.169 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:06 compute-0 NetworkManager[55139]: <info>  [1769040246.1700] manager: (tapf3c701f5-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Jan 22 00:04:06 compute-0 kernel: tapf3c701f5-40: entered promiscuous mode
Jan 22 00:04:06 compute-0 nova_compute[182935]: 2026-01-22 00:04:06.171 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.174 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf3c701f5-40, col_values=(('external_ids', {'iface-id': '16697c30-9e95-4380-bcfd-e0e009b3c900'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:06 compute-0 nova_compute[182935]: 2026-01-22 00:04:06.175 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:06 compute-0 ovn_controller[95047]: 2026-01-22T00:04:06Z|00347|binding|INFO|Releasing lport 16697c30-9e95-4380-bcfd-e0e009b3c900 from this chassis (sb_readonly=0)
Jan 22 00:04:06 compute-0 nova_compute[182935]: 2026-01-22 00:04:06.175 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.177 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.178 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6f554709-bde6-4c7b-96a6-4d673c80b03c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.179 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2.pid.haproxy
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:04:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:06.182 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2', 'env', 'PROCESS_TAG=haproxy-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:04:06 compute-0 nova_compute[182935]: 2026-01-22 00:04:06.187 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:06 compute-0 nova_compute[182935]: 2026-01-22 00:04:06.321 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040246.320769, d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:06 compute-0 nova_compute[182935]: 2026-01-22 00:04:06.322 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] VM Started (Lifecycle Event)
Jan 22 00:04:06 compute-0 podman[225616]: 2026-01-22 00:04:06.546176667 +0000 UTC m=+0.048066925 container create 7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 00:04:06 compute-0 nova_compute[182935]: 2026-01-22 00:04:06.563 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:06 compute-0 systemd[1]: Started libpod-conmon-7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead.scope.
Jan 22 00:04:06 compute-0 podman[225616]: 2026-01-22 00:04:06.523109889 +0000 UTC m=+0.025000177 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:04:06 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:04:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e30aff57b641320633fcaae0b3ef3641d0dcc45d6d1abbc756a55e9ff46d73a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:04:06 compute-0 podman[225616]: 2026-01-22 00:04:06.651139461 +0000 UTC m=+0.153029729 container init 7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:04:06 compute-0 podman[225616]: 2026-01-22 00:04:06.657225161 +0000 UTC m=+0.159115429 container start 7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 00:04:06 compute-0 neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2[225632]: [NOTICE]   (225636) : New worker (225638) forked
Jan 22 00:04:06 compute-0 neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2[225632]: [NOTICE]   (225636) : Loading success.
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.391 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.396 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040246.3210762, d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.396 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] VM Paused (Lifecycle Event)
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.427 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.430 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.510 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.814 182939 DEBUG nova.compute.manager [req-b853899b-b920-48fa-b7e4-48e08c8f0dd1 req-3b18d9a7-7809-47a9-8aa0-bb8accadb25b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Received event network-vif-plugged-2e340635-02cb-45e6-ac92-5fe5fe003dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.815 182939 DEBUG oslo_concurrency.lockutils [req-b853899b-b920-48fa-b7e4-48e08c8f0dd1 req-3b18d9a7-7809-47a9-8aa0-bb8accadb25b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.815 182939 DEBUG oslo_concurrency.lockutils [req-b853899b-b920-48fa-b7e4-48e08c8f0dd1 req-3b18d9a7-7809-47a9-8aa0-bb8accadb25b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.816 182939 DEBUG oslo_concurrency.lockutils [req-b853899b-b920-48fa-b7e4-48e08c8f0dd1 req-3b18d9a7-7809-47a9-8aa0-bb8accadb25b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.816 182939 DEBUG nova.compute.manager [req-b853899b-b920-48fa-b7e4-48e08c8f0dd1 req-3b18d9a7-7809-47a9-8aa0-bb8accadb25b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Processing event network-vif-plugged-2e340635-02cb-45e6-ac92-5fe5fe003dec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.816 182939 DEBUG nova.compute.manager [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.820 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040247.819854, d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.820 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] VM Resumed (Lifecycle Event)
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.822 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.826 182939 INFO nova.virt.libvirt.driver [-] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Instance spawned successfully.
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.826 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.874 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.875 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.875 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.876 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.876 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.876 182939 DEBUG nova.virt.libvirt.driver [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.889 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.892 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:04:07 compute-0 nova_compute[182935]: 2026-01-22 00:04:07.955 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:04:08 compute-0 nova_compute[182935]: 2026-01-22 00:04:08.042 182939 INFO nova.compute.manager [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Took 10.05 seconds to spawn the instance on the hypervisor.
Jan 22 00:04:08 compute-0 nova_compute[182935]: 2026-01-22 00:04:08.042 182939 DEBUG nova.compute.manager [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:08 compute-0 nova_compute[182935]: 2026-01-22 00:04:08.121 182939 DEBUG nova.network.neutron [req-c9c0fc63-b815-4efd-a775-e1456983f326 req-9ede2760-8425-4148-8263-be5f6826c535 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Updated VIF entry in instance network info cache for port 2e340635-02cb-45e6-ac92-5fe5fe003dec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:04:08 compute-0 nova_compute[182935]: 2026-01-22 00:04:08.122 182939 DEBUG nova.network.neutron [req-c9c0fc63-b815-4efd-a775-e1456983f326 req-9ede2760-8425-4148-8263-be5f6826c535 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Updating instance_info_cache with network_info: [{"id": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "address": "fa:16:3e:b8:1e:10", "network": {"id": "f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-2018629590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd015b8945f149928b5915953637c812", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e340635-02", "ovs_interfaceid": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:04:08 compute-0 nova_compute[182935]: 2026-01-22 00:04:08.155 182939 DEBUG oslo_concurrency.lockutils [req-c9c0fc63-b815-4efd-a775-e1456983f326 req-9ede2760-8425-4148-8263-be5f6826c535 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:04:08 compute-0 nova_compute[182935]: 2026-01-22 00:04:08.197 182939 INFO nova.compute.manager [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Took 11.52 seconds to build instance.
Jan 22 00:04:08 compute-0 nova_compute[182935]: 2026-01-22 00:04:08.242 182939 DEBUG oslo_concurrency.lockutils [None req-5b71ad74-a06d-47cb-a281-e2697bcecf54 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.788s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:09 compute-0 nova_compute[182935]: 2026-01-22 00:04:09.637 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:10 compute-0 nova_compute[182935]: 2026-01-22 00:04:10.989 182939 DEBUG nova.compute.manager [req-5591e366-7807-467b-ae16-a77de5b66fc1 req-c49aa4bc-f1c1-4574-8b52-6985b6efe308 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Received event network-vif-plugged-2e340635-02cb-45e6-ac92-5fe5fe003dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:10 compute-0 nova_compute[182935]: 2026-01-22 00:04:10.990 182939 DEBUG oslo_concurrency.lockutils [req-5591e366-7807-467b-ae16-a77de5b66fc1 req-c49aa4bc-f1c1-4574-8b52-6985b6efe308 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:10 compute-0 nova_compute[182935]: 2026-01-22 00:04:10.990 182939 DEBUG oslo_concurrency.lockutils [req-5591e366-7807-467b-ae16-a77de5b66fc1 req-c49aa4bc-f1c1-4574-8b52-6985b6efe308 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:10 compute-0 nova_compute[182935]: 2026-01-22 00:04:10.990 182939 DEBUG oslo_concurrency.lockutils [req-5591e366-7807-467b-ae16-a77de5b66fc1 req-c49aa4bc-f1c1-4574-8b52-6985b6efe308 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:10 compute-0 nova_compute[182935]: 2026-01-22 00:04:10.990 182939 DEBUG nova.compute.manager [req-5591e366-7807-467b-ae16-a77de5b66fc1 req-c49aa4bc-f1c1-4574-8b52-6985b6efe308 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] No waiting events found dispatching network-vif-plugged-2e340635-02cb-45e6-ac92-5fe5fe003dec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:04:10 compute-0 nova_compute[182935]: 2026-01-22 00:04:10.991 182939 WARNING nova.compute.manager [req-5591e366-7807-467b-ae16-a77de5b66fc1 req-c49aa4bc-f1c1-4574-8b52-6985b6efe308 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Received unexpected event network-vif-plugged-2e340635-02cb-45e6-ac92-5fe5fe003dec for instance with vm_state active and task_state None.
Jan 22 00:04:11 compute-0 nova_compute[182935]: 2026-01-22 00:04:11.067 182939 DEBUG nova.compute.manager [None req-582cfe51-4930-4537-b736-6410e565242f aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196
Jan 22 00:04:11 compute-0 nova_compute[182935]: 2026-01-22 00:04:11.565 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:11 compute-0 nova_compute[182935]: 2026-01-22 00:04:11.976 182939 DEBUG nova.compute.manager [None req-52f3e8da-76b5-4766-8b24-6634e60b3b50 aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.529 182939 DEBUG oslo_concurrency.lockutils [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Acquiring lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.530 182939 DEBUG oslo_concurrency.lockutils [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.530 182939 DEBUG oslo_concurrency.lockutils [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Acquiring lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.530 182939 DEBUG oslo_concurrency.lockutils [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.531 182939 DEBUG oslo_concurrency.lockutils [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.542 182939 INFO nova.compute.manager [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Terminating instance
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.553 182939 DEBUG nova.compute.manager [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:04:12 compute-0 kernel: tap2e340635-02 (unregistering): left promiscuous mode
Jan 22 00:04:12 compute-0 NetworkManager[55139]: <info>  [1769040252.5727] device (tap2e340635-02): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:04:12 compute-0 ovn_controller[95047]: 2026-01-22T00:04:12Z|00348|binding|INFO|Releasing lport 2e340635-02cb-45e6-ac92-5fe5fe003dec from this chassis (sb_readonly=0)
Jan 22 00:04:12 compute-0 ovn_controller[95047]: 2026-01-22T00:04:12Z|00349|binding|INFO|Setting lport 2e340635-02cb-45e6-ac92-5fe5fe003dec down in Southbound
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.581 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:12 compute-0 ovn_controller[95047]: 2026-01-22T00:04:12Z|00350|binding|INFO|Removing iface tap2e340635-02 ovn-installed in OVS
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.594 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:1e:10 10.100.0.10'], port_security=['fa:16:3e:b8:1e:10 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd0c6ae12-84cc-4e3e-8dfc-a10e2f15143d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fd015b8945f149928b5915953637c812', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'bded0d92-ff50-4771-a532-f277eae957fa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ca4e4d1-e168-45ec-b34c-1f4d14412611, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=2e340635-02cb-45e6-ac92-5fe5fe003dec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.596 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 2e340635-02cb-45e6-ac92-5fe5fe003dec in datapath f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2 unbound from our chassis
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.597 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.598 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.599 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e652e4cd-fbbf-41bb-85ba-6da6fd2824c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.599 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2 namespace which is not needed anymore
Jan 22 00:04:12 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000059.scope: Deactivated successfully.
Jan 22 00:04:12 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000059.scope: Consumed 5.133s CPU time.
Jan 22 00:04:12 compute-0 systemd-machined[154182]: Machine qemu-45-instance-00000059 terminated.
Jan 22 00:04:12 compute-0 neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2[225632]: [NOTICE]   (225636) : haproxy version is 2.8.14-c23fe91
Jan 22 00:04:12 compute-0 neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2[225632]: [NOTICE]   (225636) : path to executable is /usr/sbin/haproxy
Jan 22 00:04:12 compute-0 neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2[225632]: [WARNING]  (225636) : Exiting Master process...
Jan 22 00:04:12 compute-0 neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2[225632]: [ALERT]    (225636) : Current worker (225638) exited with code 143 (Terminated)
Jan 22 00:04:12 compute-0 neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2[225632]: [WARNING]  (225636) : All workers exited. Exiting... (0)
Jan 22 00:04:12 compute-0 systemd[1]: libpod-7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead.scope: Deactivated successfully.
Jan 22 00:04:12 compute-0 podman[225672]: 2026-01-22 00:04:12.73783234 +0000 UTC m=+0.048910266 container died 7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 00:04:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead-userdata-shm.mount: Deactivated successfully.
Jan 22 00:04:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-e30aff57b641320633fcaae0b3ef3641d0dcc45d6d1abbc756a55e9ff46d73a6-merged.mount: Deactivated successfully.
Jan 22 00:04:12 compute-0 podman[225672]: 2026-01-22 00:04:12.770514395 +0000 UTC m=+0.081592401 container cleanup 7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.775 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.779 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:12 compute-0 systemd[1]: libpod-conmon-7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead.scope: Deactivated successfully.
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.815 182939 INFO nova.virt.libvirt.driver [-] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Instance destroyed successfully.
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.816 182939 DEBUG nova.objects.instance [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lazy-loading 'resources' on Instance uuid d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.836 182939 DEBUG nova.virt.libvirt.vif [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:03:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-1279263803',display_name='tempest-NoVNCConsoleTestJSON-server-1279263803',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-1279263803',id=89,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:04:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fd015b8945f149928b5915953637c812',ramdisk_id='',reservation_id='r-9i3br92f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-NoVNCConsoleTestJSON-55272866',owner_user_name='tempest-NoVNCConsoleTestJSON-55272866-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:04:08Z,user_data=None,user_id='aef36a0345de4d4ba834c68026792663',uuid=d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "address": "fa:16:3e:b8:1e:10", "network": {"id": "f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-2018629590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd015b8945f149928b5915953637c812", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e340635-02", "ovs_interfaceid": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.836 182939 DEBUG nova.network.os_vif_util [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Converting VIF {"id": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "address": "fa:16:3e:b8:1e:10", "network": {"id": "f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-2018629590-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fd015b8945f149928b5915953637c812", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e340635-02", "ovs_interfaceid": "2e340635-02cb-45e6-ac92-5fe5fe003dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.837 182939 DEBUG nova.network.os_vif_util [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=2e340635-02cb-45e6-ac92-5fe5fe003dec,network=Network(f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e340635-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.837 182939 DEBUG os_vif [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=2e340635-02cb-45e6-ac92-5fe5fe003dec,network=Network(f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e340635-02') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:04:12 compute-0 podman[225704]: 2026-01-22 00:04:12.839110104 +0000 UTC m=+0.045344868 container remove 7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.839 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.839 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e340635-02, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.841 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.842 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.845 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[99af5337-72b7-48e8-9c6b-aac26d5c140d]: (4, ('Thu Jan 22 12:04:12 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2 (7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead)\n7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead\nThu Jan 22 12:04:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2 (7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead)\n7f97bb0942452f88cbdc0480285f964aced6576e4a58b0de2d30120ba8095ead\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.847 182939 INFO os_vif [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:1e:10,bridge_name='br-int',has_traffic_filtering=True,id=2e340635-02cb-45e6-ac92-5fe5fe003dec,network=Network(f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e340635-02')
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.847 182939 INFO nova.virt.libvirt.driver [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Deleting instance files /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d_del
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.848 182939 INFO nova.virt.libvirt.driver [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Deletion of /var/lib/nova/instances/d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d_del complete
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.847 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[41f5fe43-ba5e-4271-bfd2-6846e5fe09c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.847 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3c701f5-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:12 compute-0 kernel: tapf3c701f5-40: left promiscuous mode
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.850 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:12 compute-0 nova_compute[182935]: 2026-01-22 00:04:12.860 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.863 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[eceee574-82c0-47b5-953c-3e35d63c7362]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.880 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[09b6a73d-8e51-4954-a391-41f339ff7dd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.881 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fe07c869-7550-4287-ab8d-29bf9e5ca5ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.899 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[17a542f1-a91c-46a6-beff-73f63e6d4e40]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474866, 'reachable_time': 43250, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225731, 'error': None, 'target': 'ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:12 compute-0 systemd[1]: run-netns-ovnmeta\x2df3c701f5\x2d4a37\x2d4da8\x2d8e3d\x2d6d96ed2f66e2.mount: Deactivated successfully.
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.902 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f3c701f5-4a37-4da8-8e3d-6d96ed2f66e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:04:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:12.902 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[01831f9d-24c8-42bd-916e-bc55ab4ca3e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:13 compute-0 nova_compute[182935]: 2026-01-22 00:04:13.026 182939 INFO nova.compute.manager [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 22 00:04:13 compute-0 nova_compute[182935]: 2026-01-22 00:04:13.027 182939 DEBUG oslo.service.loopingcall [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:04:13 compute-0 nova_compute[182935]: 2026-01-22 00:04:13.027 182939 DEBUG nova.compute.manager [-] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:04:13 compute-0 nova_compute[182935]: 2026-01-22 00:04:13.027 182939 DEBUG nova.network.neutron [-] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:04:13 compute-0 nova_compute[182935]: 2026-01-22 00:04:13.105 182939 DEBUG nova.compute.manager [req-cd0bb849-3126-4597-8ead-d822bdfc912c req-ed292ac0-467d-4104-b6f9-7395051d8099 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Received event network-vif-unplugged-2e340635-02cb-45e6-ac92-5fe5fe003dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:13 compute-0 nova_compute[182935]: 2026-01-22 00:04:13.105 182939 DEBUG oslo_concurrency.lockutils [req-cd0bb849-3126-4597-8ead-d822bdfc912c req-ed292ac0-467d-4104-b6f9-7395051d8099 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:13 compute-0 nova_compute[182935]: 2026-01-22 00:04:13.106 182939 DEBUG oslo_concurrency.lockutils [req-cd0bb849-3126-4597-8ead-d822bdfc912c req-ed292ac0-467d-4104-b6f9-7395051d8099 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:13 compute-0 nova_compute[182935]: 2026-01-22 00:04:13.106 182939 DEBUG oslo_concurrency.lockutils [req-cd0bb849-3126-4597-8ead-d822bdfc912c req-ed292ac0-467d-4104-b6f9-7395051d8099 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:13 compute-0 nova_compute[182935]: 2026-01-22 00:04:13.106 182939 DEBUG nova.compute.manager [req-cd0bb849-3126-4597-8ead-d822bdfc912c req-ed292ac0-467d-4104-b6f9-7395051d8099 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] No waiting events found dispatching network-vif-unplugged-2e340635-02cb-45e6-ac92-5fe5fe003dec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:04:13 compute-0 nova_compute[182935]: 2026-01-22 00:04:13.106 182939 DEBUG nova.compute.manager [req-cd0bb849-3126-4597-8ead-d822bdfc912c req-ed292ac0-467d-4104-b6f9-7395051d8099 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Received event network-vif-unplugged-2e340635-02cb-45e6-ac92-5fe5fe003dec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.040 182939 DEBUG nova.network.neutron [-] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.078 182939 INFO nova.compute.manager [-] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Took 2.05 seconds to deallocate network for instance.
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.204 182939 DEBUG oslo_concurrency.lockutils [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.204 182939 DEBUG oslo_concurrency.lockutils [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.294 182939 DEBUG nova.compute.provider_tree [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.325 182939 DEBUG nova.scheduler.client.report [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.358 182939 DEBUG nova.compute.manager [req-47fccc45-4db1-44d0-bc3a-c30594788123 req-08cd4704-e819-4e7d-b3a5-1686ceee39c2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Received event network-vif-plugged-2e340635-02cb-45e6-ac92-5fe5fe003dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.359 182939 DEBUG oslo_concurrency.lockutils [req-47fccc45-4db1-44d0-bc3a-c30594788123 req-08cd4704-e819-4e7d-b3a5-1686ceee39c2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.359 182939 DEBUG oslo_concurrency.lockutils [req-47fccc45-4db1-44d0-bc3a-c30594788123 req-08cd4704-e819-4e7d-b3a5-1686ceee39c2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.359 182939 DEBUG oslo_concurrency.lockutils [req-47fccc45-4db1-44d0-bc3a-c30594788123 req-08cd4704-e819-4e7d-b3a5-1686ceee39c2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.360 182939 DEBUG nova.compute.manager [req-47fccc45-4db1-44d0-bc3a-c30594788123 req-08cd4704-e819-4e7d-b3a5-1686ceee39c2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] No waiting events found dispatching network-vif-plugged-2e340635-02cb-45e6-ac92-5fe5fe003dec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.360 182939 WARNING nova.compute.manager [req-47fccc45-4db1-44d0-bc3a-c30594788123 req-08cd4704-e819-4e7d-b3a5-1686ceee39c2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Received unexpected event network-vif-plugged-2e340635-02cb-45e6-ac92-5fe5fe003dec for instance with vm_state deleted and task_state None.
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.362 182939 DEBUG oslo_concurrency.lockutils [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.417 182939 INFO nova.scheduler.client.report [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Deleted allocations for instance d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.459 182939 DEBUG nova.compute.manager [req-afab9e4e-a4a5-4118-a170-d018f9d97165 req-94888d0c-3cae-4b1e-8cba-a8d3d80e2422 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Received event network-vif-deleted-2e340635-02cb-45e6-ac92-5fe5fe003dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:15 compute-0 nova_compute[182935]: 2026-01-22 00:04:15.520 182939 DEBUG oslo_concurrency.lockutils [None req-9908eb27-a716-479e-b4ce-1e69bf6de13b aef36a0345de4d4ba834c68026792663 fd015b8945f149928b5915953637c812 - - default default] Lock "d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.991s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:15 compute-0 podman[225733]: 2026-01-22 00:04:15.744732338 +0000 UTC m=+0.109087478 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:04:15 compute-0 podman[225732]: 2026-01-22 00:04:15.760668899 +0000 UTC m=+0.129735854 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:04:16 compute-0 nova_compute[182935]: 2026-01-22 00:04:16.567 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.491 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.492 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.492 182939 INFO nova.compute.manager [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Unshelving
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.675 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.675 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.680 182939 DEBUG nova.objects.instance [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'pci_requests' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.698 182939 DEBUG nova.objects.instance [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.724 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.725 182939 INFO nova.compute.claims [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.842 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.955 182939 DEBUG nova.compute.provider_tree [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:04:17 compute-0 nova_compute[182935]: 2026-01-22 00:04:17.974 182939 DEBUG nova.scheduler.client.report [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:04:18 compute-0 nova_compute[182935]: 2026-01-22 00:04:18.007 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:18 compute-0 nova_compute[182935]: 2026-01-22 00:04:18.330 182939 INFO nova.network.neutron [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updating port 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 22 00:04:20 compute-0 sshd-session[225781]: Invalid user svn from 188.166.69.60 port 34020
Jan 22 00:04:20 compute-0 sshd-session[225781]: Connection closed by invalid user svn 188.166.69.60 port 34020 [preauth]
Jan 22 00:04:21 compute-0 nova_compute[182935]: 2026-01-22 00:04:21.570 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:21 compute-0 nova_compute[182935]: 2026-01-22 00:04:21.883 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:04:21 compute-0 nova_compute[182935]: 2026-01-22 00:04:21.884 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquired lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:04:21 compute-0 nova_compute[182935]: 2026-01-22 00:04:21.884 182939 DEBUG nova.network.neutron [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:04:22 compute-0 nova_compute[182935]: 2026-01-22 00:04:22.843 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:04:23.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:04:23 compute-0 podman[225783]: 2026-01-22 00:04:23.679896161 +0000 UTC m=+0.051656223 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:04:24 compute-0 nova_compute[182935]: 2026-01-22 00:04:24.094 182939 DEBUG nova.compute.manager [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-changed-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:24 compute-0 nova_compute[182935]: 2026-01-22 00:04:24.094 182939 DEBUG nova.compute.manager [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Refreshing instance network info cache due to event network-changed-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:04:24 compute-0 nova_compute[182935]: 2026-01-22 00:04:24.094 182939 DEBUG oslo_concurrency.lockutils [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:04:24 compute-0 nova_compute[182935]: 2026-01-22 00:04:24.986 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.204 182939 DEBUG nova.network.neutron [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updating instance_info_cache with network_info: [{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.225 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Releasing lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.227 182939 DEBUG nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.227 182939 INFO nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Creating image(s)
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.228 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.228 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.229 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.229 182939 DEBUG nova.objects.instance [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.230 182939 DEBUG oslo_concurrency.lockutils [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.230 182939 DEBUG nova.network.neutron [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Refreshing network info cache for port 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.264 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "f293700577693f64ada9e231fdffdcd8806f6455" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.265 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "f293700577693f64ada9e231fdffdcd8806f6455" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:26 compute-0 nova_compute[182935]: 2026-01-22 00:04:26.571 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:27 compute-0 nova_compute[182935]: 2026-01-22 00:04:27.813 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040252.8118556, d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:27 compute-0 nova_compute[182935]: 2026-01-22 00:04:27.813 182939 INFO nova.compute.manager [-] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] VM Stopped (Lifecycle Event)
Jan 22 00:04:27 compute-0 nova_compute[182935]: 2026-01-22 00:04:27.846 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:27 compute-0 nova_compute[182935]: 2026-01-22 00:04:27.864 182939 DEBUG nova.compute.manager [None req-ec49da72-0845-43fa-af1f-261269b2f24d - - - - - -] [instance: d0c6ae12-84cc-4e3e-8dfc-a10e2f15143d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:28 compute-0 podman[225807]: 2026-01-22 00:04:28.703850631 +0000 UTC m=+0.081854556 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.034 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.104 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455.part --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.105 182939 DEBUG nova.virt.images [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] bb239f4f-83bb-4009-8354-2ec967a0da2a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.106 182939 DEBUG nova.privsep.utils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.107 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455.part /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.445 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455.part /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455.converted" returned: 0 in 0.338s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.468 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.549 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455.converted --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.550 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "f293700577693f64ada9e231fdffdcd8806f6455" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.564 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.623 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.624 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "f293700577693f64ada9e231fdffdcd8806f6455" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.624 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "f293700577693f64ada9e231fdffdcd8806f6455" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.637 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.694 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.695 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455,backing_fmt=raw /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.730 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455,backing_fmt=raw /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.731 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "f293700577693f64ada9e231fdffdcd8806f6455" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.731 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.804 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.805 182939 DEBUG nova.objects.instance [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'migration_context' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.876 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.928 182939 INFO nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Rebasing disk image.
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.929 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.987 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:29 compute-0 nova_compute[182935]: 2026-01-22 00:04:29.988 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 -F raw /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:30 compute-0 nova_compute[182935]: 2026-01-22 00:04:30.083 182939 DEBUG nova.network.neutron [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updated VIF entry in instance network info cache for port 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:04:30 compute-0 nova_compute[182935]: 2026-01-22 00:04:30.084 182939 DEBUG nova.network.neutron [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updating instance_info_cache with network_info: [{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:04:30 compute-0 nova_compute[182935]: 2026-01-22 00:04:30.109 182939 DEBUG oslo_concurrency.lockutils [req-03419a74-87d2-40de-9bd1-e9cc35c9f57d req-6d73a311-db08-426a-88df-5b0a91b71df8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:04:30 compute-0 nova_compute[182935]: 2026-01-22 00:04:30.110 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:04:30 compute-0 nova_compute[182935]: 2026-01-22 00:04:30.111 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:04:30 compute-0 nova_compute[182935]: 2026-01-22 00:04:30.111 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.501 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 -F raw /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk" returned: 0 in 1.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.502 182939 DEBUG nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.503 182939 DEBUG nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Ensure instance console log exists: /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.503 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.503 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.504 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.506 182939 DEBUG nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Start _get_guest_xml network_info=[{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='bb14ee8534853ecb1d50d91b73a5de8b',container_format='bare',created_at=2026-01-22T00:03:44Z,direct_url=<?>,disk_format='qcow2',id=bb239f4f-83bb-4009-8354-2ec967a0da2a,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1381246704-shelved',owner='a7e425a4d1854533a17d5f0dcd9d87b9',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2026-01-22T00:04:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.511 182939 WARNING nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.518 182939 DEBUG nova.virt.libvirt.host [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.519 182939 DEBUG nova.virt.libvirt.host [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.522 182939 DEBUG nova.virt.libvirt.host [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.523 182939 DEBUG nova.virt.libvirt.host [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.524 182939 DEBUG nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.524 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='bb14ee8534853ecb1d50d91b73a5de8b',container_format='bare',created_at=2026-01-22T00:03:44Z,direct_url=<?>,disk_format='qcow2',id=bb239f4f-83bb-4009-8354-2ec967a0da2a,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1381246704-shelved',owner='a7e425a4d1854533a17d5f0dcd9d87b9',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2026-01-22T00:04:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.525 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.525 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.525 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.525 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.526 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.526 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.526 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.526 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.526 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.527 182939 DEBUG nova.virt.hardware [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.527 182939 DEBUG nova.objects.instance [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.577 182939 DEBUG nova.virt.libvirt.vif [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1381246704',display_name='tempest-ServersNegativeTestJSON-server-1381246704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1381246704',id=83,image_ref='bb239f4f-83bb-4009-8354-2ec967a0da2a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:01:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='a7e425a4d1854533a17d5f0dcd9d87b9',ramdisk_id='',reservation_id='r-37fe06tc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virt
io',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1689661',owner_user_name='tempest-ServersNegativeTestJSON-1689661-project-member',shelved_at='2026-01-22T00:04:02.371491',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='bb239f4f-83bb-4009-8354-2ec967a0da2a'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:04:17Z,user_data=None,user_id='531ec5a088a94b78af6e2c3feda17c0c',uuid=2cb6e3d6-f22a-49ea-aab8-900dd88605e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.578 182939 DEBUG nova.network.os_vif_util [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converting VIF {"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.578 182939 DEBUG nova.network.os_vif_util [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.579 182939 DEBUG nova.objects.instance [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.580 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.595 182939 DEBUG nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:04:31 compute-0 nova_compute[182935]:   <uuid>2cb6e3d6-f22a-49ea-aab8-900dd88605e9</uuid>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   <name>instance-00000053</name>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersNegativeTestJSON-server-1381246704</nova:name>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:04:31</nova:creationTime>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:04:31 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:04:31 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:04:31 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:04:31 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:04:31 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:04:31 compute-0 nova_compute[182935]:         <nova:user uuid="531ec5a088a94b78af6e2c3feda17c0c">tempest-ServersNegativeTestJSON-1689661-project-member</nova:user>
Jan 22 00:04:31 compute-0 nova_compute[182935]:         <nova:project uuid="a7e425a4d1854533a17d5f0dcd9d87b9">tempest-ServersNegativeTestJSON-1689661</nova:project>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="bb239f4f-83bb-4009-8354-2ec967a0da2a"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:04:31 compute-0 nova_compute[182935]:         <nova:port uuid="8412a083-ca97-4457-bb0e-9c7bcd8bfb2f">
Jan 22 00:04:31 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <system>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <entry name="serial">2cb6e3d6-f22a-49ea-aab8-900dd88605e9</entry>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <entry name="uuid">2cb6e3d6-f22a-49ea-aab8-900dd88605e9</entry>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     </system>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   <os>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   </os>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   <features>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   </features>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.config"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:e0:ee:91"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <target dev="tap8412a083-ca"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/console.log" append="off"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <video>
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     </video>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <input type="keyboard" bus="usb"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:04:31 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:04:31 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:04:31 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:04:31 compute-0 nova_compute[182935]: </domain>
Jan 22 00:04:31 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.597 182939 DEBUG nova.compute.manager [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Preparing to wait for external event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.597 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.597 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.597 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.598 182939 DEBUG nova.virt.libvirt.vif [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1381246704',display_name='tempest-ServersNegativeTestJSON-server-1381246704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1381246704',id=83,image_ref='bb239f4f-83bb-4009-8354-2ec967a0da2a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:01:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='a7e425a4d1854533a17d5f0dcd9d87b9',ramdisk_id='',reservation_id='r-37fe06tc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1689661',owner_user_name='tempest-ServersNegativeTestJSON-1689661-project-member',shelved_at='2026-01-22T00:04:02.371491',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='bb239f4f-83bb-4009-8354-2ec967a0da2a'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:04:17Z,user_data=None,user_id='531ec5a088a94b78af6e2c3feda17c0c',uuid=2cb6e3d6-f22a-49ea-aab8-900dd88605e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.598 182939 DEBUG nova.network.os_vif_util [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converting VIF {"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.599 182939 DEBUG nova.network.os_vif_util [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.599 182939 DEBUG os_vif [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.599 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.600 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.600 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.603 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.603 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8412a083-ca, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.603 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8412a083-ca, col_values=(('external_ids', {'iface-id': '8412a083-ca97-4457-bb0e-9c7bcd8bfb2f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:ee:91', 'vm-uuid': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.605 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:31 compute-0 NetworkManager[55139]: <info>  [1769040271.6060] manager: (tap8412a083-ca): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.607 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.611 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.612 182939 INFO os_vif [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca')
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.758 182939 DEBUG nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.758 182939 DEBUG nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.759 182939 DEBUG nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] No VIF found with MAC fa:16:3e:e0:ee:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.759 182939 INFO nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Using config drive
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.786 182939 DEBUG nova.objects.instance [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:31 compute-0 nova_compute[182935]: 2026-01-22 00:04:31.861 182939 DEBUG nova.objects.instance [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'keypairs' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:33 compute-0 nova_compute[182935]: 2026-01-22 00:04:33.180 182939 INFO nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Creating config drive at /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.config
Jan 22 00:04:33 compute-0 nova_compute[182935]: 2026-01-22 00:04:33.186 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2663vtpc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:33 compute-0 nova_compute[182935]: 2026-01-22 00:04:33.316 182939 DEBUG oslo_concurrency.processutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2663vtpc" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:33 compute-0 kernel: tap8412a083-ca: entered promiscuous mode
Jan 22 00:04:33 compute-0 NetworkManager[55139]: <info>  [1769040273.4007] manager: (tap8412a083-ca): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Jan 22 00:04:33 compute-0 ovn_controller[95047]: 2026-01-22T00:04:33Z|00351|binding|INFO|Claiming lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for this chassis.
Jan 22 00:04:33 compute-0 ovn_controller[95047]: 2026-01-22T00:04:33Z|00352|binding|INFO|8412a083-ca97-4457-bb0e-9c7bcd8bfb2f: Claiming fa:16:3e:e0:ee:91 10.100.0.3
Jan 22 00:04:33 compute-0 nova_compute[182935]: 2026-01-22 00:04:33.443 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:33 compute-0 nova_compute[182935]: 2026-01-22 00:04:33.446 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:33 compute-0 systemd-udevd[225877]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.468 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:ee:91 10.100.0.3'], port_security=['fa:16:3e:e0:ee:91 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'neutron:revision_number': '7', 'neutron:security_group_ids': '5fb84efc-d0d8-44ae-84e4-97e70d8c202e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10175545-8ba8-4bcf-9e15-f460a54818aa, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.469 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f in datapath 397ba44b-e27b-4a2a-a10b-7de0daa31656 bound to our chassis
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.470 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 397ba44b-e27b-4a2a-a10b-7de0daa31656
Jan 22 00:04:33 compute-0 NetworkManager[55139]: <info>  [1769040273.4788] device (tap8412a083-ca): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:04:33 compute-0 NetworkManager[55139]: <info>  [1769040273.4797] device (tap8412a083-ca): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:04:33 compute-0 systemd-machined[154182]: New machine qemu-46-instance-00000053.
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.485 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[192b9bce-1d0c-41e0-bf88-154a1ed2ebaf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.486 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap397ba44b-e1 in ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.488 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap397ba44b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.489 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0baa3a4e-aaf1-4810-921d-d361a188cc29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.489 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4b3955-c741-43cc-821f-3177b6c869e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_controller[95047]: 2026-01-22T00:04:33Z|00353|binding|INFO|Setting lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f ovn-installed in OVS
Jan 22 00:04:33 compute-0 ovn_controller[95047]: 2026-01-22T00:04:33Z|00354|binding|INFO|Setting lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f up in Southbound
Jan 22 00:04:33 compute-0 nova_compute[182935]: 2026-01-22 00:04:33.500 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.502 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[fb501151-db16-4f5b-a3d6-f0c4453f0477]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 systemd[1]: Started Virtual Machine qemu-46-instance-00000053.
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.519 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f1f903-b595-42b6-a664-4514a14168a7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.552 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[87dfab49-7b5b-46e4-a924-bdbcf9682685]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 NetworkManager[55139]: <info>  [1769040273.5606] manager: (tap397ba44b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/159)
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.560 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[df2b8d2f-d389-4e1a-8da7-e6020de199bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 systemd-udevd[225881]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.595 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[71bed1de-f6c7-45d6-b7ea-9c20124720cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.599 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[a2fbf5af-a05c-47b9-b328-36a81da5b6e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 NetworkManager[55139]: <info>  [1769040273.6252] device (tap397ba44b-e0): carrier: link connected
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.634 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[937ae49d-5773-4b56-8591-f275b06671d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.656 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[63589b6a-9eeb-40a5-965e-ea2a3ec8437f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap397ba44b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:12:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477636, 'reachable_time': 23467, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225911, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.680 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[75608490-6a87-4418-8926-df2cb0d2c51d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:12aa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477636, 'tstamp': 477636}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225912, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.700 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6004e725-349f-4658-89e8-7666dd2da7fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap397ba44b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:12:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477636, 'reachable_time': 23467, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225913, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.737 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[df8c06b6-b541-4b69-8262-62175a2c836f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.817 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[82f40e07-06c7-4647-ba57-035b786976c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.819 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap397ba44b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.819 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.819 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap397ba44b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:33 compute-0 nova_compute[182935]: 2026-01-22 00:04:33.821 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:33 compute-0 NetworkManager[55139]: <info>  [1769040273.8223] manager: (tap397ba44b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Jan 22 00:04:33 compute-0 kernel: tap397ba44b-e0: entered promiscuous mode
Jan 22 00:04:33 compute-0 nova_compute[182935]: 2026-01-22 00:04:33.825 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.826 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap397ba44b-e0, col_values=(('external_ids', {'iface-id': 'f7f4d7e4-9841-41f2-85bd-658a3b613e0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:33 compute-0 nova_compute[182935]: 2026-01-22 00:04:33.827 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:33 compute-0 ovn_controller[95047]: 2026-01-22T00:04:33Z|00355|binding|INFO|Releasing lport f7f4d7e4-9841-41f2-85bd-658a3b613e0d from this chassis (sb_readonly=0)
Jan 22 00:04:33 compute-0 nova_compute[182935]: 2026-01-22 00:04:33.828 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.830 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.831 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cf5f930c-1d3d-4c70-8bb6-0068a810bf95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.832 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-397ba44b-e27b-4a2a-a10b-7de0daa31656
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 397ba44b-e27b-4a2a-a10b-7de0daa31656
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:04:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:33.833 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'env', 'PROCESS_TAG=haproxy-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/397ba44b-e27b-4a2a-a10b-7de0daa31656.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:04:33 compute-0 nova_compute[182935]: 2026-01-22 00:04:33.841 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.195 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040274.1946633, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.196 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Started (Lifecycle Event)
Jan 22 00:04:34 compute-0 podman[225948]: 2026-01-22 00:04:34.220133956 +0000 UTC m=+0.050546776 container create ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.231 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.238 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040274.1950586, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.239 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Paused (Lifecycle Event)
Jan 22 00:04:34 compute-0 systemd[1]: Started libpod-conmon-ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e.scope.
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.264 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.270 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:04:34 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:04:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1117c33c56f5422946b5c16f19731b9b59999d46353c7ccfbc7b9f2c22e3f5fd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:04:34 compute-0 podman[225948]: 2026-01-22 00:04:34.19591557 +0000 UTC m=+0.026328420 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:04:34 compute-0 podman[225948]: 2026-01-22 00:04:34.298346092 +0000 UTC m=+0.128758922 container init ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:04:34 compute-0 podman[225948]: 2026-01-22 00:04:34.30396462 +0000 UTC m=+0.134377440 container start ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 00:04:34 compute-0 podman[225965]: 2026-01-22 00:04:34.31733263 +0000 UTC m=+0.060506722 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.317 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:04:34 compute-0 podman[225962]: 2026-01-22 00:04:34.326901514 +0000 UTC m=+0.072000253 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:04:34 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[225966]: [NOTICE]   (226004) : New worker (226009) forked
Jan 22 00:04:34 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[225966]: [NOTICE]   (226004) : Loading success.
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.378 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:34.378 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:04:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:34.379 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:04:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:34.380 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.485 182939 DEBUG nova.compute.manager [req-03469105-fda9-40f1-9401-cae9e45e977f req-ebadbd7a-5d27-4928-b5a5-0a70edc1f826 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.486 182939 DEBUG oslo_concurrency.lockutils [req-03469105-fda9-40f1-9401-cae9e45e977f req-ebadbd7a-5d27-4928-b5a5-0a70edc1f826 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.487 182939 DEBUG oslo_concurrency.lockutils [req-03469105-fda9-40f1-9401-cae9e45e977f req-ebadbd7a-5d27-4928-b5a5-0a70edc1f826 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.487 182939 DEBUG oslo_concurrency.lockutils [req-03469105-fda9-40f1-9401-cae9e45e977f req-ebadbd7a-5d27-4928-b5a5-0a70edc1f826 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.488 182939 DEBUG nova.compute.manager [req-03469105-fda9-40f1-9401-cae9e45e977f req-ebadbd7a-5d27-4928-b5a5-0a70edc1f826 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Processing event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.489 182939 DEBUG nova.compute.manager [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.493 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040274.4923818, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.493 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Resumed (Lifecycle Event)
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.496 182939 DEBUG nova.virt.libvirt.driver [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.499 182939 INFO nova.virt.libvirt.driver [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance spawned successfully.
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.524 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.528 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.571 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.736 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updating instance_info_cache with network_info: [{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.774 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.774 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.774 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.775 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.775 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.809 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.809 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.809 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.810 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.893 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.956 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:34 compute-0 nova_compute[182935]: 2026-01-22 00:04:34.957 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.034 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.183 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.185 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5618MB free_disk=73.09932708740234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.185 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.186 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.271 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.271 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.271 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.289 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.355 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.355 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.376 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.407 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.461 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.551 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.944 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.944 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.758s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.963 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:35 compute-0 nova_compute[182935]: 2026-01-22 00:04:35.963 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.185 182939 DEBUG nova.compute.manager [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.310 182939 DEBUG oslo_concurrency.lockutils [None req-c8b3903d-ffea-4a9e-9e69-8a1bc3fdfcc1 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 18.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.573 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.606 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.641 182939 DEBUG nova.compute.manager [req-67a024bc-f718-40e1-91f8-d902f730a816 req-2ec56b6c-a64b-48e0-aa3d-fab1defc2cef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.642 182939 DEBUG oslo_concurrency.lockutils [req-67a024bc-f718-40e1-91f8-d902f730a816 req-2ec56b6c-a64b-48e0-aa3d-fab1defc2cef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.642 182939 DEBUG oslo_concurrency.lockutils [req-67a024bc-f718-40e1-91f8-d902f730a816 req-2ec56b6c-a64b-48e0-aa3d-fab1defc2cef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.642 182939 DEBUG oslo_concurrency.lockutils [req-67a024bc-f718-40e1-91f8-d902f730a816 req-2ec56b6c-a64b-48e0-aa3d-fab1defc2cef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.642 182939 DEBUG nova.compute.manager [req-67a024bc-f718-40e1-91f8-d902f730a816 req-2ec56b6c-a64b-48e0-aa3d-fab1defc2cef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] No waiting events found dispatching network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.642 182939 WARNING nova.compute.manager [req-67a024bc-f718-40e1-91f8-d902f730a816 req-2ec56b6c-a64b-48e0-aa3d-fab1defc2cef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received unexpected event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for instance with vm_state active and task_state None.
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:36 compute-0 nova_compute[182935]: 2026-01-22 00:04:36.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:38 compute-0 nova_compute[182935]: 2026-01-22 00:04:38.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:41 compute-0 nova_compute[182935]: 2026-01-22 00:04:41.574 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:41 compute-0 nova_compute[182935]: 2026-01-22 00:04:41.608 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:43 compute-0 nova_compute[182935]: 2026-01-22 00:04:43.802 182939 DEBUG nova.objects.instance [None req-666c6877-bc3b-478a-b15f-3524bb781ca6 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:43 compute-0 nova_compute[182935]: 2026-01-22 00:04:43.890 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040283.8902214, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:43 compute-0 nova_compute[182935]: 2026-01-22 00:04:43.891 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Paused (Lifecycle Event)
Jan 22 00:04:43 compute-0 nova_compute[182935]: 2026-01-22 00:04:43.925 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:43 compute-0 nova_compute[182935]: 2026-01-22 00:04:43.929 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:04:43 compute-0 nova_compute[182935]: 2026-01-22 00:04:43.958 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 22 00:04:44 compute-0 kernel: tap8412a083-ca (unregistering): left promiscuous mode
Jan 22 00:04:44 compute-0 NetworkManager[55139]: <info>  [1769040284.5735] device (tap8412a083-ca): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:04:44 compute-0 nova_compute[182935]: 2026-01-22 00:04:44.583 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:44 compute-0 ovn_controller[95047]: 2026-01-22T00:04:44Z|00356|binding|INFO|Releasing lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f from this chassis (sb_readonly=0)
Jan 22 00:04:44 compute-0 ovn_controller[95047]: 2026-01-22T00:04:44Z|00357|binding|INFO|Setting lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f down in Southbound
Jan 22 00:04:44 compute-0 nova_compute[182935]: 2026-01-22 00:04:44.584 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:44 compute-0 ovn_controller[95047]: 2026-01-22T00:04:44Z|00358|binding|INFO|Removing iface tap8412a083-ca ovn-installed in OVS
Jan 22 00:04:44 compute-0 ovn_controller[95047]: 2026-01-22T00:04:44Z|00359|binding|INFO|Releasing lport f7f4d7e4-9841-41f2-85bd-658a3b613e0d from this chassis (sb_readonly=0)
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.595 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:ee:91 10.100.0.3'], port_security=['fa:16:3e:e0:ee:91 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'neutron:revision_number': '9', 'neutron:security_group_ids': '5fb84efc-d0d8-44ae-84e4-97e70d8c202e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10175545-8ba8-4bcf-9e15-f460a54818aa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.597 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f in datapath 397ba44b-e27b-4a2a-a10b-7de0daa31656 unbound from our chassis
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.598 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 397ba44b-e27b-4a2a-a10b-7de0daa31656, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:04:44 compute-0 nova_compute[182935]: 2026-01-22 00:04:44.599 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.600 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[86fedde5-68f4-4326-a3bf-08bca42cbe6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.602 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 namespace which is not needed anymore
Jan 22 00:04:44 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000053.scope: Deactivated successfully.
Jan 22 00:04:44 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000053.scope: Consumed 10.630s CPU time.
Jan 22 00:04:44 compute-0 nova_compute[182935]: 2026-01-22 00:04:44.649 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:44 compute-0 systemd-machined[154182]: Machine qemu-46-instance-00000053 terminated.
Jan 22 00:04:44 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[225966]: [NOTICE]   (226004) : haproxy version is 2.8.14-c23fe91
Jan 22 00:04:44 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[225966]: [NOTICE]   (226004) : path to executable is /usr/sbin/haproxy
Jan 22 00:04:44 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[225966]: [WARNING]  (226004) : Exiting Master process...
Jan 22 00:04:44 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[225966]: [WARNING]  (226004) : Exiting Master process...
Jan 22 00:04:44 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[225966]: [ALERT]    (226004) : Current worker (226009) exited with code 143 (Terminated)
Jan 22 00:04:44 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[225966]: [WARNING]  (226004) : All workers exited. Exiting... (0)
Jan 22 00:04:44 compute-0 systemd[1]: libpod-ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e.scope: Deactivated successfully.
Jan 22 00:04:44 compute-0 podman[226055]: 2026-01-22 00:04:44.775219648 +0000 UTC m=+0.049269074 container died ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:04:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e-userdata-shm.mount: Deactivated successfully.
Jan 22 00:04:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-1117c33c56f5422946b5c16f19731b9b59999d46353c7ccfbc7b9f2c22e3f5fd-merged.mount: Deactivated successfully.
Jan 22 00:04:44 compute-0 podman[226055]: 2026-01-22 00:04:44.810975908 +0000 UTC m=+0.085025324 container cleanup ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:04:44 compute-0 nova_compute[182935]: 2026-01-22 00:04:44.820 182939 DEBUG nova.compute.manager [None req-666c6877-bc3b-478a-b15f-3524bb781ca6 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:44 compute-0 systemd[1]: libpod-conmon-ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e.scope: Deactivated successfully.
Jan 22 00:04:44 compute-0 podman[226104]: 2026-01-22 00:04:44.879557476 +0000 UTC m=+0.040817475 container remove ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.886 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9bebc208-386d-4c32-8b9d-ebdc58c8db6a]: (4, ('Thu Jan 22 12:04:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 (ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e)\nce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e\nThu Jan 22 12:04:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 (ce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e)\nce0a8c90ad0d0e7b02fd0f9182c35f20a43e654799c5e414fd2dbc667cee4d9e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.888 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[10f8f9e7-f8f6-46cc-a3cc-f4e4cb916285]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.889 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap397ba44b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:44 compute-0 nova_compute[182935]: 2026-01-22 00:04:44.891 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:44 compute-0 kernel: tap397ba44b-e0: left promiscuous mode
Jan 22 00:04:44 compute-0 nova_compute[182935]: 2026-01-22 00:04:44.909 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.912 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[da54deb5-f5ad-4f47-8eff-9c2faacecac0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.935 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[69006f75-d77d-46c8-a85e-72b38a26244b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.938 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[647f9b40-3318-40ed-9987-c9dc56461790]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.953 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f927bea8-818f-4af4-9b06-e25039068798]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477628, 'reachable_time': 25605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226123, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d397ba44b\x2de27b\x2d4a2a\x2da10b\x2d7de0daa31656.mount: Deactivated successfully.
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.957 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:04:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:44.958 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[832115e5-c19a-4dba-b5ba-78f50d68c03c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:45 compute-0 nova_compute[182935]: 2026-01-22 00:04:45.001 182939 DEBUG nova.compute.manager [req-fa30dbf7-a1a1-40d0-972a-3f60409f5e13 req-97c6e807-b96e-44df-a8f0-5bb81d522a31 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-unplugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:45 compute-0 nova_compute[182935]: 2026-01-22 00:04:45.001 182939 DEBUG oslo_concurrency.lockutils [req-fa30dbf7-a1a1-40d0-972a-3f60409f5e13 req-97c6e807-b96e-44df-a8f0-5bb81d522a31 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:45 compute-0 nova_compute[182935]: 2026-01-22 00:04:45.001 182939 DEBUG oslo_concurrency.lockutils [req-fa30dbf7-a1a1-40d0-972a-3f60409f5e13 req-97c6e807-b96e-44df-a8f0-5bb81d522a31 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:45 compute-0 nova_compute[182935]: 2026-01-22 00:04:45.002 182939 DEBUG oslo_concurrency.lockutils [req-fa30dbf7-a1a1-40d0-972a-3f60409f5e13 req-97c6e807-b96e-44df-a8f0-5bb81d522a31 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:45 compute-0 nova_compute[182935]: 2026-01-22 00:04:45.002 182939 DEBUG nova.compute.manager [req-fa30dbf7-a1a1-40d0-972a-3f60409f5e13 req-97c6e807-b96e-44df-a8f0-5bb81d522a31 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] No waiting events found dispatching network-vif-unplugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:04:45 compute-0 nova_compute[182935]: 2026-01-22 00:04:45.002 182939 WARNING nova.compute.manager [req-fa30dbf7-a1a1-40d0-972a-3f60409f5e13 req-97c6e807-b96e-44df-a8f0-5bb81d522a31 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received unexpected event network-vif-unplugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for instance with vm_state suspended and task_state None.
Jan 22 00:04:46 compute-0 nova_compute[182935]: 2026-01-22 00:04:46.576 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:46 compute-0 nova_compute[182935]: 2026-01-22 00:04:46.609 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:46 compute-0 podman[226125]: 2026-01-22 00:04:46.699289403 +0000 UTC m=+0.065696679 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:04:46 compute-0 podman[226124]: 2026-01-22 00:04:46.727288772 +0000 UTC m=+0.102418632 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 00:04:46 compute-0 nova_compute[182935]: 2026-01-22 00:04:46.880 182939 INFO nova.compute.manager [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Resuming
Jan 22 00:04:46 compute-0 nova_compute[182935]: 2026-01-22 00:04:46.881 182939 DEBUG nova.objects.instance [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'flavor' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:46 compute-0 nova_compute[182935]: 2026-01-22 00:04:46.939 182939 DEBUG oslo_concurrency.lockutils [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:04:46 compute-0 nova_compute[182935]: 2026-01-22 00:04:46.940 182939 DEBUG oslo_concurrency.lockutils [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquired lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:04:46 compute-0 nova_compute[182935]: 2026-01-22 00:04:46.940 182939 DEBUG nova.network.neutron [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:04:47 compute-0 nova_compute[182935]: 2026-01-22 00:04:47.225 182939 DEBUG nova.compute.manager [req-1d46f1ea-2a08-47bb-afa8-6f5912085fad req-0ae3ebd4-5cf7-4e91-b382-3c3e3fe315d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:47 compute-0 nova_compute[182935]: 2026-01-22 00:04:47.225 182939 DEBUG oslo_concurrency.lockutils [req-1d46f1ea-2a08-47bb-afa8-6f5912085fad req-0ae3ebd4-5cf7-4e91-b382-3c3e3fe315d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:47 compute-0 nova_compute[182935]: 2026-01-22 00:04:47.225 182939 DEBUG oslo_concurrency.lockutils [req-1d46f1ea-2a08-47bb-afa8-6f5912085fad req-0ae3ebd4-5cf7-4e91-b382-3c3e3fe315d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:47 compute-0 nova_compute[182935]: 2026-01-22 00:04:47.226 182939 DEBUG oslo_concurrency.lockutils [req-1d46f1ea-2a08-47bb-afa8-6f5912085fad req-0ae3ebd4-5cf7-4e91-b382-3c3e3fe315d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:47 compute-0 nova_compute[182935]: 2026-01-22 00:04:47.226 182939 DEBUG nova.compute.manager [req-1d46f1ea-2a08-47bb-afa8-6f5912085fad req-0ae3ebd4-5cf7-4e91-b382-3c3e3fe315d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] No waiting events found dispatching network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:04:47 compute-0 nova_compute[182935]: 2026-01-22 00:04:47.226 182939 WARNING nova.compute.manager [req-1d46f1ea-2a08-47bb-afa8-6f5912085fad req-0ae3ebd4-5cf7-4e91-b382-3c3e3fe315d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received unexpected event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for instance with vm_state suspended and task_state resuming.
Jan 22 00:04:48 compute-0 nova_compute[182935]: 2026-01-22 00:04:48.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.886 182939 DEBUG nova.network.neutron [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updating instance_info_cache with network_info: [{"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.904 182939 DEBUG oslo_concurrency.lockutils [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Releasing lock "refresh_cache-2cb6e3d6-f22a-49ea-aab8-900dd88605e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.909 182939 DEBUG nova.virt.libvirt.vif [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1381246704',display_name='tempest-ServersNegativeTestJSON-server-1381246704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1381246704',id=83,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:04:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='a7e425a4d1854533a17d5f0dcd9d87b9',ramdisk_id='',reservation_id='r-37fe06tc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServersNegativeTestJSON-1689661',owner_user_name='tempest-ServersNegativeTestJSON-1689661-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:04:44Z,user_data=None,user_id='531ec5a088a94b78af6e2c3feda17c0c',uuid=2cb6e3d6-f22a-49ea-aab8-900dd88605e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.910 182939 DEBUG nova.network.os_vif_util [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converting VIF {"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.910 182939 DEBUG nova.network.os_vif_util [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.911 182939 DEBUG os_vif [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.911 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.912 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.912 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.916 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.917 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8412a083-ca, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.917 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8412a083-ca, col_values=(('external_ids', {'iface-id': '8412a083-ca97-4457-bb0e-9c7bcd8bfb2f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:ee:91', 'vm-uuid': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.917 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.918 182939 INFO os_vif [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca')
Jan 22 00:04:49 compute-0 nova_compute[182935]: 2026-01-22 00:04:49.940 182939 DEBUG nova.objects.instance [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:50 compute-0 kernel: tap8412a083-ca: entered promiscuous mode
Jan 22 00:04:50 compute-0 NetworkManager[55139]: <info>  [1769040290.0101] manager: (tap8412a083-ca): new Tun device (/org/freedesktop/NetworkManager/Devices/161)
Jan 22 00:04:50 compute-0 ovn_controller[95047]: 2026-01-22T00:04:50Z|00360|binding|INFO|Claiming lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for this chassis.
Jan 22 00:04:50 compute-0 ovn_controller[95047]: 2026-01-22T00:04:50Z|00361|binding|INFO|8412a083-ca97-4457-bb0e-9c7bcd8bfb2f: Claiming fa:16:3e:e0:ee:91 10.100.0.3
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.009 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.013 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.022 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:ee:91 10.100.0.3'], port_security=['fa:16:3e:e0:ee:91 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'neutron:revision_number': '10', 'neutron:security_group_ids': '5fb84efc-d0d8-44ae-84e4-97e70d8c202e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10175545-8ba8-4bcf-9e15-f460a54818aa, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:04:50 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.024 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f in datapath 397ba44b-e27b-4a2a-a10b-7de0daa31656 bound to our chassis
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.025 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 397ba44b-e27b-4a2a-a10b-7de0daa31656
Jan 22 00:04:50 compute-0 systemd-udevd[226187]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.039 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[472543ba-7fa1-45d1-9b69-9bceb97ebfa0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.040 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap397ba44b-e1 in ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.042 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap397ba44b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.043 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[277a90de-b5be-4a35-b079-e61648aa1b5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.044 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[58944fc8-1821-4f9d-bebd-38373d4cd3ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 NetworkManager[55139]: <info>  [1769040290.0589] device (tap8412a083-ca): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:04:50 compute-0 NetworkManager[55139]: <info>  [1769040290.0601] device (tap8412a083-ca): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:04:50 compute-0 systemd-machined[154182]: New machine qemu-47-instance-00000053.
Jan 22 00:04:50 compute-0 ovn_controller[95047]: 2026-01-22T00:04:50Z|00362|binding|INFO|Setting lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f ovn-installed in OVS
Jan 22 00:04:50 compute-0 ovn_controller[95047]: 2026-01-22T00:04:50Z|00363|binding|INFO|Setting lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f up in Southbound
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.065 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[ad36e4ef-ecc7-4f94-b348-31e3d9ca3761]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.068 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.069 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:50 compute-0 systemd[1]: Started Virtual Machine qemu-47-instance-00000053.
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.096 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e55127-230e-4e3d-a906-53b77744edf9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.131 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[47383ca4-aea9-4d08-bf02-8e91ee00bab9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.137 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ed87b262-41d5-4101-85ca-0c5d3201e92e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 NetworkManager[55139]: <info>  [1769040290.1394] manager: (tap397ba44b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/162)
Jan 22 00:04:50 compute-0 systemd-udevd[226191]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.175 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[dd2d39f5-53e8-41f4-80f4-4cb0222721e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.179 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[bb063dd4-a1c4-4148-a457-158a7d902d88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 NetworkManager[55139]: <info>  [1769040290.2056] device (tap397ba44b-e0): carrier: link connected
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.210 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[42c222cc-eee0-4b20-b3e3-0d662c76a00a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.230 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f48ed109-4fd8-4731-bbe5-82fed3ca70dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap397ba44b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:12:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479294, 'reachable_time': 20058, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226220, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.244 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c15222c7-5c82-404d-910a-74fdd5a67f93]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:12aa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 479294, 'tstamp': 479294}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226221, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.265 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[59f3f9f1-356d-4217-996c-0af8957f0408]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap397ba44b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:12:aa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479294, 'reachable_time': 20058, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226222, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.297 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a25fe038-0589-46e7-87ba-97155761d7b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.381 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ec0191df-bc0e-43e2-a60f-43af14901677]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.383 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap397ba44b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.383 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.384 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap397ba44b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:50 compute-0 NetworkManager[55139]: <info>  [1769040290.4205] manager: (tap397ba44b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Jan 22 00:04:50 compute-0 kernel: tap397ba44b-e0: entered promiscuous mode
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.422 182939 DEBUG nova.compute.manager [req-04f438c8-36a4-4153-b7f7-19141e2a4800 req-05d2a7a0-a10d-4d30-a886-b9b8f74210e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.422 182939 DEBUG oslo_concurrency.lockutils [req-04f438c8-36a4-4153-b7f7-19141e2a4800 req-05d2a7a0-a10d-4d30-a886-b9b8f74210e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.422 182939 DEBUG oslo_concurrency.lockutils [req-04f438c8-36a4-4153-b7f7-19141e2a4800 req-05d2a7a0-a10d-4d30-a886-b9b8f74210e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.423 182939 DEBUG oslo_concurrency.lockutils [req-04f438c8-36a4-4153-b7f7-19141e2a4800 req-05d2a7a0-a10d-4d30-a886-b9b8f74210e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.423 182939 DEBUG nova.compute.manager [req-04f438c8-36a4-4153-b7f7-19141e2a4800 req-05d2a7a0-a10d-4d30-a886-b9b8f74210e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] No waiting events found dispatching network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.423 182939 WARNING nova.compute.manager [req-04f438c8-36a4-4153-b7f7-19141e2a4800 req-05d2a7a0-a10d-4d30-a886-b9b8f74210e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received unexpected event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for instance with vm_state suspended and task_state resuming.
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.424 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.424 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap397ba44b-e0, col_values=(('external_ids', {'iface-id': 'f7f4d7e4-9841-41f2-85bd-658a3b613e0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:04:50 compute-0 ovn_controller[95047]: 2026-01-22T00:04:50Z|00364|binding|INFO|Releasing lport f7f4d7e4-9841-41f2-85bd-658a3b613e0d from this chassis (sb_readonly=0)
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.439 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.441 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.442 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5eed33d1-93a9-490d-892d-46abcbcc0e2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.443 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-397ba44b-e27b-4a2a-a10b-7de0daa31656
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/397ba44b-e27b-4a2a-a10b-7de0daa31656.pid.haproxy
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 397ba44b-e27b-4a2a-a10b-7de0daa31656
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:04:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:04:50.444 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'env', 'PROCESS_TAG=haproxy-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/397ba44b-e27b-4a2a-a10b-7de0daa31656.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.598 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.599 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040290.59789, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.599 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Started (Lifecycle Event)
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.626 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.645 182939 DEBUG nova.compute.manager [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.646 182939 DEBUG nova.objects.instance [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.649 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.678 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.679 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040290.6097443, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.679 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Resumed (Lifecycle Event)
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.681 182939 INFO nova.virt.libvirt.driver [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance running successfully.
Jan 22 00:04:50 compute-0 virtqemud[182477]: argument unsupported: QEMU guest agent is not configured
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.686 182939 DEBUG nova.virt.libvirt.guest [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.687 182939 DEBUG nova.compute.manager [None req-e65c4ba4-8aa4-4ba1-80e6-5944abd84f6e 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.712 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.716 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:04:50 compute-0 nova_compute[182935]: 2026-01-22 00:04:50.750 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 22 00:04:50 compute-0 podman[226261]: 2026-01-22 00:04:50.879863039 +0000 UTC m=+0.067739729 container create 734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:04:50 compute-0 podman[226261]: 2026-01-22 00:04:50.841187617 +0000 UTC m=+0.029064287 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:04:50 compute-0 systemd[1]: Started libpod-conmon-734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660.scope.
Jan 22 00:04:50 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:04:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79d0dfe93c2239cf00ace3011944ae6aae0518d4e7608a02a71684f8be338f06/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:04:51 compute-0 podman[226261]: 2026-01-22 00:04:51.011459439 +0000 UTC m=+0.199336179 container init 734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 00:04:51 compute-0 podman[226261]: 2026-01-22 00:04:51.01716634 +0000 UTC m=+0.205043030 container start 734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 00:04:51 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[226276]: [NOTICE]   (226280) : New worker (226282) forked
Jan 22 00:04:51 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[226276]: [NOTICE]   (226280) : Loading success.
Jan 22 00:04:51 compute-0 nova_compute[182935]: 2026-01-22 00:04:51.579 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:51 compute-0 nova_compute[182935]: 2026-01-22 00:04:51.612 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:52 compute-0 nova_compute[182935]: 2026-01-22 00:04:52.520 182939 DEBUG nova.compute.manager [req-c4dec85f-2ff5-43c5-ba93-93508af5967e req-4a8f0bc1-b9b8-43a9-a77a-cdafd0ed6f16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:04:52 compute-0 nova_compute[182935]: 2026-01-22 00:04:52.520 182939 DEBUG oslo_concurrency.lockutils [req-c4dec85f-2ff5-43c5-ba93-93508af5967e req-4a8f0bc1-b9b8-43a9-a77a-cdafd0ed6f16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:04:52 compute-0 nova_compute[182935]: 2026-01-22 00:04:52.520 182939 DEBUG oslo_concurrency.lockutils [req-c4dec85f-2ff5-43c5-ba93-93508af5967e req-4a8f0bc1-b9b8-43a9-a77a-cdafd0ed6f16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:04:52 compute-0 nova_compute[182935]: 2026-01-22 00:04:52.521 182939 DEBUG oslo_concurrency.lockutils [req-c4dec85f-2ff5-43c5-ba93-93508af5967e req-4a8f0bc1-b9b8-43a9-a77a-cdafd0ed6f16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:04:52 compute-0 nova_compute[182935]: 2026-01-22 00:04:52.521 182939 DEBUG nova.compute.manager [req-c4dec85f-2ff5-43c5-ba93-93508af5967e req-4a8f0bc1-b9b8-43a9-a77a-cdafd0ed6f16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] No waiting events found dispatching network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:04:52 compute-0 nova_compute[182935]: 2026-01-22 00:04:52.521 182939 WARNING nova.compute.manager [req-c4dec85f-2ff5-43c5-ba93-93508af5967e req-4a8f0bc1-b9b8-43a9-a77a-cdafd0ed6f16 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received unexpected event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for instance with vm_state active and task_state None.
Jan 22 00:04:53 compute-0 ovn_controller[95047]: 2026-01-22T00:04:53Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e0:ee:91 10.100.0.3
Jan 22 00:04:54 compute-0 podman[226301]: 2026-01-22 00:04:54.704704606 +0000 UTC m=+0.078692429 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:04:56 compute-0 nova_compute[182935]: 2026-01-22 00:04:56.581 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:56 compute-0 nova_compute[182935]: 2026-01-22 00:04:56.614 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:04:59 compute-0 podman[226324]: 2026-01-22 00:04:59.673708236 +0000 UTC m=+0.049156042 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:05:01 compute-0 anacron[30915]: Job `cron.monthly' started
Jan 22 00:05:01 compute-0 anacron[30915]: Job `cron.monthly' terminated
Jan 22 00:05:01 compute-0 anacron[30915]: Normal exit (3 jobs run)
Jan 22 00:05:01 compute-0 nova_compute[182935]: 2026-01-22 00:05:01.584 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:01 compute-0 nova_compute[182935]: 2026-01-22 00:05:01.616 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:03.199 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:03.200 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:03.201 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:04 compute-0 sshd-session[226346]: Invalid user svn from 188.166.69.60 port 48780
Jan 22 00:05:04 compute-0 podman[226349]: 2026-01-22 00:05:04.434089467 +0000 UTC m=+0.066044447 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 00:05:04 compute-0 podman[226348]: 2026-01-22 00:05:04.456669464 +0000 UTC m=+0.089888395 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, name=ubi9-minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6)
Jan 22 00:05:04 compute-0 sshd-session[226346]: Connection closed by invalid user svn 188.166.69.60 port 48780 [preauth]
Jan 22 00:05:06 compute-0 nova_compute[182935]: 2026-01-22 00:05:06.586 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:06 compute-0 nova_compute[182935]: 2026-01-22 00:05:06.617 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.525 182939 DEBUG oslo_concurrency.lockutils [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.525 182939 DEBUG oslo_concurrency.lockutils [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.526 182939 DEBUG oslo_concurrency.lockutils [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.526 182939 DEBUG oslo_concurrency.lockutils [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.526 182939 DEBUG oslo_concurrency.lockutils [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.538 182939 INFO nova.compute.manager [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Terminating instance
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.548 182939 DEBUG nova.compute.manager [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:05:11 compute-0 kernel: tap8412a083-ca (unregistering): left promiscuous mode
Jan 22 00:05:11 compute-0 NetworkManager[55139]: <info>  [1769040311.5811] device (tap8412a083-ca): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.587 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:11 compute-0 ovn_controller[95047]: 2026-01-22T00:05:11Z|00365|binding|INFO|Releasing lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f from this chassis (sb_readonly=0)
Jan 22 00:05:11 compute-0 ovn_controller[95047]: 2026-01-22T00:05:11Z|00366|binding|INFO|Setting lport 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f down in Southbound
Jan 22 00:05:11 compute-0 ovn_controller[95047]: 2026-01-22T00:05:11Z|00367|binding|INFO|Removing iface tap8412a083-ca ovn-installed in OVS
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.590 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.599 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:ee:91 10.100.0.3'], port_security=['fa:16:3e:e0:ee:91 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2cb6e3d6-f22a-49ea-aab8-900dd88605e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a7e425a4d1854533a17d5f0dcd9d87b9', 'neutron:revision_number': '11', 'neutron:security_group_ids': '5fb84efc-d0d8-44ae-84e4-97e70d8c202e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10175545-8ba8-4bcf-9e15-f460a54818aa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.600 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 8412a083-ca97-4457-bb0e-9c7bcd8bfb2f in datapath 397ba44b-e27b-4a2a-a10b-7de0daa31656 unbound from our chassis
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.602 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 397ba44b-e27b-4a2a-a10b-7de0daa31656, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.603 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[49f821ee-5f3d-4f4e-b034-da87928aa94e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.605 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 namespace which is not needed anymore
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.606 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.618 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:11 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000053.scope: Deactivated successfully.
Jan 22 00:05:11 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000053.scope: Consumed 3.694s CPU time.
Jan 22 00:05:11 compute-0 systemd-machined[154182]: Machine qemu-47-instance-00000053 terminated.
Jan 22 00:05:11 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[226276]: [NOTICE]   (226280) : haproxy version is 2.8.14-c23fe91
Jan 22 00:05:11 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[226276]: [NOTICE]   (226280) : path to executable is /usr/sbin/haproxy
Jan 22 00:05:11 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[226276]: [WARNING]  (226280) : Exiting Master process...
Jan 22 00:05:11 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[226276]: [ALERT]    (226280) : Current worker (226282) exited with code 143 (Terminated)
Jan 22 00:05:11 compute-0 neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656[226276]: [WARNING]  (226280) : All workers exited. Exiting... (0)
Jan 22 00:05:11 compute-0 systemd[1]: libpod-734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660.scope: Deactivated successfully.
Jan 22 00:05:11 compute-0 podman[226410]: 2026-01-22 00:05:11.732299296 +0000 UTC m=+0.046431034 container died 734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:05:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660-userdata-shm.mount: Deactivated successfully.
Jan 22 00:05:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-79d0dfe93c2239cf00ace3011944ae6aae0518d4e7608a02a71684f8be338f06-merged.mount: Deactivated successfully.
Jan 22 00:05:11 compute-0 podman[226410]: 2026-01-22 00:05:11.769362848 +0000 UTC m=+0.083494586 container cleanup 734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:05:11 compute-0 systemd[1]: libpod-conmon-734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660.scope: Deactivated successfully.
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.843 182939 INFO nova.virt.libvirt.driver [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Instance destroyed successfully.
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.844 182939 DEBUG nova.objects.instance [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lazy-loading 'resources' on Instance uuid 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:11 compute-0 podman[226445]: 2026-01-22 00:05:11.866924061 +0000 UTC m=+0.051307545 container remove 734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.873 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4ef2d271-fc6e-4998-a83a-c1e18224b87b]: (4, ('Thu Jan 22 12:05:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 (734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660)\n734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660\nThu Jan 22 12:05:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 (734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660)\n734681380a2323c5a33178f069c915888fc77d2c0c4e966af605239e30eff660\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.876 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7e9b4ee4-02b2-4377-b3a3-86bdf7fb6352]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.877 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap397ba44b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:11 compute-0 kernel: tap397ba44b-e0: left promiscuous mode
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.880 182939 DEBUG nova.virt.libvirt.vif [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:01:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1381246704',display_name='tempest-ServersNegativeTestJSON-server-1381246704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1381246704',id=83,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:04:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a7e425a4d1854533a17d5f0dcd9d87b9',ramdisk_id='',reservation_id='r-37fe06tc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1689661',owner_user_name='tempest-ServersNegativeTestJSON-1689661-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:04:50Z,user_data=None,user_id='531ec5a088a94b78af6e2c3feda17c0c',uuid=2cb6e3d6-f22a-49ea-aab8-900dd88605e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.881 182939 DEBUG nova.network.os_vif_util [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converting VIF {"id": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "address": "fa:16:3e:e0:ee:91", "network": {"id": "397ba44b-e27b-4a2a-a10b-7de0daa31656", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1751919755-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a7e425a4d1854533a17d5f0dcd9d87b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8412a083-ca", "ovs_interfaceid": "8412a083-ca97-4457-bb0e-9c7bcd8bfb2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.882 182939 DEBUG nova.network.os_vif_util [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.882 182939 DEBUG os_vif [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.884 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.885 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8412a083-ca, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.886 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.886 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.888 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.895 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.896 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.898 182939 INFO os_vif [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:ee:91,bridge_name='br-int',has_traffic_filtering=True,id=8412a083-ca97-4457-bb0e-9c7bcd8bfb2f,network=Network(397ba44b-e27b-4a2a-a10b-7de0daa31656),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8412a083-ca')
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.898 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ffab5546-9770-4a18-ba01-710308588277]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.899 182939 INFO nova.virt.libvirt.driver [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Deleting instance files /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9_del
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.907 182939 INFO nova.virt.libvirt.driver [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Deletion of /var/lib/nova/instances/2cb6e3d6-f22a-49ea-aab8-900dd88605e9_del complete
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.918 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[072f0b66-72d6-47fc-84b7-d1152e5cb105]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.919 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e2b36060-c404-4c1e-a4a8-737bf6616276]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.935 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cd695ad7-9b19-4cab-b585-9d07db364fbf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 479286, 'reachable_time': 24434, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226471, 'error': None, 'target': 'ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:11 compute-0 systemd[1]: run-netns-ovnmeta\x2d397ba44b\x2de27b\x2d4a2a\x2da10b\x2d7de0daa31656.mount: Deactivated successfully.
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.940 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-397ba44b-e27b-4a2a-a10b-7de0daa31656 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:05:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:11.940 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[d056fbdc-a4b6-4f00-b37a-6730901e6273]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.973 182939 DEBUG nova.compute.manager [req-f4ed836b-1617-47d7-8e80-e4f435673537 req-9e9bdbdf-7df7-449b-bf5d-56aa3e0713f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-unplugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.974 182939 DEBUG oslo_concurrency.lockutils [req-f4ed836b-1617-47d7-8e80-e4f435673537 req-9e9bdbdf-7df7-449b-bf5d-56aa3e0713f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.974 182939 DEBUG oslo_concurrency.lockutils [req-f4ed836b-1617-47d7-8e80-e4f435673537 req-9e9bdbdf-7df7-449b-bf5d-56aa3e0713f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.974 182939 DEBUG oslo_concurrency.lockutils [req-f4ed836b-1617-47d7-8e80-e4f435673537 req-9e9bdbdf-7df7-449b-bf5d-56aa3e0713f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.975 182939 DEBUG nova.compute.manager [req-f4ed836b-1617-47d7-8e80-e4f435673537 req-9e9bdbdf-7df7-449b-bf5d-56aa3e0713f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] No waiting events found dispatching network-vif-unplugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:05:11 compute-0 nova_compute[182935]: 2026-01-22 00:05:11.975 182939 DEBUG nova.compute.manager [req-f4ed836b-1617-47d7-8e80-e4f435673537 req-9e9bdbdf-7df7-449b-bf5d-56aa3e0713f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-unplugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:05:12 compute-0 nova_compute[182935]: 2026-01-22 00:05:12.003 182939 INFO nova.compute.manager [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 22 00:05:12 compute-0 nova_compute[182935]: 2026-01-22 00:05:12.003 182939 DEBUG oslo.service.loopingcall [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:05:12 compute-0 nova_compute[182935]: 2026-01-22 00:05:12.004 182939 DEBUG nova.compute.manager [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:05:12 compute-0 nova_compute[182935]: 2026-01-22 00:05:12.004 182939 DEBUG nova.network.neutron [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:05:12 compute-0 nova_compute[182935]: 2026-01-22 00:05:12.833 182939 DEBUG nova.network.neutron [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:05:12 compute-0 nova_compute[182935]: 2026-01-22 00:05:12.862 182939 INFO nova.compute.manager [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Took 0.86 seconds to deallocate network for instance.
Jan 22 00:05:12 compute-0 nova_compute[182935]: 2026-01-22 00:05:12.956 182939 DEBUG oslo_concurrency.lockutils [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:12 compute-0 nova_compute[182935]: 2026-01-22 00:05:12.956 182939 DEBUG oslo_concurrency.lockutils [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:13 compute-0 nova_compute[182935]: 2026-01-22 00:05:13.016 182939 DEBUG nova.compute.manager [req-d82fe9c6-79ab-42c9-a357-22bc3d5abc58 req-2f0bafff-7ec6-4e86-9548-1a426085012d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-deleted-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:13 compute-0 nova_compute[182935]: 2026-01-22 00:05:13.023 182939 DEBUG nova.compute.provider_tree [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:05:13 compute-0 nova_compute[182935]: 2026-01-22 00:05:13.039 182939 DEBUG nova.scheduler.client.report [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:05:13 compute-0 nova_compute[182935]: 2026-01-22 00:05:13.059 182939 DEBUG oslo_concurrency.lockutils [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:13 compute-0 nova_compute[182935]: 2026-01-22 00:05:13.094 182939 INFO nova.scheduler.client.report [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Deleted allocations for instance 2cb6e3d6-f22a-49ea-aab8-900dd88605e9
Jan 22 00:05:13 compute-0 nova_compute[182935]: 2026-01-22 00:05:13.220 182939 DEBUG oslo_concurrency.lockutils [None req-62a71b33-0f52-40b7-8581-d26bf8d345c2 531ec5a088a94b78af6e2c3feda17c0c a7e425a4d1854533a17d5f0dcd9d87b9 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:14 compute-0 nova_compute[182935]: 2026-01-22 00:05:14.060 182939 DEBUG nova.compute.manager [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:14 compute-0 nova_compute[182935]: 2026-01-22 00:05:14.061 182939 DEBUG oslo_concurrency.lockutils [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:14 compute-0 nova_compute[182935]: 2026-01-22 00:05:14.061 182939 DEBUG oslo_concurrency.lockutils [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:14 compute-0 nova_compute[182935]: 2026-01-22 00:05:14.061 182939 DEBUG oslo_concurrency.lockutils [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "2cb6e3d6-f22a-49ea-aab8-900dd88605e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:14 compute-0 nova_compute[182935]: 2026-01-22 00:05:14.062 182939 DEBUG nova.compute.manager [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] No waiting events found dispatching network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:05:14 compute-0 nova_compute[182935]: 2026-01-22 00:05:14.062 182939 WARNING nova.compute.manager [req-fed6f15e-70df-43bd-8a9c-10d6f85d9f8c req-afa4ff64-66a5-42d5-a96a-6c6a09cf6bb0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Received unexpected event network-vif-plugged-8412a083-ca97-4457-bb0e-9c7bcd8bfb2f for instance with vm_state deleted and task_state None.
Jan 22 00:05:16 compute-0 nova_compute[182935]: 2026-01-22 00:05:16.606 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:16 compute-0 nova_compute[182935]: 2026-01-22 00:05:16.908 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:17 compute-0 podman[226473]: 2026-01-22 00:05:17.695986616 +0000 UTC m=+0.062550000 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:05:17 compute-0 podman[226472]: 2026-01-22 00:05:17.712615616 +0000 UTC m=+0.087615798 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 22 00:05:18 compute-0 nova_compute[182935]: 2026-01-22 00:05:18.802 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:21 compute-0 nova_compute[182935]: 2026-01-22 00:05:21.608 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:21 compute-0 nova_compute[182935]: 2026-01-22 00:05:21.909 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:25 compute-0 podman[226522]: 2026-01-22 00:05:25.682119694 +0000 UTC m=+0.058122121 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:05:26 compute-0 nova_compute[182935]: 2026-01-22 00:05:26.610 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:26 compute-0 nova_compute[182935]: 2026-01-22 00:05:26.842 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040311.8412657, 2cb6e3d6-f22a-49ea-aab8-900dd88605e9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:05:26 compute-0 nova_compute[182935]: 2026-01-22 00:05:26.842 182939 INFO nova.compute.manager [-] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] VM Stopped (Lifecycle Event)
Jan 22 00:05:26 compute-0 nova_compute[182935]: 2026-01-22 00:05:26.865 182939 DEBUG nova.compute.manager [None req-210ab308-f186-4737-b663-751ef681688b - - - - - -] [instance: 2cb6e3d6-f22a-49ea-aab8-900dd88605e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:26 compute-0 nova_compute[182935]: 2026-01-22 00:05:26.913 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:29 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:29.220 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:05:29 compute-0 nova_compute[182935]: 2026-01-22 00:05:29.221 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:29 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:29.222 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:05:30 compute-0 podman[226548]: 2026-01-22 00:05:30.707238094 +0000 UTC m=+0.066512968 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:05:30 compute-0 nova_compute[182935]: 2026-01-22 00:05:30.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:30 compute-0 nova_compute[182935]: 2026-01-22 00:05:30.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:05:30 compute-0 nova_compute[182935]: 2026-01-22 00:05:30.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:05:30 compute-0 nova_compute[182935]: 2026-01-22 00:05:30.809 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:05:31 compute-0 nova_compute[182935]: 2026-01-22 00:05:31.612 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:31 compute-0 nova_compute[182935]: 2026-01-22 00:05:31.914 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:33.224 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.297 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Acquiring lock "23a189a6-71c2-49b7-b4b9-57715325d51e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.297 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.319 182939 DEBUG nova.compute.manager [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.425 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.426 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.439 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.439 182939 INFO nova.compute.claims [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.445 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.445 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.469 182939 DEBUG nova.compute.manager [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.607 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.630 182939 DEBUG nova.compute.provider_tree [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.645 182939 DEBUG nova.scheduler.client.report [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.679 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.680 182939 DEBUG nova.compute.manager [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.685 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.691 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.691 182939 INFO nova.compute.claims [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.770 182939 DEBUG nova.compute.manager [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.770 182939 DEBUG nova.network.neutron [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.793 182939 INFO nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.824 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.825 182939 DEBUG nova.compute.manager [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.904 182939 DEBUG nova.compute.provider_tree [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.937 182939 DEBUG nova.scheduler.client.report [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.983 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.984 182939 DEBUG nova.compute.manager [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.986 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.986 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.987 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.988 182939 DEBUG nova.compute.manager [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.989 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.989 182939 INFO nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Creating image(s)
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.990 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Acquiring lock "/var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.990 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "/var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:33 compute-0 nova_compute[182935]: 2026-01-22 00:05:33.991 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "/var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.004 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.048 182939 DEBUG nova.compute.manager [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.049 182939 DEBUG nova.network.neutron [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.075 182939 INFO nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.078 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.079 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.080 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.092 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.115 182939 DEBUG nova.compute.manager [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.164 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.164 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.202 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.203 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.203 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.254 182939 DEBUG nova.compute.manager [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.256 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.256 182939 INFO nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Creating image(s)
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.257 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.257 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.258 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.270 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.289 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.291 182939 DEBUG nova.virt.disk.api [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Checking if we can resize image /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.291 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.330 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.331 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.332 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.342 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.365 182939 DEBUG nova.policy [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.369 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.371 182939 DEBUG nova.virt.disk.api [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Cannot resize image /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.371 182939 DEBUG nova.objects.instance [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lazy-loading 'migration_context' on Instance uuid 23a189a6-71c2-49b7-b4b9-57715325d51e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.385 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.386 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Ensure instance console log exists: /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.386 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.387 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.387 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.397 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.399 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5734MB free_disk=73.12837219238281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.399 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.399 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.401 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.401 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.436 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.437 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.438 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.520 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.521 182939 DEBUG nova.virt.disk.api [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Checking if we can resize image /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.521 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.543 182939 DEBUG nova.policy [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b8f4c36c45874f0cb983bb4c419457b9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd7a87111c83c49e3a84542174682a417', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.546 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 23a189a6-71c2-49b7-b4b9-57715325d51e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.547 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance a642c12c-c01b-41e1-8377-aae56b8d6493 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.547 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.547 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.578 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.579 182939 DEBUG nova.virt.disk.api [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Cannot resize image /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.579 182939 DEBUG nova.objects.instance [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'migration_context' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.593 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.594 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Ensure instance console log exists: /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.594 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.595 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.595 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.619 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.637 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.661 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:05:34 compute-0 nova_compute[182935]: 2026-01-22 00:05:34.662 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:34 compute-0 podman[226600]: 2026-01-22 00:05:34.68113871 +0000 UTC m=+0.052668907 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Jan 22 00:05:34 compute-0 podman[226601]: 2026-01-22 00:05:34.688399459 +0000 UTC m=+0.057406025 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:05:35 compute-0 nova_compute[182935]: 2026-01-22 00:05:35.660 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:35 compute-0 nova_compute[182935]: 2026-01-22 00:05:35.660 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:05:36 compute-0 nova_compute[182935]: 2026-01-22 00:05:36.613 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:36 compute-0 nova_compute[182935]: 2026-01-22 00:05:36.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:36 compute-0 nova_compute[182935]: 2026-01-22 00:05:36.951 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:37 compute-0 nova_compute[182935]: 2026-01-22 00:05:37.052 182939 DEBUG nova.network.neutron [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Successfully created port: 443c4c41-63d8-47ff-a528-5bf1231445e4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:05:37 compute-0 nova_compute[182935]: 2026-01-22 00:05:37.371 182939 DEBUG nova.network.neutron [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Successfully created port: d0d81653-4a8c-419b-805e-125ec18decf4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:05:37 compute-0 nova_compute[182935]: 2026-01-22 00:05:37.919 182939 DEBUG nova.network.neutron [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Successfully updated port: 443c4c41-63d8-47ff-a528-5bf1231445e4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:05:37 compute-0 nova_compute[182935]: 2026-01-22 00:05:37.934 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "refresh_cache-a642c12c-c01b-41e1-8377-aae56b8d6493" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:05:37 compute-0 nova_compute[182935]: 2026-01-22 00:05:37.935 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquired lock "refresh_cache-a642c12c-c01b-41e1-8377-aae56b8d6493" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:05:37 compute-0 nova_compute[182935]: 2026-01-22 00:05:37.935 182939 DEBUG nova.network.neutron [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.130 182939 DEBUG nova.compute.manager [req-87e5c61d-4989-44e1-9f01-11f22b823ca8 req-f3911a61-1d18-44ca-856a-5b6ab70fdde0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received event network-changed-443c4c41-63d8-47ff-a528-5bf1231445e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.130 182939 DEBUG nova.compute.manager [req-87e5c61d-4989-44e1-9f01-11f22b823ca8 req-f3911a61-1d18-44ca-856a-5b6ab70fdde0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Refreshing instance network info cache due to event network-changed-443c4c41-63d8-47ff-a528-5bf1231445e4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.130 182939 DEBUG oslo_concurrency.lockutils [req-87e5c61d-4989-44e1-9f01-11f22b823ca8 req-f3911a61-1d18-44ca-856a-5b6ab70fdde0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-a642c12c-c01b-41e1-8377-aae56b8d6493" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.225 182939 DEBUG nova.network.neutron [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.291 182939 DEBUG nova.network.neutron [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Successfully updated port: d0d81653-4a8c-419b-805e-125ec18decf4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.310 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Acquiring lock "refresh_cache-23a189a6-71c2-49b7-b4b9-57715325d51e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.311 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Acquired lock "refresh_cache-23a189a6-71c2-49b7-b4b9-57715325d51e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.311 182939 DEBUG nova.network.neutron [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.422 182939 DEBUG nova.compute.manager [req-b8154cdb-1829-48a7-9c03-4f56b0e7c4bb req-bb5dc389-e9b4-40dc-af86-ffc1b309d120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Received event network-changed-d0d81653-4a8c-419b-805e-125ec18decf4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.422 182939 DEBUG nova.compute.manager [req-b8154cdb-1829-48a7-9c03-4f56b0e7c4bb req-bb5dc389-e9b4-40dc-af86-ffc1b309d120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Refreshing instance network info cache due to event network-changed-d0d81653-4a8c-419b-805e-125ec18decf4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.422 182939 DEBUG oslo_concurrency.lockutils [req-b8154cdb-1829-48a7-9c03-4f56b0e7c4bb req-bb5dc389-e9b4-40dc-af86-ffc1b309d120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-23a189a6-71c2-49b7-b4b9-57715325d51e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.513 182939 DEBUG nova.network.neutron [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.787 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:38 compute-0 nova_compute[182935]: 2026-01-22 00:05:38.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.839 182939 DEBUG nova.network.neutron [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Updating instance_info_cache with network_info: [{"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.876 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Releasing lock "refresh_cache-a642c12c-c01b-41e1-8377-aae56b8d6493" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.877 182939 DEBUG nova.compute.manager [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance network_info: |[{"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.877 182939 DEBUG oslo_concurrency.lockutils [req-87e5c61d-4989-44e1-9f01-11f22b823ca8 req-f3911a61-1d18-44ca-856a-5b6ab70fdde0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-a642c12c-c01b-41e1-8377-aae56b8d6493" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.878 182939 DEBUG nova.network.neutron [req-87e5c61d-4989-44e1-9f01-11f22b823ca8 req-f3911a61-1d18-44ca-856a-5b6ab70fdde0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Refreshing network info cache for port 443c4c41-63d8-47ff-a528-5bf1231445e4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.880 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Start _get_guest_xml network_info=[{"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.884 182939 WARNING nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.889 182939 DEBUG nova.virt.libvirt.host [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.889 182939 DEBUG nova.virt.libvirt.host [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.892 182939 DEBUG nova.virt.libvirt.host [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.892 182939 DEBUG nova.virt.libvirt.host [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.893 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.893 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.894 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.894 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.894 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.894 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.895 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.895 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.895 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.895 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.896 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.896 182939 DEBUG nova.virt.hardware [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.899 182939 DEBUG nova.virt.libvirt.vif [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1570775958',display_name='tempest-tempest.common.compute-instance-1570775958',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1570775958',id=94,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-84dzguso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-ServerActionsTestOtherA-1347085859-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:05:34Z,user_data=None,user_id='b4385295f46b45d8803b0c536a989822',uuid=a642c12c-c01b-41e1-8377-aae56b8d6493,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.900 182939 DEBUG nova.network.os_vif_util [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.900 182939 DEBUG nova.network.os_vif_util [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.901 182939 DEBUG nova.objects.instance [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'pci_devices' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.927 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:05:40 compute-0 nova_compute[182935]:   <uuid>a642c12c-c01b-41e1-8377-aae56b8d6493</uuid>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   <name>instance-0000005e</name>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <nova:name>tempest-tempest.common.compute-instance-1570775958</nova:name>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:05:40</nova:creationTime>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:05:40 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:05:40 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:05:40 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:05:40 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:05:40 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:05:40 compute-0 nova_compute[182935]:         <nova:user uuid="b4385295f46b45d8803b0c536a989822">tempest-ServerActionsTestOtherA-1347085859-project-member</nova:user>
Jan 22 00:05:40 compute-0 nova_compute[182935]:         <nova:project uuid="c299d482d37e45169cca3d6f178e8555">tempest-ServerActionsTestOtherA-1347085859</nova:project>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:05:40 compute-0 nova_compute[182935]:         <nova:port uuid="443c4c41-63d8-47ff-a528-5bf1231445e4">
Jan 22 00:05:40 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <system>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <entry name="serial">a642c12c-c01b-41e1-8377-aae56b8d6493</entry>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <entry name="uuid">a642c12c-c01b-41e1-8377-aae56b8d6493</entry>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     </system>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   <os>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   </os>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   <features>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   </features>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.config"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:a8:06:47"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <target dev="tap443c4c41-63"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/console.log" append="off"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <video>
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     </video>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:05:40 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:05:40 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:05:40 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:05:40 compute-0 nova_compute[182935]: </domain>
Jan 22 00:05:40 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.929 182939 DEBUG nova.compute.manager [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Preparing to wait for external event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.929 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.929 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.929 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.930 182939 DEBUG nova.virt.libvirt.vif [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1570775958',display_name='tempest-tempest.common.compute-instance-1570775958',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1570775958',id=94,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-84dzguso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-ServerActionsTestOtherA-1347085859-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:05:34Z,user_data=None,user_id='b4385295f46b45d8803b0c536a989822',uuid=a642c12c-c01b-41e1-8377-aae56b8d6493,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.930 182939 DEBUG nova.network.os_vif_util [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.931 182939 DEBUG nova.network.os_vif_util [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.931 182939 DEBUG os_vif [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.932 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.932 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.932 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.937 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.937 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap443c4c41-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.937 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap443c4c41-63, col_values=(('external_ids', {'iface-id': '443c4c41-63d8-47ff-a528-5bf1231445e4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:06:47', 'vm-uuid': 'a642c12c-c01b-41e1-8377-aae56b8d6493'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:40 compute-0 NetworkManager[55139]: <info>  [1769040340.9404] manager: (tap443c4c41-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.938 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.940 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.944 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.945 182939 INFO os_vif [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63')
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.991 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.992 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.992 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No VIF found with MAC fa:16:3e:a8:06:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:05:40 compute-0 nova_compute[182935]: 2026-01-22 00:05:40.993 182939 INFO nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Using config drive
Jan 22 00:05:41 compute-0 nova_compute[182935]: 2026-01-22 00:05:41.615 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.135 182939 INFO nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Creating config drive at /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.config
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.140 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwt0ktqb1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.158 182939 DEBUG nova.network.neutron [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Updating instance_info_cache with network_info: [{"id": "d0d81653-4a8c-419b-805e-125ec18decf4", "address": "fa:16:3e:36:b5:b3", "network": {"id": "e8d466d4-b930-4b2a-945c-c5093f17e6d0", "bridge": "br-int", "label": "tempest-ServersTestJSON-1326164589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d7a87111c83c49e3a84542174682a417", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0d81653-4a", "ovs_interfaceid": "d0d81653-4a8c-419b-805e-125ec18decf4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.185 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Releasing lock "refresh_cache-23a189a6-71c2-49b7-b4b9-57715325d51e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.185 182939 DEBUG nova.compute.manager [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Instance network_info: |[{"id": "d0d81653-4a8c-419b-805e-125ec18decf4", "address": "fa:16:3e:36:b5:b3", "network": {"id": "e8d466d4-b930-4b2a-945c-c5093f17e6d0", "bridge": "br-int", "label": "tempest-ServersTestJSON-1326164589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d7a87111c83c49e3a84542174682a417", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0d81653-4a", "ovs_interfaceid": "d0d81653-4a8c-419b-805e-125ec18decf4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.186 182939 DEBUG oslo_concurrency.lockutils [req-b8154cdb-1829-48a7-9c03-4f56b0e7c4bb req-bb5dc389-e9b4-40dc-af86-ffc1b309d120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-23a189a6-71c2-49b7-b4b9-57715325d51e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.186 182939 DEBUG nova.network.neutron [req-b8154cdb-1829-48a7-9c03-4f56b0e7c4bb req-bb5dc389-e9b4-40dc-af86-ffc1b309d120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Refreshing network info cache for port d0d81653-4a8c-419b-805e-125ec18decf4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.189 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Start _get_guest_xml network_info=[{"id": "d0d81653-4a8c-419b-805e-125ec18decf4", "address": "fa:16:3e:36:b5:b3", "network": {"id": "e8d466d4-b930-4b2a-945c-c5093f17e6d0", "bridge": "br-int", "label": "tempest-ServersTestJSON-1326164589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d7a87111c83c49e3a84542174682a417", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0d81653-4a", "ovs_interfaceid": "d0d81653-4a8c-419b-805e-125ec18decf4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.193 182939 WARNING nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.197 182939 DEBUG nova.virt.libvirt.host [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.198 182939 DEBUG nova.virt.libvirt.host [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.202 182939 DEBUG nova.virt.libvirt.host [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.202 182939 DEBUG nova.virt.libvirt.host [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.203 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.203 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.204 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.204 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.204 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.204 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.204 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.205 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.205 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.205 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.205 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.206 182939 DEBUG nova.virt.hardware [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.209 182939 DEBUG nova.virt.libvirt.vif [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-728743880',display_name='tempest-ServersTestJSON-server-728743880',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-728743880',id=93,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO/kUhKXFSNj7NeptId7iGdcxjhHpGJW6c6PRytdnz0bDXVaXwNNhY8ejpA8vOuxeTYOYcQTBzkrITjsoC6uhEN6N78wXPj0b3eqvUOxU8sfqyVKG+qOfQrrVsPJlnhlNQ==',key_name='tempest-keypair-1305968714',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d7a87111c83c49e3a84542174682a417',ramdisk_id='',reservation_id='r-1cc0fjnr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1366490185',owner_user_name='tempest-ServersTestJSON-1366490185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:05:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b8f4c36c45874f0cb983bb4c419457b9',uuid=23a189a6-71c2-49b7-b4b9-57715325d51e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d0d81653-4a8c-419b-805e-125ec18decf4", "address": "fa:16:3e:36:b5:b3", "network": {"id": "e8d466d4-b930-4b2a-945c-c5093f17e6d0", "bridge": "br-int", "label": "tempest-ServersTestJSON-1326164589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d7a87111c83c49e3a84542174682a417", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0d81653-4a", "ovs_interfaceid": "d0d81653-4a8c-419b-805e-125ec18decf4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.210 182939 DEBUG nova.network.os_vif_util [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Converting VIF {"id": "d0d81653-4a8c-419b-805e-125ec18decf4", "address": "fa:16:3e:36:b5:b3", "network": {"id": "e8d466d4-b930-4b2a-945c-c5093f17e6d0", "bridge": "br-int", "label": "tempest-ServersTestJSON-1326164589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d7a87111c83c49e3a84542174682a417", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0d81653-4a", "ovs_interfaceid": "d0d81653-4a8c-419b-805e-125ec18decf4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.210 182939 DEBUG nova.network.os_vif_util [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:b5:b3,bridge_name='br-int',has_traffic_filtering=True,id=d0d81653-4a8c-419b-805e-125ec18decf4,network=Network(e8d466d4-b930-4b2a-945c-c5093f17e6d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0d81653-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.211 182939 DEBUG nova.objects.instance [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lazy-loading 'pci_devices' on Instance uuid 23a189a6-71c2-49b7-b4b9-57715325d51e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.233 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:05:42 compute-0 nova_compute[182935]:   <uuid>23a189a6-71c2-49b7-b4b9-57715325d51e</uuid>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   <name>instance-0000005d</name>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersTestJSON-server-728743880</nova:name>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:05:42</nova:creationTime>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:05:42 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:05:42 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:05:42 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:05:42 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:05:42 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:05:42 compute-0 nova_compute[182935]:         <nova:user uuid="b8f4c36c45874f0cb983bb4c419457b9">tempest-ServersTestJSON-1366490185-project-member</nova:user>
Jan 22 00:05:42 compute-0 nova_compute[182935]:         <nova:project uuid="d7a87111c83c49e3a84542174682a417">tempest-ServersTestJSON-1366490185</nova:project>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:05:42 compute-0 nova_compute[182935]:         <nova:port uuid="d0d81653-4a8c-419b-805e-125ec18decf4">
Jan 22 00:05:42 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <system>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <entry name="serial">23a189a6-71c2-49b7-b4b9-57715325d51e</entry>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <entry name="uuid">23a189a6-71c2-49b7-b4b9-57715325d51e</entry>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     </system>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   <os>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   </os>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   <features>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   </features>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk.config"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:36:b5:b3"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <target dev="tapd0d81653-4a"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/console.log" append="off"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <video>
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     </video>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:05:42 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:05:42 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:05:42 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:05:42 compute-0 nova_compute[182935]: </domain>
Jan 22 00:05:42 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.234 182939 DEBUG nova.compute.manager [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Preparing to wait for external event network-vif-plugged-d0d81653-4a8c-419b-805e-125ec18decf4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.234 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Acquiring lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.235 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.235 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.236 182939 DEBUG nova.virt.libvirt.vif [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-728743880',display_name='tempest-ServersTestJSON-server-728743880',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-728743880',id=93,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO/kUhKXFSNj7NeptId7iGdcxjhHpGJW6c6PRytdnz0bDXVaXwNNhY8ejpA8vOuxeTYOYcQTBzkrITjsoC6uhEN6N78wXPj0b3eqvUOxU8sfqyVKG+qOfQrrVsPJlnhlNQ==',key_name='tempest-keypair-1305968714',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d7a87111c83c49e3a84542174682a417',ramdisk_id='',reservation_id='r-1cc0fjnr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-1366490185',owner_user_name='tempest-ServersTestJSON-1366490185-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:05:33Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b8f4c36c45874f0cb983bb4c419457b9',uuid=23a189a6-71c2-49b7-b4b9-57715325d51e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d0d81653-4a8c-419b-805e-125ec18decf4", "address": "fa:16:3e:36:b5:b3", "network": {"id": "e8d466d4-b930-4b2a-945c-c5093f17e6d0", "bridge": "br-int", "label": "tempest-ServersTestJSON-1326164589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d7a87111c83c49e3a84542174682a417", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0d81653-4a", "ovs_interfaceid": "d0d81653-4a8c-419b-805e-125ec18decf4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.236 182939 DEBUG nova.network.os_vif_util [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Converting VIF {"id": "d0d81653-4a8c-419b-805e-125ec18decf4", "address": "fa:16:3e:36:b5:b3", "network": {"id": "e8d466d4-b930-4b2a-945c-c5093f17e6d0", "bridge": "br-int", "label": "tempest-ServersTestJSON-1326164589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d7a87111c83c49e3a84542174682a417", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0d81653-4a", "ovs_interfaceid": "d0d81653-4a8c-419b-805e-125ec18decf4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.237 182939 DEBUG nova.network.os_vif_util [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:b5:b3,bridge_name='br-int',has_traffic_filtering=True,id=d0d81653-4a8c-419b-805e-125ec18decf4,network=Network(e8d466d4-b930-4b2a-945c-c5093f17e6d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0d81653-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.237 182939 DEBUG os_vif [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:b5:b3,bridge_name='br-int',has_traffic_filtering=True,id=d0d81653-4a8c-419b-805e-125ec18decf4,network=Network(e8d466d4-b930-4b2a-945c-c5093f17e6d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0d81653-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.237 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.238 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.238 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.240 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.240 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd0d81653-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.241 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd0d81653-4a, col_values=(('external_ids', {'iface-id': 'd0d81653-4a8c-419b-805e-125ec18decf4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:b5:b3', 'vm-uuid': '23a189a6-71c2-49b7-b4b9-57715325d51e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.242 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 NetworkManager[55139]: <info>  [1769040342.2439] manager: (tapd0d81653-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.245 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.249 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.250 182939 INFO os_vif [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:b5:b3,bridge_name='br-int',has_traffic_filtering=True,id=d0d81653-4a8c-419b-805e-125ec18decf4,network=Network(e8d466d4-b930-4b2a-945c-c5093f17e6d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0d81653-4a')
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.265 182939 DEBUG oslo_concurrency.processutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwt0ktqb1" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.315 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.316 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.316 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] No VIF found with MAC fa:16:3e:36:b5:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.316 182939 INFO nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Using config drive
Jan 22 00:05:42 compute-0 kernel: tap443c4c41-63: entered promiscuous mode
Jan 22 00:05:42 compute-0 NetworkManager[55139]: <info>  [1769040342.3284] manager: (tap443c4c41-63): new Tun device (/org/freedesktop/NetworkManager/Devices/166)
Jan 22 00:05:42 compute-0 ovn_controller[95047]: 2026-01-22T00:05:42Z|00368|binding|INFO|Claiming lport 443c4c41-63d8-47ff-a528-5bf1231445e4 for this chassis.
Jan 22 00:05:42 compute-0 ovn_controller[95047]: 2026-01-22T00:05:42Z|00369|binding|INFO|443c4c41-63d8-47ff-a528-5bf1231445e4: Claiming fa:16:3e:a8:06:47 10.100.0.13
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.334 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.348 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 NetworkManager[55139]: <info>  [1769040342.3548] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/167)
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.355 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 NetworkManager[55139]: <info>  [1769040342.3556] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Jan 22 00:05:42 compute-0 systemd-udevd[226668]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:05:42 compute-0 systemd-machined[154182]: New machine qemu-48-instance-0000005e.
Jan 22 00:05:42 compute-0 NetworkManager[55139]: <info>  [1769040342.3750] device (tap443c4c41-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:05:42 compute-0 NetworkManager[55139]: <info>  [1769040342.3758] device (tap443c4c41-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.376 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:06:47 10.100.0.13'], port_security=['fa:16:3e:a8:06:47 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a642c12c-c01b-41e1-8377-aae56b8d6493', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c299d482d37e45169cca3d6f178e8555', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0559740a-e5c9-4749-8c84-0b7852b94df4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=47fc8aa5-cd00-4c23-8e55-87bda0bbf0d4, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=443c4c41-63d8-47ff-a528-5bf1231445e4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.378 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 443c4c41-63d8-47ff-a528-5bf1231445e4 in datapath b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 bound to our chassis
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.379 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.392 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c1703e-9988-4513-b45a-10fb75554e58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.393 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb3dacae7-b1 in ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.396 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb3dacae7-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.396 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[31201245-6505-439b-a96b-3f297a469043]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.397 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ff02c9-4196-4c6a-89c9-cbc33265a290]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 systemd[1]: Started Virtual Machine qemu-48-instance-0000005e.
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.411 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd84d14-8a24-403d-8ebf-d14501b34c82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.462 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7260e12f-b272-4d9d-95d2-9ae60061b5e3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.492 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[df0ebd44-ad8e-444c-b123-8f26294e6fb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 NetworkManager[55139]: <info>  [1769040342.5235] manager: (tapb3dacae7-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/169)
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.522 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[eb4cd771-32ab-4179-8168-b9d989405d55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.544 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.556 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[86b90ac0-320b-41e4-bc46-2b3e7dfdda78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.560 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c3525a-1078-49ec-a2d3-a058e63299d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.567 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 ovn_controller[95047]: 2026-01-22T00:05:42Z|00370|binding|INFO|Setting lport 443c4c41-63d8-47ff-a528-5bf1231445e4 ovn-installed in OVS
Jan 22 00:05:42 compute-0 ovn_controller[95047]: 2026-01-22T00:05:42Z|00371|binding|INFO|Setting lport 443c4c41-63d8-47ff-a528-5bf1231445e4 up in Southbound
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.576 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 NetworkManager[55139]: <info>  [1769040342.5852] device (tapb3dacae7-b0): carrier: link connected
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.592 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9f7da2e3-cae5-4d8e-a5da-d9cf978cf37e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.609 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[496a89d4-c817-4a26-b79b-514f6e9f0067]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3dacae7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:f1:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484532, 'reachable_time': 38103, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226706, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.625 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[54997cb2-a5de-4ed0-90ea-f5413751eab8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8b:f1ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 484532, 'tstamp': 484532}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226707, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.643 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[30af7258-aca0-4666-8a2f-c418ab9d6fbc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3dacae7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:f1:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484532, 'reachable_time': 38103, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226708, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.673 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2a1f2453-d47c-4daa-83b2-55d9d2220027]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.734 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040342.733978, a642c12c-c01b-41e1-8377-aae56b8d6493 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.735 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] VM Started (Lifecycle Event)
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.735 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c0394be9-284b-4ed8-bae3-b04f93a30324]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.737 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3dacae7-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.737 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.737 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3dacae7-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.739 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 NetworkManager[55139]: <info>  [1769040342.7403] manager: (tapb3dacae7-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Jan 22 00:05:42 compute-0 kernel: tapb3dacae7-b0: entered promiscuous mode
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.743 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.744 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb3dacae7-b0, col_values=(('external_ids', {'iface-id': '90cfb65b-4764-45c8-aca6-274b0a687241'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.745 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 ovn_controller[95047]: 2026-01-22T00:05:42Z|00372|binding|INFO|Releasing lport 90cfb65b-4764-45c8-aca6-274b0a687241 from this chassis (sb_readonly=0)
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.757 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.760 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.761 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.762 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[76017e6c-6558-4671-94bf-217beff472ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.762 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.763 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:05:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:42.763 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'env', 'PROCESS_TAG=haproxy-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.765 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040342.7342098, a642c12c-c01b-41e1-8377-aae56b8d6493 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.765 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] VM Paused (Lifecycle Event)
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.806 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.810 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.837 182939 DEBUG nova.compute.manager [req-e1b89020-88bf-4728-baeb-2b3900859cba req-33221307-b989-499e-b17e-c1b7d5ff3b4b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.837 182939 DEBUG oslo_concurrency.lockutils [req-e1b89020-88bf-4728-baeb-2b3900859cba req-33221307-b989-499e-b17e-c1b7d5ff3b4b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.838 182939 DEBUG oslo_concurrency.lockutils [req-e1b89020-88bf-4728-baeb-2b3900859cba req-33221307-b989-499e-b17e-c1b7d5ff3b4b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.838 182939 DEBUG oslo_concurrency.lockutils [req-e1b89020-88bf-4728-baeb-2b3900859cba req-33221307-b989-499e-b17e-c1b7d5ff3b4b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.838 182939 DEBUG nova.compute.manager [req-e1b89020-88bf-4728-baeb-2b3900859cba req-33221307-b989-499e-b17e-c1b7d5ff3b4b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Processing event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.839 182939 DEBUG nova.compute.manager [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.840 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.842 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040342.8426058, a642c12c-c01b-41e1-8377-aae56b8d6493 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.843 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] VM Resumed (Lifecycle Event)
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.849 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.852 182939 INFO nova.virt.libvirt.driver [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance spawned successfully.
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.852 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.879 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.883 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.884 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.884 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.885 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.885 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.885 182939 DEBUG nova.virt.libvirt.driver [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.893 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.926 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.967 182939 INFO nova.compute.manager [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Took 8.71 seconds to spawn the instance on the hypervisor.
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.968 182939 DEBUG nova.compute.manager [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:42 compute-0 nova_compute[182935]: 2026-01-22 00:05:42.996 182939 INFO nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Creating config drive at /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk.config
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.002 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgtcnc5jg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.081 182939 INFO nova.compute.manager [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Took 9.51 seconds to build instance.
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.104 182939 DEBUG oslo_concurrency.lockutils [None req-61eeda84-9409-4aec-8d0d-a9a72eb7d326 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:43 compute-0 podman[226755]: 2026-01-22 00:05:43.127263845 +0000 UTC m=+0.047220104 container create ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.130 182939 DEBUG oslo_concurrency.processutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgtcnc5jg" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:05:43 compute-0 systemd[1]: Started libpod-conmon-ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e.scope.
Jan 22 00:05:43 compute-0 podman[226755]: 2026-01-22 00:05:43.103082829 +0000 UTC m=+0.023039108 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:05:43 compute-0 kernel: tapd0d81653-4a: entered promiscuous mode
Jan 22 00:05:43 compute-0 NetworkManager[55139]: <info>  [1769040343.2062] manager: (tapd0d81653-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/171)
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.207 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:43 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:05:43 compute-0 systemd-udevd[226698]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:05:43 compute-0 ovn_controller[95047]: 2026-01-22T00:05:43Z|00373|binding|INFO|Claiming lport d0d81653-4a8c-419b-805e-125ec18decf4 for this chassis.
Jan 22 00:05:43 compute-0 ovn_controller[95047]: 2026-01-22T00:05:43Z|00374|binding|INFO|d0d81653-4a8c-419b-805e-125ec18decf4: Claiming fa:16:3e:36:b5:b3 10.100.0.12
Jan 22 00:05:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed58bb533acd243441baea87a0a38deee388572468f5c425c1579d55dd492f0d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:05:43 compute-0 NetworkManager[55139]: <info>  [1769040343.2219] device (tapd0d81653-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:05:43 compute-0 NetworkManager[55139]: <info>  [1769040343.2231] device (tapd0d81653-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:05:43 compute-0 ovn_controller[95047]: 2026-01-22T00:05:43Z|00375|binding|INFO|Setting lport d0d81653-4a8c-419b-805e-125ec18decf4 ovn-installed in OVS
Jan 22 00:05:43 compute-0 ovn_controller[95047]: 2026-01-22T00:05:43Z|00376|binding|INFO|Setting lport d0d81653-4a8c-419b-805e-125ec18decf4 up in Southbound
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.224 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:b5:b3 10.100.0.12'], port_security=['fa:16:3e:36:b5:b3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '23a189a6-71c2-49b7-b4b9-57715325d51e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8d466d4-b930-4b2a-945c-c5093f17e6d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd7a87111c83c49e3a84542174682a417', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0f63dc10-b221-41ef-af36-8fe82532c693', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e66bfa38-6a10-4f04-ae82-4579b8788b23, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=d0d81653-4a8c-419b-805e-125ec18decf4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.225 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.227 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:43 compute-0 podman[226755]: 2026-01-22 00:05:43.235703945 +0000 UTC m=+0.155660224 container init ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:05:43 compute-0 podman[226755]: 2026-01-22 00:05:43.241726952 +0000 UTC m=+0.161683211 container start ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 00:05:43 compute-0 systemd-machined[154182]: New machine qemu-49-instance-0000005d.
Jan 22 00:05:43 compute-0 systemd[1]: Started Virtual Machine qemu-49-instance-0000005d.
Jan 22 00:05:43 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[226776]: [NOTICE]   (226790) : New worker (226792) forked
Jan 22 00:05:43 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[226776]: [NOTICE]   (226790) : Loading success.
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.306 104408 INFO neutron.agent.ovn.metadata.agent [-] Port d0d81653-4a8c-419b-805e-125ec18decf4 in datapath e8d466d4-b930-4b2a-945c-c5093f17e6d0 unbound from our chassis
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.308 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e8d466d4-b930-4b2a-945c-c5093f17e6d0
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.320 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d795b251-34f3-4488-9e9e-e1ede4a9739e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.322 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape8d466d4-b1 in ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.324 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape8d466d4-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.325 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ea0d24-0767-40eb-bb95-13477335888c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.325 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0c9e8bb5-a76e-4492-ba59-2bd5051f10a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.341 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ac78d0-c371-48e7-aa83-95d482f3b1db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.357 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2d2765f4-9582-4b18-b250-dc2d3a82ddf4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.391 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[cb40a468-6841-4785-bfeb-920e83aacba9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 NetworkManager[55139]: <info>  [1769040343.4011] manager: (tape8d466d4-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/172)
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.400 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5b24f34e-078a-474b-8d4f-00563952b7ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.438 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[11e1179d-19c4-4099-bb4b-26acb7908ee4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.442 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[71b4e1d3-2dec-4231-bd0b-d6b080cbe674]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 NetworkManager[55139]: <info>  [1769040343.4688] device (tape8d466d4-b0): carrier: link connected
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.477 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[910c9f65-5883-491f-b3d2-81f6e9d0f322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.505 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[de3d174f-9f7f-4144-a74f-864e946b9830]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8d466d4-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b4:c3:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484620, 'reachable_time': 20133, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226827, 'error': None, 'target': 'ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.526 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[757a75e1-e074-46de-91a3-a7e0036c41ca]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb4:c356'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 484620, 'tstamp': 484620}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226828, 'error': None, 'target': 'ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.530 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040343.529247, 23a189a6-71c2-49b7-b4b9-57715325d51e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.531 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] VM Started (Lifecycle Event)
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.545 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9a256285-4e15-487a-bcfd-a2c6ff95f075]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8d466d4-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b4:c3:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 117], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484620, 'reachable_time': 20133, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226829, 'error': None, 'target': 'ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.561 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.566 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040343.5302095, 23a189a6-71c2-49b7-b4b9-57715325d51e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.567 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] VM Paused (Lifecycle Event)
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.577 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d82e1663-d3d9-479e-9082-8cb78bfb043b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.607 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.612 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.643 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.645 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8d8b3597-3195-42d5-9454-d64eaf89d241]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.646 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8d466d4-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.647 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.647 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape8d466d4-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.688 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:43 compute-0 NetworkManager[55139]: <info>  [1769040343.6892] manager: (tape8d466d4-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Jan 22 00:05:43 compute-0 kernel: tape8d466d4-b0: entered promiscuous mode
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.693 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape8d466d4-b0, col_values=(('external_ids', {'iface-id': '6c86fb1f-92c6-4b86-b1d7-ff8fba065ce2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:43 compute-0 ovn_controller[95047]: 2026-01-22T00:05:43Z|00377|binding|INFO|Releasing lport 6c86fb1f-92c6-4b86-b1d7-ff8fba065ce2 from this chassis (sb_readonly=0)
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.695 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.696 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e8d466d4-b930-4b2a-945c-c5093f17e6d0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e8d466d4-b930-4b2a-945c-c5093f17e6d0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.696 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fb7b0ef0-1b8d-4796-b9d0-cc6f833509fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.697 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-e8d466d4-b930-4b2a-945c-c5093f17e6d0
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/e8d466d4-b930-4b2a-945c-c5093f17e6d0.pid.haproxy
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID e8d466d4-b930-4b2a-945c-c5093f17e6d0
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:05:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:43.698 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0', 'env', 'PROCESS_TAG=haproxy-e8d466d4-b930-4b2a-945c-c5093f17e6d0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e8d466d4-b930-4b2a-945c-c5093f17e6d0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.707 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.777 182939 DEBUG nova.compute.manager [req-0b747406-14b1-4125-9750-a91dcfb4bf64 req-da2dcbb0-0582-429d-8434-40f65462c0fb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Received event network-vif-plugged-d0d81653-4a8c-419b-805e-125ec18decf4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.778 182939 DEBUG oslo_concurrency.lockutils [req-0b747406-14b1-4125-9750-a91dcfb4bf64 req-da2dcbb0-0582-429d-8434-40f65462c0fb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.778 182939 DEBUG oslo_concurrency.lockutils [req-0b747406-14b1-4125-9750-a91dcfb4bf64 req-da2dcbb0-0582-429d-8434-40f65462c0fb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.778 182939 DEBUG oslo_concurrency.lockutils [req-0b747406-14b1-4125-9750-a91dcfb4bf64 req-da2dcbb0-0582-429d-8434-40f65462c0fb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.779 182939 DEBUG nova.compute.manager [req-0b747406-14b1-4125-9750-a91dcfb4bf64 req-da2dcbb0-0582-429d-8434-40f65462c0fb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Processing event network-vif-plugged-d0d81653-4a8c-419b-805e-125ec18decf4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.779 182939 DEBUG nova.compute.manager [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.792 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040343.7914832, 23a189a6-71c2-49b7-b4b9-57715325d51e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.792 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] VM Resumed (Lifecycle Event)
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.794 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.798 182939 INFO nova.virt.libvirt.driver [-] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Instance spawned successfully.
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.798 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.818 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.821 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.831 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.832 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.832 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.832 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.833 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.833 182939 DEBUG nova.virt.libvirt.driver [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.845 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.964 182939 INFO nova.compute.manager [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Took 9.98 seconds to spawn the instance on the hypervisor.
Jan 22 00:05:43 compute-0 nova_compute[182935]: 2026-01-22 00:05:43.966 182939 DEBUG nova.compute.manager [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:44 compute-0 nova_compute[182935]: 2026-01-22 00:05:44.062 182939 DEBUG nova.network.neutron [req-87e5c61d-4989-44e1-9f01-11f22b823ca8 req-f3911a61-1d18-44ca-856a-5b6ab70fdde0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Updated VIF entry in instance network info cache for port 443c4c41-63d8-47ff-a528-5bf1231445e4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:05:44 compute-0 nova_compute[182935]: 2026-01-22 00:05:44.064 182939 DEBUG nova.network.neutron [req-87e5c61d-4989-44e1-9f01-11f22b823ca8 req-f3911a61-1d18-44ca-856a-5b6ab70fdde0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Updating instance_info_cache with network_info: [{"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:05:44 compute-0 nova_compute[182935]: 2026-01-22 00:05:44.068 182939 INFO nova.compute.manager [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Took 10.69 seconds to build instance.
Jan 22 00:05:44 compute-0 podman[226861]: 2026-01-22 00:05:44.079293386 +0000 UTC m=+0.050344420 container create edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:05:44 compute-0 nova_compute[182935]: 2026-01-22 00:05:44.094 182939 DEBUG oslo_concurrency.lockutils [req-87e5c61d-4989-44e1-9f01-11f22b823ca8 req-f3911a61-1d18-44ca-856a-5b6ab70fdde0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-a642c12c-c01b-41e1-8377-aae56b8d6493" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:05:44 compute-0 nova_compute[182935]: 2026-01-22 00:05:44.101 182939 DEBUG oslo_concurrency.lockutils [None req-50166248-fa22-46a3-9871-13304a363b5a b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:44 compute-0 systemd[1]: Started libpod-conmon-edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7.scope.
Jan 22 00:05:44 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:05:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13c5401daba3ce99b876b6838cce3a644dd80fbfe49b717b97bf3d28ea6fe671/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:05:44 compute-0 podman[226861]: 2026-01-22 00:05:44.054541456 +0000 UTC m=+0.025592520 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:05:44 compute-0 podman[226861]: 2026-01-22 00:05:44.157384358 +0000 UTC m=+0.128435402 container init edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 00:05:44 compute-0 podman[226861]: 2026-01-22 00:05:44.163774926 +0000 UTC m=+0.134825960 container start edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:05:44 compute-0 neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0[226876]: [NOTICE]   (226880) : New worker (226882) forked
Jan 22 00:05:44 compute-0 neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0[226876]: [NOTICE]   (226880) : Loading success.
Jan 22 00:05:44 compute-0 nova_compute[182935]: 2026-01-22 00:05:44.260 182939 DEBUG nova.network.neutron [req-b8154cdb-1829-48a7-9c03-4f56b0e7c4bb req-bb5dc389-e9b4-40dc-af86-ffc1b309d120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Updated VIF entry in instance network info cache for port d0d81653-4a8c-419b-805e-125ec18decf4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:05:44 compute-0 nova_compute[182935]: 2026-01-22 00:05:44.261 182939 DEBUG nova.network.neutron [req-b8154cdb-1829-48a7-9c03-4f56b0e7c4bb req-bb5dc389-e9b4-40dc-af86-ffc1b309d120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Updating instance_info_cache with network_info: [{"id": "d0d81653-4a8c-419b-805e-125ec18decf4", "address": "fa:16:3e:36:b5:b3", "network": {"id": "e8d466d4-b930-4b2a-945c-c5093f17e6d0", "bridge": "br-int", "label": "tempest-ServersTestJSON-1326164589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d7a87111c83c49e3a84542174682a417", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0d81653-4a", "ovs_interfaceid": "d0d81653-4a8c-419b-805e-125ec18decf4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:05:44 compute-0 nova_compute[182935]: 2026-01-22 00:05:44.288 182939 DEBUG oslo_concurrency.lockutils [req-b8154cdb-1829-48a7-9c03-4f56b0e7c4bb req-bb5dc389-e9b4-40dc-af86-ffc1b309d120 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-23a189a6-71c2-49b7-b4b9-57715325d51e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:05:44 compute-0 nova_compute[182935]: 2026-01-22 00:05:44.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:45 compute-0 nova_compute[182935]: 2026-01-22 00:05:45.009 182939 DEBUG nova.compute.manager [req-5f21a626-f2d3-477c-994e-52aa09331406 req-a30c0e28-65f3-44f5-a84d-494e63a5cbb4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:45 compute-0 nova_compute[182935]: 2026-01-22 00:05:45.009 182939 DEBUG oslo_concurrency.lockutils [req-5f21a626-f2d3-477c-994e-52aa09331406 req-a30c0e28-65f3-44f5-a84d-494e63a5cbb4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:45 compute-0 nova_compute[182935]: 2026-01-22 00:05:45.009 182939 DEBUG oslo_concurrency.lockutils [req-5f21a626-f2d3-477c-994e-52aa09331406 req-a30c0e28-65f3-44f5-a84d-494e63a5cbb4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:45 compute-0 nova_compute[182935]: 2026-01-22 00:05:45.010 182939 DEBUG oslo_concurrency.lockutils [req-5f21a626-f2d3-477c-994e-52aa09331406 req-a30c0e28-65f3-44f5-a84d-494e63a5cbb4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:45 compute-0 nova_compute[182935]: 2026-01-22 00:05:45.010 182939 DEBUG nova.compute.manager [req-5f21a626-f2d3-477c-994e-52aa09331406 req-a30c0e28-65f3-44f5-a84d-494e63a5cbb4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] No waiting events found dispatching network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:05:45 compute-0 nova_compute[182935]: 2026-01-22 00:05:45.010 182939 WARNING nova.compute.manager [req-5f21a626-f2d3-477c-994e-52aa09331406 req-a30c0e28-65f3-44f5-a84d-494e63a5cbb4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received unexpected event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 for instance with vm_state active and task_state None.
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.121 182939 DEBUG oslo_concurrency.lockutils [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.122 182939 DEBUG oslo_concurrency.lockutils [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.122 182939 DEBUG nova.compute.manager [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.127 182939 DEBUG nova.compute.manager [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.128 182939 DEBUG nova.objects.instance [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'flavor' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.176 182939 DEBUG nova.objects.instance [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'info_cache' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.208 182939 DEBUG nova.virt.libvirt.driver [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.234 182939 DEBUG nova.compute.manager [req-b346fcc0-b383-4252-ad4e-21f69f7e59bb req-5433bf72-0209-43cf-a0e4-fe71d9793f80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Received event network-vif-plugged-d0d81653-4a8c-419b-805e-125ec18decf4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.234 182939 DEBUG oslo_concurrency.lockutils [req-b346fcc0-b383-4252-ad4e-21f69f7e59bb req-5433bf72-0209-43cf-a0e4-fe71d9793f80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.235 182939 DEBUG oslo_concurrency.lockutils [req-b346fcc0-b383-4252-ad4e-21f69f7e59bb req-5433bf72-0209-43cf-a0e4-fe71d9793f80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.235 182939 DEBUG oslo_concurrency.lockutils [req-b346fcc0-b383-4252-ad4e-21f69f7e59bb req-5433bf72-0209-43cf-a0e4-fe71d9793f80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.235 182939 DEBUG nova.compute.manager [req-b346fcc0-b383-4252-ad4e-21f69f7e59bb req-5433bf72-0209-43cf-a0e4-fe71d9793f80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] No waiting events found dispatching network-vif-plugged-d0d81653-4a8c-419b-805e-125ec18decf4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.235 182939 WARNING nova.compute.manager [req-b346fcc0-b383-4252-ad4e-21f69f7e59bb req-5433bf72-0209-43cf-a0e4-fe71d9793f80 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Received unexpected event network-vif-plugged-d0d81653-4a8c-419b-805e-125ec18decf4 for instance with vm_state active and task_state None.
Jan 22 00:05:46 compute-0 nova_compute[182935]: 2026-01-22 00:05:46.618 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:47 compute-0 nova_compute[182935]: 2026-01-22 00:05:47.243 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:47 compute-0 sshd-session[226891]: Invalid user svn from 188.166.69.60 port 54446
Jan 22 00:05:47 compute-0 sshd-session[226891]: Connection closed by invalid user svn 188.166.69.60 port 54446 [preauth]
Jan 22 00:05:48 compute-0 podman[226894]: 2026-01-22 00:05:48.699234919 +0000 UTC m=+0.067891802 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:05:48 compute-0 podman[226893]: 2026-01-22 00:05:48.760940769 +0000 UTC m=+0.115954407 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:05:49 compute-0 nova_compute[182935]: 2026-01-22 00:05:49.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:05:50 compute-0 nova_compute[182935]: 2026-01-22 00:05:50.401 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:50 compute-0 nova_compute[182935]: 2026-01-22 00:05:50.666 182939 DEBUG nova.compute.manager [req-5b7d9a7f-ebae-48eb-8509-046f16733921 req-1e952111-7bf1-4f99-9753-1945c860d93e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Received event network-changed-d0d81653-4a8c-419b-805e-125ec18decf4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:50 compute-0 nova_compute[182935]: 2026-01-22 00:05:50.666 182939 DEBUG nova.compute.manager [req-5b7d9a7f-ebae-48eb-8509-046f16733921 req-1e952111-7bf1-4f99-9753-1945c860d93e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Refreshing instance network info cache due to event network-changed-d0d81653-4a8c-419b-805e-125ec18decf4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:05:50 compute-0 nova_compute[182935]: 2026-01-22 00:05:50.667 182939 DEBUG oslo_concurrency.lockutils [req-5b7d9a7f-ebae-48eb-8509-046f16733921 req-1e952111-7bf1-4f99-9753-1945c860d93e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-23a189a6-71c2-49b7-b4b9-57715325d51e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:05:50 compute-0 nova_compute[182935]: 2026-01-22 00:05:50.667 182939 DEBUG oslo_concurrency.lockutils [req-5b7d9a7f-ebae-48eb-8509-046f16733921 req-1e952111-7bf1-4f99-9753-1945c860d93e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-23a189a6-71c2-49b7-b4b9-57715325d51e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:05:50 compute-0 nova_compute[182935]: 2026-01-22 00:05:50.667 182939 DEBUG nova.network.neutron [req-5b7d9a7f-ebae-48eb-8509-046f16733921 req-1e952111-7bf1-4f99-9753-1945c860d93e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Refreshing network info cache for port d0d81653-4a8c-419b-805e-125ec18decf4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:05:51 compute-0 nova_compute[182935]: 2026-01-22 00:05:51.622 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:52 compute-0 nova_compute[182935]: 2026-01-22 00:05:52.246 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:55 compute-0 ovn_controller[95047]: 2026-01-22T00:05:55Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a8:06:47 10.100.0.13
Jan 22 00:05:55 compute-0 ovn_controller[95047]: 2026-01-22T00:05:55Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a8:06:47 10.100.0.13
Jan 22 00:05:55 compute-0 nova_compute[182935]: 2026-01-22 00:05:55.987 182939 DEBUG nova.network.neutron [req-5b7d9a7f-ebae-48eb-8509-046f16733921 req-1e952111-7bf1-4f99-9753-1945c860d93e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Updated VIF entry in instance network info cache for port d0d81653-4a8c-419b-805e-125ec18decf4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:05:55 compute-0 nova_compute[182935]: 2026-01-22 00:05:55.988 182939 DEBUG nova.network.neutron [req-5b7d9a7f-ebae-48eb-8509-046f16733921 req-1e952111-7bf1-4f99-9753-1945c860d93e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Updating instance_info_cache with network_info: [{"id": "d0d81653-4a8c-419b-805e-125ec18decf4", "address": "fa:16:3e:36:b5:b3", "network": {"id": "e8d466d4-b930-4b2a-945c-c5093f17e6d0", "bridge": "br-int", "label": "tempest-ServersTestJSON-1326164589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d7a87111c83c49e3a84542174682a417", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0d81653-4a", "ovs_interfaceid": "d0d81653-4a8c-419b-805e-125ec18decf4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:05:56 compute-0 nova_compute[182935]: 2026-01-22 00:05:56.014 182939 DEBUG oslo_concurrency.lockutils [req-5b7d9a7f-ebae-48eb-8509-046f16733921 req-1e952111-7bf1-4f99-9753-1945c860d93e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-23a189a6-71c2-49b7-b4b9-57715325d51e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:05:56 compute-0 nova_compute[182935]: 2026-01-22 00:05:56.257 182939 DEBUG nova.virt.libvirt.driver [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:05:56 compute-0 nova_compute[182935]: 2026-01-22 00:05:56.625 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:56 compute-0 podman[226978]: 2026-01-22 00:05:56.691079638 +0000 UTC m=+0.062723285 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:05:57 compute-0 nova_compute[182935]: 2026-01-22 00:05:57.248 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:57 compute-0 ovn_controller[95047]: 2026-01-22T00:05:57Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:36:b5:b3 10.100.0.12
Jan 22 00:05:57 compute-0 ovn_controller[95047]: 2026-01-22T00:05:57Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:36:b5:b3 10.100.0.12
Jan 22 00:05:58 compute-0 kernel: tap443c4c41-63 (unregistering): left promiscuous mode
Jan 22 00:05:58 compute-0 NetworkManager[55139]: <info>  [1769040358.6296] device (tap443c4c41-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:05:58 compute-0 ovn_controller[95047]: 2026-01-22T00:05:58Z|00378|binding|INFO|Releasing lport 443c4c41-63d8-47ff-a528-5bf1231445e4 from this chassis (sb_readonly=0)
Jan 22 00:05:58 compute-0 ovn_controller[95047]: 2026-01-22T00:05:58Z|00379|binding|INFO|Setting lport 443c4c41-63d8-47ff-a528-5bf1231445e4 down in Southbound
Jan 22 00:05:58 compute-0 ovn_controller[95047]: 2026-01-22T00:05:58Z|00380|binding|INFO|Removing iface tap443c4c41-63 ovn-installed in OVS
Jan 22 00:05:58 compute-0 nova_compute[182935]: 2026-01-22 00:05:58.661 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:58 compute-0 nova_compute[182935]: 2026-01-22 00:05:58.664 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:58 compute-0 nova_compute[182935]: 2026-01-22 00:05:58.675 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:58 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Jan 22 00:05:58 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000005e.scope: Consumed 13.688s CPU time.
Jan 22 00:05:58 compute-0 systemd-machined[154182]: Machine qemu-48-instance-0000005e terminated.
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.114 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:06:47 10.100.0.13'], port_security=['fa:16:3e:a8:06:47 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a642c12c-c01b-41e1-8377-aae56b8d6493', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c299d482d37e45169cca3d6f178e8555', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0559740a-e5c9-4749-8c84-0b7852b94df4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=47fc8aa5-cd00-4c23-8e55-87bda0bbf0d4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=443c4c41-63d8-47ff-a528-5bf1231445e4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.116 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 443c4c41-63d8-47ff-a528-5bf1231445e4 in datapath b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 unbound from our chassis
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.118 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.120 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[59460362-d6a0-4212-b7d0-4705056685c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.120 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 namespace which is not needed anymore
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.272 182939 INFO nova.virt.libvirt.driver [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance shutdown successfully after 13 seconds.
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.281 182939 INFO nova.virt.libvirt.driver [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance destroyed successfully.
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.282 182939 DEBUG nova.objects.instance [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'numa_topology' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:05:59 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[226776]: [NOTICE]   (226790) : haproxy version is 2.8.14-c23fe91
Jan 22 00:05:59 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[226776]: [NOTICE]   (226790) : path to executable is /usr/sbin/haproxy
Jan 22 00:05:59 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[226776]: [WARNING]  (226790) : Exiting Master process...
Jan 22 00:05:59 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[226776]: [ALERT]    (226790) : Current worker (226792) exited with code 143 (Terminated)
Jan 22 00:05:59 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[226776]: [WARNING]  (226790) : All workers exited. Exiting... (0)
Jan 22 00:05:59 compute-0 systemd[1]: libpod-ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e.scope: Deactivated successfully.
Jan 22 00:05:59 compute-0 podman[227041]: 2026-01-22 00:05:59.335297437 +0000 UTC m=+0.061385353 container died ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.369 182939 DEBUG nova.compute.manager [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:05:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e-userdata-shm.mount: Deactivated successfully.
Jan 22 00:05:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-ed58bb533acd243441baea87a0a38deee388572468f5c425c1579d55dd492f0d-merged.mount: Deactivated successfully.
Jan 22 00:05:59 compute-0 podman[227041]: 2026-01-22 00:05:59.399460346 +0000 UTC m=+0.125548222 container cleanup ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:05:59 compute-0 systemd[1]: libpod-conmon-ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e.scope: Deactivated successfully.
Jan 22 00:05:59 compute-0 podman[227072]: 2026-01-22 00:05:59.502831791 +0000 UTC m=+0.069336238 container remove ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.505 182939 DEBUG oslo_concurrency.lockutils [None req-59a69c3f-53ec-408a-b54b-09dfa277c37a b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.512 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0feb41af-fd90-4635-88bc-976ee2218dd9]: (4, ('Thu Jan 22 12:05:59 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 (ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e)\nff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e\nThu Jan 22 12:05:59 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 (ff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e)\nff9063102c93e36b1a3c70c7cc223702a6843afa66015b00711698ff5c11305e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.515 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0f06842a-5533-4bfa-8092-26370787a751]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.517 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3dacae7-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.519 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:59 compute-0 kernel: tapb3dacae7-b0: left promiscuous mode
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.540 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1f1a1807-622e-4e85-b63f-602f3c88a56b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.563 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[76942eb3-0d27-4b66-8ade-0d6c7e264789]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.565 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[01405224-0dc2-4c9b-bc44-ae1a58c2216d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.594 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fa840120-c728-42eb-aea5-266e0703959c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484522, 'reachable_time': 20737, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227091, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.598 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:05:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:05:59.599 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[8b1b45e7-44a2-4c49-887d-f0eb8909d31f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:05:59 compute-0 systemd[1]: run-netns-ovnmeta\x2db3dacae7\x2db9cd\x2d426c\x2daa4a\x2d3a6b971c7ee5.mount: Deactivated successfully.
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.755 182939 DEBUG nova.compute.manager [req-7ac21dac-cb78-443b-b90d-2e3a43c33425 req-dbd711c6-db40-4001-a1de-9e27a3b48688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received event network-vif-unplugged-443c4c41-63d8-47ff-a528-5bf1231445e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.755 182939 DEBUG oslo_concurrency.lockutils [req-7ac21dac-cb78-443b-b90d-2e3a43c33425 req-dbd711c6-db40-4001-a1de-9e27a3b48688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.756 182939 DEBUG oslo_concurrency.lockutils [req-7ac21dac-cb78-443b-b90d-2e3a43c33425 req-dbd711c6-db40-4001-a1de-9e27a3b48688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.757 182939 DEBUG oslo_concurrency.lockutils [req-7ac21dac-cb78-443b-b90d-2e3a43c33425 req-dbd711c6-db40-4001-a1de-9e27a3b48688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.757 182939 DEBUG nova.compute.manager [req-7ac21dac-cb78-443b-b90d-2e3a43c33425 req-dbd711c6-db40-4001-a1de-9e27a3b48688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] No waiting events found dispatching network-vif-unplugged-443c4c41-63d8-47ff-a528-5bf1231445e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:05:59 compute-0 nova_compute[182935]: 2026-01-22 00:05:59.758 182939 WARNING nova.compute.manager [req-7ac21dac-cb78-443b-b90d-2e3a43c33425 req-dbd711c6-db40-4001-a1de-9e27a3b48688 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received unexpected event network-vif-unplugged-443c4c41-63d8-47ff-a528-5bf1231445e4 for instance with vm_state stopped and task_state None.
Jan 22 00:06:01 compute-0 nova_compute[182935]: 2026-01-22 00:06:01.626 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:01 compute-0 podman[227092]: 2026-01-22 00:06:01.750396072 +0000 UTC m=+0.105562620 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:06:01 compute-0 nova_compute[182935]: 2026-01-22 00:06:01.887 182939 DEBUG nova.compute.manager [req-b90b0c4a-21ca-4858-a5e5-411b846d9159 req-79879fe7-892a-4b61-8116-d383665ccaf4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:01 compute-0 nova_compute[182935]: 2026-01-22 00:06:01.887 182939 DEBUG oslo_concurrency.lockutils [req-b90b0c4a-21ca-4858-a5e5-411b846d9159 req-79879fe7-892a-4b61-8116-d383665ccaf4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:01 compute-0 nova_compute[182935]: 2026-01-22 00:06:01.888 182939 DEBUG oslo_concurrency.lockutils [req-b90b0c4a-21ca-4858-a5e5-411b846d9159 req-79879fe7-892a-4b61-8116-d383665ccaf4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:01 compute-0 nova_compute[182935]: 2026-01-22 00:06:01.888 182939 DEBUG oslo_concurrency.lockutils [req-b90b0c4a-21ca-4858-a5e5-411b846d9159 req-79879fe7-892a-4b61-8116-d383665ccaf4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:01 compute-0 nova_compute[182935]: 2026-01-22 00:06:01.888 182939 DEBUG nova.compute.manager [req-b90b0c4a-21ca-4858-a5e5-411b846d9159 req-79879fe7-892a-4b61-8116-d383665ccaf4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] No waiting events found dispatching network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:01 compute-0 nova_compute[182935]: 2026-01-22 00:06:01.888 182939 WARNING nova.compute.manager [req-b90b0c4a-21ca-4858-a5e5-411b846d9159 req-79879fe7-892a-4b61-8116-d383665ccaf4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received unexpected event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 for instance with vm_state stopped and task_state rebuilding.
Jan 22 00:06:01 compute-0 nova_compute[182935]: 2026-01-22 00:06:01.952 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.251 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.470 182939 INFO nova.compute.manager [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Rebuilding instance
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.770 182939 DEBUG nova.compute.manager [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.853 182939 DEBUG nova.objects.instance [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'pci_requests' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.868 182939 DEBUG nova.objects.instance [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'pci_devices' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.885 182939 DEBUG nova.objects.instance [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'resources' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.894 182939 DEBUG nova.objects.instance [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'migration_context' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.903 182939 DEBUG nova.objects.instance [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.906 182939 INFO nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance already shutdown.
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.912 182939 INFO nova.virt.libvirt.driver [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance destroyed successfully.
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.916 182939 INFO nova.virt.libvirt.driver [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance destroyed successfully.
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.916 182939 DEBUG nova.virt.libvirt.vif [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1570775958',display_name='tempest-tempest.common.compute-instance-1570775958',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1570775958',id=94,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:05:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-84dzguso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-ServerActionsTestOtherA-1347085859-project-mem
ber'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:01Z,user_data=None,user_id='b4385295f46b45d8803b0c536a989822',uuid=a642c12c-c01b-41e1-8377-aae56b8d6493,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.917 182939 DEBUG nova.network.os_vif_util [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.917 182939 DEBUG nova.network.os_vif_util [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.918 182939 DEBUG os_vif [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.919 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.920 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap443c4c41-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.964 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.966 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.974 182939 INFO os_vif [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63')
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.975 182939 INFO nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Deleting instance files /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493_del
Jan 22 00:06:02 compute-0 nova_compute[182935]: 2026-01-22 00:06:02.976 182939 INFO nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Deletion of /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493_del complete
Jan 22 00:06:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:03.200 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:03.200 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:03.201 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.303 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.304 182939 INFO nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Creating image(s)
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.304 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.305 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.305 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.319 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.405 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.406 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.406 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.418 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.478 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.479 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.515 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.517 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.517 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.576 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.578 182939 DEBUG nova.virt.disk.api [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Checking if we can resize image /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.578 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.640 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.641 182939 DEBUG nova.virt.disk.api [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Cannot resize image /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.642 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.642 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Ensure instance console log exists: /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.643 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.643 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.644 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.646 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Start _get_guest_xml network_info=[{"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.653 182939 WARNING nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.665 182939 DEBUG nova.virt.libvirt.host [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.666 182939 DEBUG nova.virt.libvirt.host [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.670 182939 DEBUG nova.virt.libvirt.host [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.671 182939 DEBUG nova.virt.libvirt.host [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.672 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.673 182939 DEBUG nova.virt.hardware [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.673 182939 DEBUG nova.virt.hardware [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.673 182939 DEBUG nova.virt.hardware [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.674 182939 DEBUG nova.virt.hardware [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.674 182939 DEBUG nova.virt.hardware [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.674 182939 DEBUG nova.virt.hardware [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.674 182939 DEBUG nova.virt.hardware [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.674 182939 DEBUG nova.virt.hardware [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.675 182939 DEBUG nova.virt.hardware [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.675 182939 DEBUG nova.virt.hardware [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.675 182939 DEBUG nova.virt.hardware [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.675 182939 DEBUG nova.objects.instance [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'vcpu_model' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.703 182939 DEBUG nova.virt.libvirt.vif [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1570775958',display_name='tempest-tempest.common.compute-instance-1570775958',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1570775958',id=94,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:05:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-84dzguso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-ServerActionsTestOtherA-1347085859-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:03Z,user_data=None,user_id='b4385295f46b45d8803b0c536a989822',uuid=a642c12c-c01b-41e1-8377-aae56b8d6493,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.704 182939 DEBUG nova.network.os_vif_util [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.705 182939 DEBUG nova.network.os_vif_util [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.707 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:06:03 compute-0 nova_compute[182935]:   <uuid>a642c12c-c01b-41e1-8377-aae56b8d6493</uuid>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   <name>instance-0000005e</name>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <nova:name>tempest-tempest.common.compute-instance-1570775958</nova:name>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:06:03</nova:creationTime>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:06:03 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:06:03 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:06:03 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:06:03 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:06:03 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:06:03 compute-0 nova_compute[182935]:         <nova:user uuid="b4385295f46b45d8803b0c536a989822">tempest-ServerActionsTestOtherA-1347085859-project-member</nova:user>
Jan 22 00:06:03 compute-0 nova_compute[182935]:         <nova:project uuid="c299d482d37e45169cca3d6f178e8555">tempest-ServerActionsTestOtherA-1347085859</nova:project>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="3e1dda74-3c6a-4d29-8792-32134d1c36c5"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:06:03 compute-0 nova_compute[182935]:         <nova:port uuid="443c4c41-63d8-47ff-a528-5bf1231445e4">
Jan 22 00:06:03 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <system>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <entry name="serial">a642c12c-c01b-41e1-8377-aae56b8d6493</entry>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <entry name="uuid">a642c12c-c01b-41e1-8377-aae56b8d6493</entry>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     </system>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   <os>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   </os>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   <features>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   </features>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.config"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:a8:06:47"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <target dev="tap443c4c41-63"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/console.log" append="off"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <video>
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     </video>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:06:03 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:06:03 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:06:03 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:06:03 compute-0 nova_compute[182935]: </domain>
Jan 22 00:06:03 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.707 182939 DEBUG nova.compute.manager [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Preparing to wait for external event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.708 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.708 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.708 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.709 182939 DEBUG nova.virt.libvirt.vif [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1570775958',display_name='tempest-tempest.common.compute-instance-1570775958',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1570775958',id=94,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:05:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-84dzguso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-
ServerActionsTestOtherA-1347085859-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:03Z,user_data=None,user_id='b4385295f46b45d8803b0c536a989822',uuid=a642c12c-c01b-41e1-8377-aae56b8d6493,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.710 182939 DEBUG nova.network.os_vif_util [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.710 182939 DEBUG nova.network.os_vif_util [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.711 182939 DEBUG os_vif [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.712 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.713 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.713 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.717 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.718 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap443c4c41-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.719 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap443c4c41-63, col_values=(('external_ids', {'iface-id': '443c4c41-63d8-47ff-a528-5bf1231445e4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:06:47', 'vm-uuid': 'a642c12c-c01b-41e1-8377-aae56b8d6493'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.721 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:03 compute-0 NetworkManager[55139]: <info>  [1769040363.7224] manager: (tap443c4c41-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.724 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.729 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.731 182939 INFO os_vif [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63')
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.806 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.806 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.807 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No VIF found with MAC fa:16:3e:a8:06:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.807 182939 INFO nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Using config drive
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.842 182939 DEBUG nova.objects.instance [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'ec2_ids' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:03 compute-0 nova_compute[182935]: 2026-01-22 00:06:03.885 182939 DEBUG nova.objects.instance [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'keypairs' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:04 compute-0 nova_compute[182935]: 2026-01-22 00:06:04.291 182939 INFO nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Creating config drive at /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.config
Jan 22 00:06:04 compute-0 nova_compute[182935]: 2026-01-22 00:06:04.297 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq60y6pym execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:04 compute-0 nova_compute[182935]: 2026-01-22 00:06:04.439 182939 DEBUG oslo_concurrency.processutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq60y6pym" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:04 compute-0 kernel: tap443c4c41-63: entered promiscuous mode
Jan 22 00:06:04 compute-0 ovn_controller[95047]: 2026-01-22T00:06:04Z|00381|binding|INFO|Claiming lport 443c4c41-63d8-47ff-a528-5bf1231445e4 for this chassis.
Jan 22 00:06:04 compute-0 ovn_controller[95047]: 2026-01-22T00:06:04Z|00382|binding|INFO|443c4c41-63d8-47ff-a528-5bf1231445e4: Claiming fa:16:3e:a8:06:47 10.100.0.13
Jan 22 00:06:04 compute-0 NetworkManager[55139]: <info>  [1769040364.5244] manager: (tap443c4c41-63): new Tun device (/org/freedesktop/NetworkManager/Devices/175)
Jan 22 00:06:04 compute-0 nova_compute[182935]: 2026-01-22 00:06:04.523 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:04 compute-0 ovn_controller[95047]: 2026-01-22T00:06:04Z|00383|binding|INFO|Setting lport 443c4c41-63d8-47ff-a528-5bf1231445e4 ovn-installed in OVS
Jan 22 00:06:04 compute-0 nova_compute[182935]: 2026-01-22 00:06:04.542 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:04 compute-0 systemd-udevd[227145]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:06:04 compute-0 systemd-machined[154182]: New machine qemu-50-instance-0000005e.
Jan 22 00:06:04 compute-0 NetworkManager[55139]: <info>  [1769040364.5768] device (tap443c4c41-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:06:04 compute-0 NetworkManager[55139]: <info>  [1769040364.5775] device (tap443c4c41-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:06:04 compute-0 systemd[1]: Started Virtual Machine qemu-50-instance-0000005e.
Jan 22 00:06:04 compute-0 ovn_controller[95047]: 2026-01-22T00:06:04Z|00384|binding|INFO|Setting lport 443c4c41-63d8-47ff-a528-5bf1231445e4 up in Southbound
Jan 22 00:06:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:04.902 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:06:47 10.100.0.13'], port_security=['fa:16:3e:a8:06:47 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a642c12c-c01b-41e1-8377-aae56b8d6493', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c299d482d37e45169cca3d6f178e8555', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0559740a-e5c9-4749-8c84-0b7852b94df4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=47fc8aa5-cd00-4c23-8e55-87bda0bbf0d4, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=443c4c41-63d8-47ff-a528-5bf1231445e4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:06:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:04.904 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 443c4c41-63d8-47ff-a528-5bf1231445e4 in datapath b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 bound to our chassis
Jan 22 00:06:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:04.905 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 22 00:06:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:04.918 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[71f7ca85-3087-4872-b1c3-955d286100ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:04.919 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb3dacae7-b1 in ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:06:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:04.924 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb3dacae7-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:06:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:04.924 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f73ebc2b-a862-4116-bb32-ef7399b36259]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:04.925 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[604de411-c520-444a-bb8a-5f7aac2f2651]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:04.944 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[bd5b256d-55ac-42a3-8639-34225cb6a105]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:04.965 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a963cb20-1c78-46cb-b1d1-5b2a25da3448]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:04.999 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c733b85b-ae62-4188-b4d2-eec9d3cb679e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.006 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[487d9733-9495-452e-88d9-d57d3b518d7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 NetworkManager[55139]: <info>  [1769040365.0094] manager: (tapb3dacae7-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/176)
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.052 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[14e65a56-39e7-4f07-ae03-b933a3b72d5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.056 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[411b8863-090c-42a2-88ad-529baccf5123]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 podman[227159]: 2026-01-22 00:06:05.058095457 +0000 UTC m=+0.080464693 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:06:05 compute-0 NetworkManager[55139]: <info>  [1769040365.0870] device (tapb3dacae7-b0): carrier: link connected
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.091 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[cf04fa00-f14a-4ccc-8c8c-22b50c9e181a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 podman[227157]: 2026-01-22 00:06:05.101112186 +0000 UTC m=+0.116644744 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.120 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1e363f55-c30c-4586-b149-00b3fcf803e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3dacae7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:f1:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486782, 'reachable_time': 40096, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227218, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.137 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[272a0884-b08b-4124-9034-83eb8cd0a834]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8b:f1ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486782, 'tstamp': 486782}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227219, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.159 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1b0f734b-90ce-4bf3-9325-6064c47fe34c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3dacae7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:f1:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486782, 'reachable_time': 40096, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227220, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.206 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8d525ff7-5d3e-4b3c-8f3d-b23804a48fce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.295 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0abb6bd2-b2fa-43df-b5d3-f9751b18a3b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.297 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3dacae7-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.298 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.298 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3dacae7-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.301 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:05 compute-0 NetworkManager[55139]: <info>  [1769040365.3022] manager: (tapb3dacae7-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Jan 22 00:06:05 compute-0 kernel: tapb3dacae7-b0: entered promiscuous mode
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.305 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb3dacae7-b0, col_values=(('external_ids', {'iface-id': '90cfb65b-4764-45c8-aca6-274b0a687241'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:05 compute-0 ovn_controller[95047]: 2026-01-22T00:06:05Z|00385|binding|INFO|Releasing lport 90cfb65b-4764-45c8-aca6-274b0a687241 from this chassis (sb_readonly=0)
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.306 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.312 182939 DEBUG nova.compute.manager [req-e5b30fa6-8440-444b-9bc6-7659173d565a req-afc31d7c-3a12-45a1-bed3-36a36c78c757 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.312 182939 DEBUG oslo_concurrency.lockutils [req-e5b30fa6-8440-444b-9bc6-7659173d565a req-afc31d7c-3a12-45a1-bed3-36a36c78c757 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.313 182939 DEBUG oslo_concurrency.lockutils [req-e5b30fa6-8440-444b-9bc6-7659173d565a req-afc31d7c-3a12-45a1-bed3-36a36c78c757 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.313 182939 DEBUG oslo_concurrency.lockutils [req-e5b30fa6-8440-444b-9bc6-7659173d565a req-afc31d7c-3a12-45a1-bed3-36a36c78c757 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.313 182939 DEBUG nova.compute.manager [req-e5b30fa6-8440-444b-9bc6-7659173d565a req-afc31d7c-3a12-45a1-bed3-36a36c78c757 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Processing event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.319 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.320 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.322 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5885ef4f-f7d3-4cd0-8d2e-5bac1dfa53ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.322 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.323 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'env', 'PROCESS_TAG=haproxy-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.460 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for a642c12c-c01b-41e1-8377-aae56b8d6493 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.461 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040365.4600859, a642c12c-c01b-41e1-8377-aae56b8d6493 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.461 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] VM Started (Lifecycle Event)
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.464 182939 DEBUG nova.compute.manager [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.493 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.497 182939 INFO nova.virt.libvirt.driver [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance spawned successfully.
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.499 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.500 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.504 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.535 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.535 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040365.463752, a642c12c-c01b-41e1-8377-aae56b8d6493 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.536 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] VM Paused (Lifecycle Event)
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.541 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.541 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.541 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.542 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.542 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.542 182939 DEBUG nova.virt.libvirt.driver [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.570 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.575 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040365.4672632, a642c12c-c01b-41e1-8377-aae56b8d6493 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.576 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] VM Resumed (Lifecycle Event)
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.607 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.610 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.637 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.648 182939 DEBUG nova.compute.manager [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:05 compute-0 podman[227259]: 2026-01-22 00:06:05.759849475 +0000 UTC m=+0.061677169 container create bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.779 182939 INFO nova.compute.manager [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] bringing vm to original state: 'stopped'
Jan 22 00:06:05 compute-0 systemd[1]: Started libpod-conmon-bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42.scope.
Jan 22 00:06:05 compute-0 podman[227259]: 2026-01-22 00:06:05.731057177 +0000 UTC m=+0.032884891 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:06:05 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:06:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8da81188e968eeda24266a84474254db1d3d87adfc5ecbb423640b09d7e7fc89/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:06:05 compute-0 podman[227259]: 2026-01-22 00:06:05.860999836 +0000 UTC m=+0.162827550 container init bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:06:05 compute-0 podman[227259]: 2026-01-22 00:06:05.867470475 +0000 UTC m=+0.169298169 container start bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.880 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.881 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.881 182939 DEBUG nova.compute.manager [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:05 compute-0 ovn_controller[95047]: 2026-01-22T00:06:05Z|00386|binding|INFO|Releasing lport 90cfb65b-4764-45c8-aca6-274b0a687241 from this chassis (sb_readonly=0)
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.885 182939 DEBUG nova.compute.manager [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 22 00:06:05 compute-0 ovn_controller[95047]: 2026-01-22T00:06:05Z|00387|binding|INFO|Releasing lport 6c86fb1f-92c6-4b86-b1d7-ff8fba065ce2 from this chassis (sb_readonly=0)
Jan 22 00:06:05 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227274]: [NOTICE]   (227278) : New worker (227280) forked
Jan 22 00:06:05 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227274]: [NOTICE]   (227278) : Loading success.
Jan 22 00:06:05 compute-0 kernel: tap443c4c41-63 (unregistering): left promiscuous mode
Jan 22 00:06:05 compute-0 NetworkManager[55139]: <info>  [1769040365.9520] device (tap443c4c41-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.955 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:05 compute-0 ovn_controller[95047]: 2026-01-22T00:06:05Z|00388|binding|INFO|Releasing lport 443c4c41-63d8-47ff-a528-5bf1231445e4 from this chassis (sb_readonly=0)
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.963 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:05 compute-0 ovn_controller[95047]: 2026-01-22T00:06:05Z|00389|binding|INFO|Setting lport 443c4c41-63d8-47ff-a528-5bf1231445e4 down in Southbound
Jan 22 00:06:05 compute-0 ovn_controller[95047]: 2026-01-22T00:06:05Z|00390|binding|INFO|Removing iface tap443c4c41-63 ovn-installed in OVS
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.967 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:05 compute-0 nova_compute[182935]: 2026-01-22 00:06:05.977 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.996 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:06:47 10.100.0.13'], port_security=['fa:16:3e:a8:06:47 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a642c12c-c01b-41e1-8377-aae56b8d6493', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c299d482d37e45169cca3d6f178e8555', 'neutron:revision_number': '6', 'neutron:security_group_ids': '0559740a-e5c9-4749-8c84-0b7852b94df4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=47fc8aa5-cd00-4c23-8e55-87bda0bbf0d4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=443c4c41-63d8-47ff-a528-5bf1231445e4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:06:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:05.999 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 443c4c41-63d8-47ff-a528-5bf1231445e4 in datapath b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 unbound from our chassis
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.001 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.003 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8dfff5-5742-43d9-83af-700b34100370]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.004 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 namespace which is not needed anymore
Jan 22 00:06:06 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Jan 22 00:06:06 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000005e.scope: Consumed 1.227s CPU time.
Jan 22 00:06:06 compute-0 systemd-machined[154182]: Machine qemu-50-instance-0000005e terminated.
Jan 22 00:06:06 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227274]: [NOTICE]   (227278) : haproxy version is 2.8.14-c23fe91
Jan 22 00:06:06 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227274]: [NOTICE]   (227278) : path to executable is /usr/sbin/haproxy
Jan 22 00:06:06 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227274]: [WARNING]  (227278) : Exiting Master process...
Jan 22 00:06:06 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227274]: [WARNING]  (227278) : Exiting Master process...
Jan 22 00:06:06 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227274]: [ALERT]    (227278) : Current worker (227280) exited with code 143 (Terminated)
Jan 22 00:06:06 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227274]: [WARNING]  (227278) : All workers exited. Exiting... (0)
Jan 22 00:06:06 compute-0 systemd[1]: libpod-bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42.scope: Deactivated successfully.
Jan 22 00:06:06 compute-0 podman[227311]: 2026-01-22 00:06:06.179468098 +0000 UTC m=+0.049765937 container died bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:06:06 compute-0 nova_compute[182935]: 2026-01-22 00:06:06.192 182939 INFO nova.virt.libvirt.driver [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance destroyed successfully.
Jan 22 00:06:06 compute-0 nova_compute[182935]: 2026-01-22 00:06:06.193 182939 DEBUG nova.compute.manager [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42-userdata-shm.mount: Deactivated successfully.
Jan 22 00:06:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-8da81188e968eeda24266a84474254db1d3d87adfc5ecbb423640b09d7e7fc89-merged.mount: Deactivated successfully.
Jan 22 00:06:06 compute-0 podman[227311]: 2026-01-22 00:06:06.221344749 +0000 UTC m=+0.091642628 container cleanup bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:06:06 compute-0 systemd[1]: libpod-conmon-bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42.scope: Deactivated successfully.
Jan 22 00:06:06 compute-0 nova_compute[182935]: 2026-01-22 00:06:06.291 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 0.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:06 compute-0 podman[227358]: 2026-01-22 00:06:06.297690858 +0000 UTC m=+0.048603437 container remove bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.303 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6def27e6-9148-463d-9712-26ab5ed9fd6d]: (4, ('Thu Jan 22 12:06:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 (bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42)\nbbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42\nThu Jan 22 12:06:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 (bbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42)\nbbc6d185501adc0cef175326034ec193ce86dd7fa73a013d2fcf4cc8812a8a42\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.306 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[146feede-0496-411f-99ed-5eebc0100e71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.308 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3dacae7-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:06 compute-0 nova_compute[182935]: 2026-01-22 00:06:06.310 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:06 compute-0 kernel: tapb3dacae7-b0: left promiscuous mode
Jan 22 00:06:06 compute-0 nova_compute[182935]: 2026-01-22 00:06:06.335 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.338 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[488197a8-9b23-46a0-a2cf-ff791769af7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:06 compute-0 nova_compute[182935]: 2026-01-22 00:06:06.338 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:06 compute-0 nova_compute[182935]: 2026-01-22 00:06:06.339 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:06 compute-0 nova_compute[182935]: 2026-01-22 00:06:06.339 182939 DEBUG nova.objects.instance [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.359 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[47e6c4bb-3810-45b2-87ef-764d853744e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.361 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0b602fad-4888-4757-a211-58ad4624c368]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.384 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c0834075-5250-4800-a273-55363b700433]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486773, 'reachable_time': 43424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227377, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:06 compute-0 systemd[1]: run-netns-ovnmeta\x2db3dacae7\x2db9cd\x2d426c\x2daa4a\x2d3a6b971c7ee5.mount: Deactivated successfully.
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.388 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:06:06 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:06.388 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca89fdc-a5f7-4b11-bd5f-e7464fa29883]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:06 compute-0 nova_compute[182935]: 2026-01-22 00:06:06.431 182939 DEBUG oslo_concurrency.lockutils [None req-0b7d8149-1462-4794-8367-e9abdbfcd7ae b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:06 compute-0 nova_compute[182935]: 2026-01-22 00:06:06.628 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:07 compute-0 nova_compute[182935]: 2026-01-22 00:06:07.449 182939 DEBUG nova.compute.manager [req-4851889e-52c9-4999-9e24-24b879c131e6 req-d5136ba2-4278-482d-8ae8-35100e6cb592 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:07 compute-0 nova_compute[182935]: 2026-01-22 00:06:07.450 182939 DEBUG oslo_concurrency.lockutils [req-4851889e-52c9-4999-9e24-24b879c131e6 req-d5136ba2-4278-482d-8ae8-35100e6cb592 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:07 compute-0 nova_compute[182935]: 2026-01-22 00:06:07.451 182939 DEBUG oslo_concurrency.lockutils [req-4851889e-52c9-4999-9e24-24b879c131e6 req-d5136ba2-4278-482d-8ae8-35100e6cb592 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:07 compute-0 nova_compute[182935]: 2026-01-22 00:06:07.451 182939 DEBUG oslo_concurrency.lockutils [req-4851889e-52c9-4999-9e24-24b879c131e6 req-d5136ba2-4278-482d-8ae8-35100e6cb592 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:07 compute-0 nova_compute[182935]: 2026-01-22 00:06:07.452 182939 DEBUG nova.compute.manager [req-4851889e-52c9-4999-9e24-24b879c131e6 req-d5136ba2-4278-482d-8ae8-35100e6cb592 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] No waiting events found dispatching network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:07 compute-0 nova_compute[182935]: 2026-01-22 00:06:07.452 182939 WARNING nova.compute.manager [req-4851889e-52c9-4999-9e24-24b879c131e6 req-d5136ba2-4278-482d-8ae8-35100e6cb592 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received unexpected event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 for instance with vm_state stopped and task_state None.
Jan 22 00:06:08 compute-0 nova_compute[182935]: 2026-01-22 00:06:08.761 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.192 182939 DEBUG oslo_concurrency.lockutils [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Acquiring lock "23a189a6-71c2-49b7-b4b9-57715325d51e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.193 182939 DEBUG oslo_concurrency.lockutils [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.193 182939 DEBUG oslo_concurrency.lockutils [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Acquiring lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.194 182939 DEBUG oslo_concurrency.lockutils [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.194 182939 DEBUG oslo_concurrency.lockutils [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.206 182939 INFO nova.compute.manager [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Terminating instance
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.220 182939 DEBUG nova.compute.manager [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:06:09 compute-0 kernel: tapd0d81653-4a (unregistering): left promiscuous mode
Jan 22 00:06:09 compute-0 NetworkManager[55139]: <info>  [1769040369.2510] device (tapd0d81653-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.254 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.300 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 ovn_controller[95047]: 2026-01-22T00:06:09Z|00391|binding|INFO|Releasing lport d0d81653-4a8c-419b-805e-125ec18decf4 from this chassis (sb_readonly=0)
Jan 22 00:06:09 compute-0 ovn_controller[95047]: 2026-01-22T00:06:09Z|00392|binding|INFO|Setting lport d0d81653-4a8c-419b-805e-125ec18decf4 down in Southbound
Jan 22 00:06:09 compute-0 ovn_controller[95047]: 2026-01-22T00:06:09Z|00393|binding|INFO|Removing iface tapd0d81653-4a ovn-installed in OVS
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.304 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.314 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:b5:b3 10.100.0.12'], port_security=['fa:16:3e:36:b5:b3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '23a189a6-71c2-49b7-b4b9-57715325d51e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8d466d4-b930-4b2a-945c-c5093f17e6d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd7a87111c83c49e3a84542174682a417', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0f63dc10-b221-41ef-af36-8fe82532c693', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.222'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e66bfa38-6a10-4f04-ae82-4579b8788b23, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=d0d81653-4a8c-419b-805e-125ec18decf4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.315 104408 INFO neutron.agent.ovn.metadata.agent [-] Port d0d81653-4a8c-419b-805e-125ec18decf4 in datapath e8d466d4-b930-4b2a-945c-c5093f17e6d0 unbound from our chassis
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.317 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e8d466d4-b930-4b2a-945c-c5093f17e6d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.317 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.318 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c1c14d30-5962-4a18-9050-a054c9301312]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.318 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0 namespace which is not needed anymore
Jan 22 00:06:09 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Jan 22 00:06:09 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000005d.scope: Consumed 13.531s CPU time.
Jan 22 00:06:09 compute-0 systemd-machined[154182]: Machine qemu-49-instance-0000005d terminated.
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.424 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.424 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0[226876]: [NOTICE]   (226880) : haproxy version is 2.8.14-c23fe91
Jan 22 00:06:09 compute-0 neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0[226876]: [NOTICE]   (226880) : path to executable is /usr/sbin/haproxy
Jan 22 00:06:09 compute-0 neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0[226876]: [WARNING]  (226880) : Exiting Master process...
Jan 22 00:06:09 compute-0 neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0[226876]: [ALERT]    (226880) : Current worker (226882) exited with code 143 (Terminated)
Jan 22 00:06:09 compute-0 neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0[226876]: [WARNING]  (226880) : All workers exited. Exiting... (0)
Jan 22 00:06:09 compute-0 systemd[1]: libpod-edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7.scope: Deactivated successfully.
Jan 22 00:06:09 compute-0 podman[227404]: 2026-01-22 00:06:09.479450951 +0000 UTC m=+0.056340819 container died edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.489 182939 INFO nova.virt.libvirt.driver [-] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Instance destroyed successfully.
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.489 182939 DEBUG nova.objects.instance [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lazy-loading 'resources' on Instance uuid 23a189a6-71c2-49b7-b4b9-57715325d51e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7-userdata-shm.mount: Deactivated successfully.
Jan 22 00:06:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-13c5401daba3ce99b876b6838cce3a644dd80fbfe49b717b97bf3d28ea6fe671-merged.mount: Deactivated successfully.
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.511 182939 DEBUG nova.virt.libvirt.vif [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-728743880',display_name='tempest-ServersTestJSON-server-728743880',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-728743880',id=93,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO/kUhKXFSNj7NeptId7iGdcxjhHpGJW6c6PRytdnz0bDXVaXwNNhY8ejpA8vOuxeTYOYcQTBzkrITjsoC6uhEN6N78wXPj0b3eqvUOxU8sfqyVKG+qOfQrrVsPJlnhlNQ==',key_name='tempest-keypair-1305968714',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:05:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d7a87111c83c49e3a84542174682a417',ramdisk_id='',reservation_id='r-1cc0fjnr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-1366490185',owner_user_name='tempest-ServersTestJSON-1366490185-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:05:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b8f4c36c45874f0cb983bb4c419457b9',uuid=23a189a6-71c2-49b7-b4b9-57715325d51e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d0d81653-4a8c-419b-805e-125ec18decf4", "address": "fa:16:3e:36:b5:b3", "network": {"id": "e8d466d4-b930-4b2a-945c-c5093f17e6d0", "bridge": "br-int", "label": "tempest-ServersTestJSON-1326164589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d7a87111c83c49e3a84542174682a417", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0d81653-4a", "ovs_interfaceid": "d0d81653-4a8c-419b-805e-125ec18decf4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.511 182939 DEBUG nova.network.os_vif_util [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Converting VIF {"id": "d0d81653-4a8c-419b-805e-125ec18decf4", "address": "fa:16:3e:36:b5:b3", "network": {"id": "e8d466d4-b930-4b2a-945c-c5093f17e6d0", "bridge": "br-int", "label": "tempest-ServersTestJSON-1326164589-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d7a87111c83c49e3a84542174682a417", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd0d81653-4a", "ovs_interfaceid": "d0d81653-4a8c-419b-805e-125ec18decf4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.512 182939 DEBUG nova.network.os_vif_util [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:b5:b3,bridge_name='br-int',has_traffic_filtering=True,id=d0d81653-4a8c-419b-805e-125ec18decf4,network=Network(e8d466d4-b930-4b2a-945c-c5093f17e6d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0d81653-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.513 182939 DEBUG os_vif [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:b5:b3,bridge_name='br-int',has_traffic_filtering=True,id=d0d81653-4a8c-419b-805e-125ec18decf4,network=Network(e8d466d4-b930-4b2a-945c-c5093f17e6d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0d81653-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.514 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 podman[227404]: 2026-01-22 00:06:09.514834602 +0000 UTC m=+0.091724470 container cleanup edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.515 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd0d81653-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.517 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.519 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.523 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.526 182939 INFO os_vif [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:b5:b3,bridge_name='br-int',has_traffic_filtering=True,id=d0d81653-4a8c-419b-805e-125ec18decf4,network=Network(e8d466d4-b930-4b2a-945c-c5093f17e6d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd0d81653-4a')
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.527 182939 INFO nova.virt.libvirt.driver [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Deleting instance files /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e_del
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.529 182939 INFO nova.virt.libvirt.driver [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Deletion of /var/lib/nova/instances/23a189a6-71c2-49b7-b4b9-57715325d51e_del complete
Jan 22 00:06:09 compute-0 systemd[1]: libpod-conmon-edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7.scope: Deactivated successfully.
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.585 182939 DEBUG oslo_concurrency.lockutils [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.586 182939 DEBUG oslo_concurrency.lockutils [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.586 182939 DEBUG oslo_concurrency.lockutils [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.586 182939 DEBUG oslo_concurrency.lockutils [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.586 182939 DEBUG oslo_concurrency.lockutils [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:09 compute-0 podman[227453]: 2026-01-22 00:06:09.591026208 +0000 UTC m=+0.044313582 container remove edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.596 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[525d1b19-cca9-4c41-bb20-e1ea4d9c5184]: (4, ('Thu Jan 22 12:06:09 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0 (edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7)\nedf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7\nThu Jan 22 12:06:09 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0 (edf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7)\nedf4cc127cc0bf1dc3b012503038083924b376d29c3b726b32086fd347bee6e7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.598 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[03c558ba-7fc6-4dca-86ee-e1a9006e8875]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.599 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8d466d4-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.601 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 kernel: tape8d466d4-b0: left promiscuous mode
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.604 182939 INFO nova.compute.manager [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Terminating instance
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.615 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.618 182939 DEBUG nova.compute.manager [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.618 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[47f320b7-b9ae-4684-9a3a-79107bf6475d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.626 182939 INFO nova.virt.libvirt.driver [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Instance destroyed successfully.
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.626 182939 DEBUG nova.objects.instance [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'resources' on Instance uuid a642c12c-c01b-41e1-8377-aae56b8d6493 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.634 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dbddb8a6-a10c-48a6-9cdb-a1aced639919]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.635 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1980ecfa-3748-47c3-9686-051a2c0bdf97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.651 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7a5feaca-e767-4eb5-afe2-6efad8a504b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 484612, 'reachable_time': 28924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227472, 'error': None, 'target': 'ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.652 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e8d466d4-b930-4b2a-945c-c5093f17e6d0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.653 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[63282b15-f8ff-476c-9033-9cda8919bd7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:09.653 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:06:09 compute-0 systemd[1]: run-netns-ovnmeta\x2de8d466d4\x2db930\x2d4b2a\x2d945c\x2dc5093f17e6d0.mount: Deactivated successfully.
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.663 182939 DEBUG nova.virt.libvirt.vif [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1570775958',display_name='tempest-tempest.common.compute-instance-1570775958',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1570775958',id=94,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:06:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-84dzguso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-ServerActionsTestOtherA-1347085859-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:06:06Z,user_data=None,user_id='b4385295f46b45d8803b0c536a989822',uuid=a642c12c-c01b-41e1-8377-aae56b8d6493,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.664 182939 DEBUG nova.network.os_vif_util [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "443c4c41-63d8-47ff-a528-5bf1231445e4", "address": "fa:16:3e:a8:06:47", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap443c4c41-63", "ovs_interfaceid": "443c4c41-63d8-47ff-a528-5bf1231445e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.664 182939 DEBUG nova.network.os_vif_util [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.665 182939 DEBUG os_vif [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.666 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.666 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap443c4c41-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.669 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.672 182939 INFO os_vif [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:06:47,bridge_name='br-int',has_traffic_filtering=True,id=443c4c41-63d8-47ff-a528-5bf1231445e4,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap443c4c41-63')
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.672 182939 INFO nova.virt.libvirt.driver [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Deleting instance files /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493_del
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.672 182939 INFO nova.virt.libvirt.driver [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Deletion of /var/lib/nova/instances/a642c12c-c01b-41e1-8377-aae56b8d6493_del complete
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.683 182939 DEBUG nova.compute.manager [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received event network-vif-unplugged-443c4c41-63d8-47ff-a528-5bf1231445e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.683 182939 DEBUG oslo_concurrency.lockutils [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.683 182939 DEBUG oslo_concurrency.lockutils [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.684 182939 DEBUG oslo_concurrency.lockutils [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.684 182939 DEBUG nova.compute.manager [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] No waiting events found dispatching network-vif-unplugged-443c4c41-63d8-47ff-a528-5bf1231445e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.684 182939 DEBUG nova.compute.manager [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received event network-vif-unplugged-443c4c41-63d8-47ff-a528-5bf1231445e4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.684 182939 DEBUG nova.compute.manager [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.684 182939 DEBUG oslo_concurrency.lockutils [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.684 182939 DEBUG oslo_concurrency.lockutils [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.685 182939 DEBUG oslo_concurrency.lockutils [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.685 182939 DEBUG nova.compute.manager [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] No waiting events found dispatching network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.685 182939 WARNING nova.compute.manager [req-28aa81ef-f457-41d0-9c31-ca24b26c9a7b req-f1553499-c606-42c8-b4b1-e8e49a72023e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received unexpected event network-vif-plugged-443c4c41-63d8-47ff-a528-5bf1231445e4 for instance with vm_state stopped and task_state deleting.
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.704 182939 DEBUG nova.compute.manager [req-770367f1-61af-4ac0-a089-b3f6567c5b68 req-171adab8-a9a3-4d73-8a22-286095b999d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Received event network-vif-unplugged-d0d81653-4a8c-419b-805e-125ec18decf4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.705 182939 DEBUG oslo_concurrency.lockutils [req-770367f1-61af-4ac0-a089-b3f6567c5b68 req-171adab8-a9a3-4d73-8a22-286095b999d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.705 182939 DEBUG oslo_concurrency.lockutils [req-770367f1-61af-4ac0-a089-b3f6567c5b68 req-171adab8-a9a3-4d73-8a22-286095b999d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.705 182939 DEBUG oslo_concurrency.lockutils [req-770367f1-61af-4ac0-a089-b3f6567c5b68 req-171adab8-a9a3-4d73-8a22-286095b999d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.705 182939 DEBUG nova.compute.manager [req-770367f1-61af-4ac0-a089-b3f6567c5b68 req-171adab8-a9a3-4d73-8a22-286095b999d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] No waiting events found dispatching network-vif-unplugged-d0d81653-4a8c-419b-805e-125ec18decf4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.705 182939 DEBUG nova.compute.manager [req-770367f1-61af-4ac0-a089-b3f6567c5b68 req-171adab8-a9a3-4d73-8a22-286095b999d7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Received event network-vif-unplugged-d0d81653-4a8c-419b-805e-125ec18decf4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.719 182939 INFO nova.compute.manager [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Took 0.50 seconds to destroy the instance on the hypervisor.
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.719 182939 DEBUG oslo.service.loopingcall [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.719 182939 DEBUG nova.compute.manager [-] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.720 182939 DEBUG nova.network.neutron [-] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.781 182939 INFO nova.compute.manager [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Took 0.16 seconds to destroy the instance on the hypervisor.
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.781 182939 DEBUG oslo.service.loopingcall [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.781 182939 DEBUG nova.compute.manager [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:06:09 compute-0 nova_compute[182935]: 2026-01-22 00:06:09.782 182939 DEBUG nova.network.neutron [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.590 182939 DEBUG nova.network.neutron [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.623 182939 INFO nova.compute.manager [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Took 1.84 seconds to deallocate network for instance.
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.634 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.778 182939 DEBUG oslo_concurrency.lockutils [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.778 182939 DEBUG oslo_concurrency.lockutils [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.811 182939 DEBUG nova.network.neutron [-] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.828 182939 INFO nova.compute.manager [-] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Took 2.11 seconds to deallocate network for instance.
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.916 182939 DEBUG nova.compute.manager [req-e8f8d747-0ba6-4dd6-b10b-54be8efaf64b req-bbae644a-5f40-4c79-894d-8e568df00323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Received event network-vif-plugged-d0d81653-4a8c-419b-805e-125ec18decf4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.916 182939 DEBUG oslo_concurrency.lockutils [req-e8f8d747-0ba6-4dd6-b10b-54be8efaf64b req-bbae644a-5f40-4c79-894d-8e568df00323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.917 182939 DEBUG oslo_concurrency.lockutils [req-e8f8d747-0ba6-4dd6-b10b-54be8efaf64b req-bbae644a-5f40-4c79-894d-8e568df00323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.917 182939 DEBUG oslo_concurrency.lockutils [req-e8f8d747-0ba6-4dd6-b10b-54be8efaf64b req-bbae644a-5f40-4c79-894d-8e568df00323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.917 182939 DEBUG nova.compute.manager [req-e8f8d747-0ba6-4dd6-b10b-54be8efaf64b req-bbae644a-5f40-4c79-894d-8e568df00323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] No waiting events found dispatching network-vif-plugged-d0d81653-4a8c-419b-805e-125ec18decf4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.917 182939 WARNING nova.compute.manager [req-e8f8d747-0ba6-4dd6-b10b-54be8efaf64b req-bbae644a-5f40-4c79-894d-8e568df00323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Received unexpected event network-vif-plugged-d0d81653-4a8c-419b-805e-125ec18decf4 for instance with vm_state deleted and task_state None.
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.918 182939 DEBUG nova.compute.manager [req-e8f8d747-0ba6-4dd6-b10b-54be8efaf64b req-bbae644a-5f40-4c79-894d-8e568df00323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Received event network-vif-deleted-443c4c41-63d8-47ff-a528-5bf1231445e4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.937 182939 DEBUG oslo_concurrency.lockutils [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:11 compute-0 nova_compute[182935]: 2026-01-22 00:06:11.997 182939 DEBUG nova.compute.provider_tree [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:06:12 compute-0 nova_compute[182935]: 2026-01-22 00:06:12.019 182939 DEBUG nova.scheduler.client.report [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:06:12 compute-0 nova_compute[182935]: 2026-01-22 00:06:12.027 182939 DEBUG nova.compute.manager [req-b507ed7d-77dd-4d42-8c7a-f1a3ebdfd9a0 req-2eedeb77-46d8-4ecd-b564-0949fad0925c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Received event network-vif-deleted-d0d81653-4a8c-419b-805e-125ec18decf4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:12 compute-0 nova_compute[182935]: 2026-01-22 00:06:12.051 182939 DEBUG oslo_concurrency.lockutils [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:12 compute-0 nova_compute[182935]: 2026-01-22 00:06:12.053 182939 DEBUG oslo_concurrency.lockutils [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:12 compute-0 nova_compute[182935]: 2026-01-22 00:06:12.358 182939 INFO nova.scheduler.client.report [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Deleted allocations for instance a642c12c-c01b-41e1-8377-aae56b8d6493
Jan 22 00:06:12 compute-0 nova_compute[182935]: 2026-01-22 00:06:12.394 182939 DEBUG nova.compute.provider_tree [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:06:12 compute-0 nova_compute[182935]: 2026-01-22 00:06:12.458 182939 DEBUG nova.scheduler.client.report [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:06:12 compute-0 nova_compute[182935]: 2026-01-22 00:06:12.504 182939 DEBUG oslo_concurrency.lockutils [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.451s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:12 compute-0 nova_compute[182935]: 2026-01-22 00:06:12.522 182939 DEBUG oslo_concurrency.lockutils [None req-eefa372a-f796-4474-8ef2-263c932efc31 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "a642c12c-c01b-41e1-8377-aae56b8d6493" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:12 compute-0 nova_compute[182935]: 2026-01-22 00:06:12.539 182939 INFO nova.scheduler.client.report [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Deleted allocations for instance 23a189a6-71c2-49b7-b4b9-57715325d51e
Jan 22 00:06:12 compute-0 nova_compute[182935]: 2026-01-22 00:06:12.634 182939 DEBUG oslo_concurrency.lockutils [None req-98f7e5c3-240c-4416-89d8-c9564732d496 b8f4c36c45874f0cb983bb4c419457b9 d7a87111c83c49e3a84542174682a417 - - default default] Lock "23a189a6-71c2-49b7-b4b9-57715325d51e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:12.655 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:14 compute-0 nova_compute[182935]: 2026-01-22 00:06:14.359 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:14 compute-0 nova_compute[182935]: 2026-01-22 00:06:14.551 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:14 compute-0 nova_compute[182935]: 2026-01-22 00:06:14.668 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:16 compute-0 nova_compute[182935]: 2026-01-22 00:06:16.631 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:19 compute-0 nova_compute[182935]: 2026-01-22 00:06:19.669 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:19 compute-0 podman[227475]: 2026-01-22 00:06:19.685629223 +0000 UTC m=+0.051691114 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:06:19 compute-0 podman[227474]: 2026-01-22 00:06:19.711773017 +0000 UTC m=+0.080801661 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:06:21 compute-0 nova_compute[182935]: 2026-01-22 00:06:21.193 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040366.1898417, a642c12c-c01b-41e1-8377-aae56b8d6493 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:21 compute-0 nova_compute[182935]: 2026-01-22 00:06:21.194 182939 INFO nova.compute.manager [-] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] VM Stopped (Lifecycle Event)
Jan 22 00:06:21 compute-0 nova_compute[182935]: 2026-01-22 00:06:21.293 182939 DEBUG nova.compute.manager [None req-656b18b6-a586-4d11-a513-8d783bb927b0 - - - - - -] [instance: a642c12c-c01b-41e1-8377-aae56b8d6493] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:21 compute-0 nova_compute[182935]: 2026-01-22 00:06:21.634 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:06:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:06:24 compute-0 nova_compute[182935]: 2026-01-22 00:06:24.488 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040369.4861693, 23a189a6-71c2-49b7-b4b9-57715325d51e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:24 compute-0 nova_compute[182935]: 2026-01-22 00:06:24.489 182939 INFO nova.compute.manager [-] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] VM Stopped (Lifecycle Event)
Jan 22 00:06:24 compute-0 nova_compute[182935]: 2026-01-22 00:06:24.519 182939 DEBUG nova.compute.manager [None req-62b00a8a-5a4d-4206-8591-1f24a392a2f3 - - - - - -] [instance: 23a189a6-71c2-49b7-b4b9-57715325d51e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:24 compute-0 nova_compute[182935]: 2026-01-22 00:06:24.671 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.510 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "12480564-c85f-4cf7-bf02-29faf18a3117" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.512 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.534 182939 DEBUG nova.compute.manager [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.639 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.640 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.648 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.649 182939 INFO nova.compute.claims [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.822 182939 DEBUG nova.compute.provider_tree [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.839 182939 DEBUG nova.scheduler.client.report [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.871 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.872 182939 DEBUG nova.compute.manager [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.947 182939 DEBUG nova.compute.manager [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.947 182939 DEBUG nova.network.neutron [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.968 182939 INFO nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:06:25 compute-0 nova_compute[182935]: 2026-01-22 00:06:25.988 182939 DEBUG nova.compute.manager [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.127 182939 DEBUG nova.compute.manager [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.128 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.129 182939 INFO nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Creating image(s)
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.130 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "/var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.130 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "/var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.131 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "/var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.145 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.170 182939 DEBUG nova.policy [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b4385295f46b45d8803b0c536a989822', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c299d482d37e45169cca3d6f178e8555', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.205 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.207 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.207 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.218 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.276 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.277 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.317 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.318 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.318 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.380 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.381 182939 DEBUG nova.virt.disk.api [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Checking if we can resize image /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.381 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.439 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.440 182939 DEBUG nova.virt.disk.api [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Cannot resize image /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.441 182939 DEBUG nova.objects.instance [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'migration_context' on Instance uuid 12480564-c85f-4cf7-bf02-29faf18a3117 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.457 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.458 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Ensure instance console log exists: /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.459 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.459 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.459 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:26 compute-0 nova_compute[182935]: 2026-01-22 00:06:26.636 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:27 compute-0 nova_compute[182935]: 2026-01-22 00:06:27.065 182939 DEBUG nova.network.neutron [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Successfully created port: 289acc47-d17b-40ae-bf63-6bb022c242bb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:06:27 compute-0 podman[227539]: 2026-01-22 00:06:27.675641477 +0000 UTC m=+0.046192439 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:06:29 compute-0 nova_compute[182935]: 2026-01-22 00:06:29.146 182939 DEBUG nova.network.neutron [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Successfully updated port: 289acc47-d17b-40ae-bf63-6bb022c242bb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:06:29 compute-0 nova_compute[182935]: 2026-01-22 00:06:29.178 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "refresh_cache-12480564-c85f-4cf7-bf02-29faf18a3117" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:06:29 compute-0 nova_compute[182935]: 2026-01-22 00:06:29.179 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquired lock "refresh_cache-12480564-c85f-4cf7-bf02-29faf18a3117" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:06:29 compute-0 nova_compute[182935]: 2026-01-22 00:06:29.179 182939 DEBUG nova.network.neutron [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:06:29 compute-0 nova_compute[182935]: 2026-01-22 00:06:29.278 182939 DEBUG nova.compute.manager [req-18b5f85f-1994-4600-a1f4-fd1a0c2dc1cb req-c7434d08-2fa4-4fbc-8eb1-8d8acb33f19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Received event network-changed-289acc47-d17b-40ae-bf63-6bb022c242bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:29 compute-0 nova_compute[182935]: 2026-01-22 00:06:29.278 182939 DEBUG nova.compute.manager [req-18b5f85f-1994-4600-a1f4-fd1a0c2dc1cb req-c7434d08-2fa4-4fbc-8eb1-8d8acb33f19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Refreshing instance network info cache due to event network-changed-289acc47-d17b-40ae-bf63-6bb022c242bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:06:29 compute-0 nova_compute[182935]: 2026-01-22 00:06:29.278 182939 DEBUG oslo_concurrency.lockutils [req-18b5f85f-1994-4600-a1f4-fd1a0c2dc1cb req-c7434d08-2fa4-4fbc-8eb1-8d8acb33f19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-12480564-c85f-4cf7-bf02-29faf18a3117" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:06:29 compute-0 nova_compute[182935]: 2026-01-22 00:06:29.428 182939 DEBUG nova.network.neutron [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:06:29 compute-0 nova_compute[182935]: 2026-01-22 00:06:29.673 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.610 182939 DEBUG nova.network.neutron [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Updating instance_info_cache with network_info: [{"id": "289acc47-d17b-40ae-bf63-6bb022c242bb", "address": "fa:16:3e:d3:d4:ee", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap289acc47-d1", "ovs_interfaceid": "289acc47-d17b-40ae-bf63-6bb022c242bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.644 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Releasing lock "refresh_cache-12480564-c85f-4cf7-bf02-29faf18a3117" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.648 182939 DEBUG nova.compute.manager [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Instance network_info: |[{"id": "289acc47-d17b-40ae-bf63-6bb022c242bb", "address": "fa:16:3e:d3:d4:ee", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap289acc47-d1", "ovs_interfaceid": "289acc47-d17b-40ae-bf63-6bb022c242bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.651 182939 DEBUG oslo_concurrency.lockutils [req-18b5f85f-1994-4600-a1f4-fd1a0c2dc1cb req-c7434d08-2fa4-4fbc-8eb1-8d8acb33f19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-12480564-c85f-4cf7-bf02-29faf18a3117" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.652 182939 DEBUG nova.network.neutron [req-18b5f85f-1994-4600-a1f4-fd1a0c2dc1cb req-c7434d08-2fa4-4fbc-8eb1-8d8acb33f19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Refreshing network info cache for port 289acc47-d17b-40ae-bf63-6bb022c242bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.654 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Start _get_guest_xml network_info=[{"id": "289acc47-d17b-40ae-bf63-6bb022c242bb", "address": "fa:16:3e:d3:d4:ee", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap289acc47-d1", "ovs_interfaceid": "289acc47-d17b-40ae-bf63-6bb022c242bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.659 182939 WARNING nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.664 182939 DEBUG nova.virt.libvirt.host [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.665 182939 DEBUG nova.virt.libvirt.host [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.669 182939 DEBUG nova.virt.libvirt.host [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.669 182939 DEBUG nova.virt.libvirt.host [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.671 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.671 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.671 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.672 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.672 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.672 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.672 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.673 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.673 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.673 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.673 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.674 182939 DEBUG nova.virt.hardware [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.677 182939 DEBUG nova.virt.libvirt.vif [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:06:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-609823566',display_name='tempest-ServerActionsTestOtherA-server-609823566',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-609823566',id=95,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-qlzsmz8u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-ServerActionsTest
OtherA-1347085859-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:26Z,user_data=None,user_id='b4385295f46b45d8803b0c536a989822',uuid=12480564-c85f-4cf7-bf02-29faf18a3117,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "289acc47-d17b-40ae-bf63-6bb022c242bb", "address": "fa:16:3e:d3:d4:ee", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap289acc47-d1", "ovs_interfaceid": "289acc47-d17b-40ae-bf63-6bb022c242bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.677 182939 DEBUG nova.network.os_vif_util [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "289acc47-d17b-40ae-bf63-6bb022c242bb", "address": "fa:16:3e:d3:d4:ee", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap289acc47-d1", "ovs_interfaceid": "289acc47-d17b-40ae-bf63-6bb022c242bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.678 182939 DEBUG nova.network.os_vif_util [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d4:ee,bridge_name='br-int',has_traffic_filtering=True,id=289acc47-d17b-40ae-bf63-6bb022c242bb,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap289acc47-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.679 182939 DEBUG nova.objects.instance [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'pci_devices' on Instance uuid 12480564-c85f-4cf7-bf02-29faf18a3117 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.710 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:06:30 compute-0 nova_compute[182935]:   <uuid>12480564-c85f-4cf7-bf02-29faf18a3117</uuid>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   <name>instance-0000005f</name>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerActionsTestOtherA-server-609823566</nova:name>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:06:30</nova:creationTime>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:06:30 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:06:30 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:06:30 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:06:30 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:06:30 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:06:30 compute-0 nova_compute[182935]:         <nova:user uuid="b4385295f46b45d8803b0c536a989822">tempest-ServerActionsTestOtherA-1347085859-project-member</nova:user>
Jan 22 00:06:30 compute-0 nova_compute[182935]:         <nova:project uuid="c299d482d37e45169cca3d6f178e8555">tempest-ServerActionsTestOtherA-1347085859</nova:project>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:06:30 compute-0 nova_compute[182935]:         <nova:port uuid="289acc47-d17b-40ae-bf63-6bb022c242bb">
Jan 22 00:06:30 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <system>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <entry name="serial">12480564-c85f-4cf7-bf02-29faf18a3117</entry>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <entry name="uuid">12480564-c85f-4cf7-bf02-29faf18a3117</entry>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     </system>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   <os>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   </os>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   <features>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   </features>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk.config"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:d3:d4:ee"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <target dev="tap289acc47-d1"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/console.log" append="off"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <video>
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     </video>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:06:30 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:06:30 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:06:30 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:06:30 compute-0 nova_compute[182935]: </domain>
Jan 22 00:06:30 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.711 182939 DEBUG nova.compute.manager [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Preparing to wait for external event network-vif-plugged-289acc47-d17b-40ae-bf63-6bb022c242bb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.712 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.712 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.712 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.713 182939 DEBUG nova.virt.libvirt.vif [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:06:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-609823566',display_name='tempest-ServerActionsTestOtherA-server-609823566',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-609823566',id=95,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-qlzsmz8u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-ServerA
ctionsTestOtherA-1347085859-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:26Z,user_data=None,user_id='b4385295f46b45d8803b0c536a989822',uuid=12480564-c85f-4cf7-bf02-29faf18a3117,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "289acc47-d17b-40ae-bf63-6bb022c242bb", "address": "fa:16:3e:d3:d4:ee", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap289acc47-d1", "ovs_interfaceid": "289acc47-d17b-40ae-bf63-6bb022c242bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.713 182939 DEBUG nova.network.os_vif_util [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "289acc47-d17b-40ae-bf63-6bb022c242bb", "address": "fa:16:3e:d3:d4:ee", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap289acc47-d1", "ovs_interfaceid": "289acc47-d17b-40ae-bf63-6bb022c242bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.714 182939 DEBUG nova.network.os_vif_util [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d4:ee,bridge_name='br-int',has_traffic_filtering=True,id=289acc47-d17b-40ae-bf63-6bb022c242bb,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap289acc47-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.714 182939 DEBUG os_vif [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d4:ee,bridge_name='br-int',has_traffic_filtering=True,id=289acc47-d17b-40ae-bf63-6bb022c242bb,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap289acc47-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.714 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.715 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.715 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.717 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.718 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap289acc47-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.718 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap289acc47-d1, col_values=(('external_ids', {'iface-id': '289acc47-d17b-40ae-bf63-6bb022c242bb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:d4:ee', 'vm-uuid': '12480564-c85f-4cf7-bf02-29faf18a3117'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.719 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:30 compute-0 NetworkManager[55139]: <info>  [1769040390.7211] manager: (tap289acc47-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.722 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.725 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.726 182939 INFO os_vif [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d4:ee,bridge_name='br-int',has_traffic_filtering=True,id=289acc47-d17b-40ae-bf63-6bb022c242bb,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap289acc47-d1')
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.835 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.836 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.836 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] No VIF found with MAC fa:16:3e:d3:d4:ee, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:06:30 compute-0 nova_compute[182935]: 2026-01-22 00:06:30.836 182939 INFO nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Using config drive
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.270 182939 INFO nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Creating config drive at /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk.config
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.275 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqykod0wx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.411 182939 DEBUG oslo_concurrency.processutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqykod0wx" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:31 compute-0 kernel: tap289acc47-d1: entered promiscuous mode
Jan 22 00:06:31 compute-0 NetworkManager[55139]: <info>  [1769040391.4801] manager: (tap289acc47-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/179)
Jan 22 00:06:31 compute-0 systemd-udevd[227582]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:06:31 compute-0 ovn_controller[95047]: 2026-01-22T00:06:31Z|00394|binding|INFO|Claiming lport 289acc47-d17b-40ae-bf63-6bb022c242bb for this chassis.
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.522 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:31 compute-0 ovn_controller[95047]: 2026-01-22T00:06:31Z|00395|binding|INFO|289acc47-d17b-40ae-bf63-6bb022c242bb: Claiming fa:16:3e:d3:d4:ee 10.100.0.7
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.528 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:31 compute-0 NetworkManager[55139]: <info>  [1769040391.5353] device (tap289acc47-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:06:31 compute-0 NetworkManager[55139]: <info>  [1769040391.5365] device (tap289acc47-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:06:31 compute-0 NetworkManager[55139]: <info>  [1769040391.5382] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/180)
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.537 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:31 compute-0 NetworkManager[55139]: <info>  [1769040391.5390] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/181)
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.542 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:d4:ee 10.100.0.7'], port_security=['fa:16:3e:d3:d4:ee 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '12480564-c85f-4cf7-bf02-29faf18a3117', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c299d482d37e45169cca3d6f178e8555', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0559740a-e5c9-4749-8c84-0b7852b94df4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=47fc8aa5-cd00-4c23-8e55-87bda0bbf0d4, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=289acc47-d17b-40ae-bf63-6bb022c242bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.543 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 289acc47-d17b-40ae-bf63-6bb022c242bb in datapath b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 bound to our chassis
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.545 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 22 00:06:31 compute-0 systemd-machined[154182]: New machine qemu-51-instance-0000005f.
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.558 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b9598be4-5d56-4761-ba19-143afccd2e34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.559 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb3dacae7-b1 in ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.561 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb3dacae7-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.561 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dd49bbea-b103-40fc-a6be-9d01952a9bf3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.562 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d1604cda-6526-4a77-8746-933a2371bc97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.575 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[17292747-c575-43c7-9e9c-20e60d713747]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.604 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3c872f76-fc0b-4893-8b14-57ca0ff6d182]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.637 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[02976425-b694-460b-830d-b315526ae3bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 systemd[1]: Started Virtual Machine qemu-51-instance-0000005f.
Jan 22 00:06:31 compute-0 NetworkManager[55139]: <info>  [1769040391.6555] manager: (tapb3dacae7-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/182)
Jan 22 00:06:31 compute-0 systemd-udevd[227586]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.654 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[35a189e5-c185-4ce2-97fd-42216ff8be26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.672 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.674 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.697 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.698 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[3f29dd6c-6588-473e-96ee-40ce15667703]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.701 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[4e9a12db-5bb4-40f3-bae2-ea9106ad5bc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_controller[95047]: 2026-01-22T00:06:31Z|00396|binding|INFO|Setting lport 289acc47-d17b-40ae-bf63-6bb022c242bb ovn-installed in OVS
Jan 22 00:06:31 compute-0 ovn_controller[95047]: 2026-01-22T00:06:31Z|00397|binding|INFO|Setting lport 289acc47-d17b-40ae-bf63-6bb022c242bb up in Southbound
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.705 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:31 compute-0 NetworkManager[55139]: <info>  [1769040391.7245] device (tapb3dacae7-b0): carrier: link connected
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.732 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4b1591-625b-4477-8749-2311b86478ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.750 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[824895cc-ed2f-4801-8c94-19f963202496]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3dacae7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:f1:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489446, 'reachable_time': 32298, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227617, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.770 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ca63ae9f-80d5-45ea-a0da-a5ca934efece]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8b:f1ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489446, 'tstamp': 489446}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227619, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.789 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[76211754-cb21-41a4-b784-84bef3b8bf09]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb3dacae7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8b:f1:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489446, 'reachable_time': 32298, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227620, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.815 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.815 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.829 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[13fdabb6-c303-4b84-912c-0e6e1e9d4662]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.885 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[63bc6a90-4293-4a2a-8d93-cc552ab2a146]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.887 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3dacae7-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.887 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.887 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb3dacae7-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:31 compute-0 kernel: tapb3dacae7-b0: entered promiscuous mode
Jan 22 00:06:31 compute-0 NetworkManager[55139]: <info>  [1769040391.8901] manager: (tapb3dacae7-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.889 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.895 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb3dacae7-b0, col_values=(('external_ids', {'iface-id': '90cfb65b-4764-45c8-aca6-274b0a687241'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:31 compute-0 ovn_controller[95047]: 2026-01-22T00:06:31Z|00398|binding|INFO|Releasing lport 90cfb65b-4764-45c8-aca6-274b0a687241 from this chassis (sb_readonly=0)
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.895 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.898 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.899 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9638e81b-08e1-4ef7-81de-c07476269cd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.900 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.pid.haproxy
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:06:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:31.901 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'env', 'PROCESS_TAG=haproxy-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:06:31 compute-0 nova_compute[182935]: 2026-01-22 00:06:31.907 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:32 compute-0 podman[227657]: 2026-01-22 00:06:32.279900805 +0000 UTC m=+0.064275724 container create f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.285 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040392.2853332, 12480564-c85f-4cf7-bf02-29faf18a3117 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.286 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] VM Started (Lifecycle Event)
Jan 22 00:06:32 compute-0 systemd[1]: Started libpod-conmon-f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9.scope.
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.312 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.318 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040392.2855277, 12480564-c85f-4cf7-bf02-29faf18a3117 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.318 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] VM Paused (Lifecycle Event)
Jan 22 00:06:32 compute-0 podman[227657]: 2026-01-22 00:06:32.241058338 +0000 UTC m=+0.025433317 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:06:32 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.343 182939 DEBUG nova.compute.manager [req-87c4d25a-3986-496a-a750-d874bfd09e1b req-0c5a1d53-f8b9-4bf5-9a00-2860c2cdf7e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Received event network-vif-plugged-289acc47-d17b-40ae-bf63-6bb022c242bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.344 182939 DEBUG oslo_concurrency.lockutils [req-87c4d25a-3986-496a-a750-d874bfd09e1b req-0c5a1d53-f8b9-4bf5-9a00-2860c2cdf7e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e182719e4bc0b1518a6a98e0d5ce77729f7650886c8c1f894c973718efe7e691/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.344 182939 DEBUG oslo_concurrency.lockutils [req-87c4d25a-3986-496a-a750-d874bfd09e1b req-0c5a1d53-f8b9-4bf5-9a00-2860c2cdf7e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.345 182939 DEBUG oslo_concurrency.lockutils [req-87c4d25a-3986-496a-a750-d874bfd09e1b req-0c5a1d53-f8b9-4bf5-9a00-2860c2cdf7e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.345 182939 DEBUG nova.compute.manager [req-87c4d25a-3986-496a-a750-d874bfd09e1b req-0c5a1d53-f8b9-4bf5-9a00-2860c2cdf7e9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Processing event network-vif-plugged-289acc47-d17b-40ae-bf63-6bb022c242bb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.347 182939 DEBUG nova.compute.manager [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.348 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.353 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.357 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.362 182939 INFO nova.virt.libvirt.driver [-] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Instance spawned successfully.
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.363 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:06:32 compute-0 podman[227657]: 2026-01-22 00:06:32.365491462 +0000 UTC m=+0.149866411 container init f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 00:06:32 compute-0 podman[227657]: 2026-01-22 00:06:32.373729635 +0000 UTC m=+0.158104544 container start f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:06:32 compute-0 podman[227673]: 2026-01-22 00:06:32.380872811 +0000 UTC m=+0.063695919 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.395 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.395 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040392.3534045, 12480564-c85f-4cf7-bf02-29faf18a3117 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.395 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] VM Resumed (Lifecycle Event)
Jan 22 00:06:32 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227676]: [NOTICE]   (227697) : New worker (227699) forked
Jan 22 00:06:32 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227676]: [NOTICE]   (227697) : Loading success.
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.404 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.405 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.405 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.405 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.406 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.406 182939 DEBUG nova.virt.libvirt.driver [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.418 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.423 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.467 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.501 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.504 182939 INFO nova.compute.manager [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Took 6.38 seconds to spawn the instance on the hypervisor.
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.504 182939 DEBUG nova.compute.manager [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.608 182939 INFO nova.compute.manager [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Took 7.01 seconds to build instance.
Jan 22 00:06:32 compute-0 sshd-session[227630]: Invalid user svn from 188.166.69.60 port 51130
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.631 182939 DEBUG oslo_concurrency.lockutils [None req-41347a32-c355-480c-895b-d0b0aa3d7d39 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:32 compute-0 sshd-session[227630]: Connection closed by invalid user svn 188.166.69.60 port 51130 [preauth]
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.856 182939 DEBUG nova.network.neutron [req-18b5f85f-1994-4600-a1f4-fd1a0c2dc1cb req-c7434d08-2fa4-4fbc-8eb1-8d8acb33f19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Updated VIF entry in instance network info cache for port 289acc47-d17b-40ae-bf63-6bb022c242bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.857 182939 DEBUG nova.network.neutron [req-18b5f85f-1994-4600-a1f4-fd1a0c2dc1cb req-c7434d08-2fa4-4fbc-8eb1-8d8acb33f19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Updating instance_info_cache with network_info: [{"id": "289acc47-d17b-40ae-bf63-6bb022c242bb", "address": "fa:16:3e:d3:d4:ee", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap289acc47-d1", "ovs_interfaceid": "289acc47-d17b-40ae-bf63-6bb022c242bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:32 compute-0 nova_compute[182935]: 2026-01-22 00:06:32.908 182939 DEBUG oslo_concurrency.lockutils [req-18b5f85f-1994-4600-a1f4-fd1a0c2dc1cb req-c7434d08-2fa4-4fbc-8eb1-8d8acb33f19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-12480564-c85f-4cf7-bf02-29faf18a3117" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:06:33 compute-0 nova_compute[182935]: 2026-01-22 00:06:33.337 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:34 compute-0 nova_compute[182935]: 2026-01-22 00:06:34.550 182939 DEBUG nova.compute.manager [req-de8f4196-5158-43b4-85a7-721fedaf867d req-b251dc69-95f4-4a50-a871-982e62ec03ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Received event network-vif-plugged-289acc47-d17b-40ae-bf63-6bb022c242bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:34 compute-0 nova_compute[182935]: 2026-01-22 00:06:34.550 182939 DEBUG oslo_concurrency.lockutils [req-de8f4196-5158-43b4-85a7-721fedaf867d req-b251dc69-95f4-4a50-a871-982e62ec03ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:34 compute-0 nova_compute[182935]: 2026-01-22 00:06:34.551 182939 DEBUG oslo_concurrency.lockutils [req-de8f4196-5158-43b4-85a7-721fedaf867d req-b251dc69-95f4-4a50-a871-982e62ec03ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:34 compute-0 nova_compute[182935]: 2026-01-22 00:06:34.551 182939 DEBUG oslo_concurrency.lockutils [req-de8f4196-5158-43b4-85a7-721fedaf867d req-b251dc69-95f4-4a50-a871-982e62ec03ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:34 compute-0 nova_compute[182935]: 2026-01-22 00:06:34.551 182939 DEBUG nova.compute.manager [req-de8f4196-5158-43b4-85a7-721fedaf867d req-b251dc69-95f4-4a50-a871-982e62ec03ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] No waiting events found dispatching network-vif-plugged-289acc47-d17b-40ae-bf63-6bb022c242bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:34 compute-0 nova_compute[182935]: 2026-01-22 00:06:34.551 182939 WARNING nova.compute.manager [req-de8f4196-5158-43b4-85a7-721fedaf867d req-b251dc69-95f4-4a50-a871-982e62ec03ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Received unexpected event network-vif-plugged-289acc47-d17b-40ae-bf63-6bb022c242bb for instance with vm_state active and task_state None.
Jan 22 00:06:34 compute-0 nova_compute[182935]: 2026-01-22 00:06:34.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:34 compute-0 nova_compute[182935]: 2026-01-22 00:06:34.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:34 compute-0 nova_compute[182935]: 2026-01-22 00:06:34.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:06:35 compute-0 podman[227709]: 2026-01-22 00:06:35.699667688 +0000 UTC m=+0.063620858 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:06:35 compute-0 podman[227708]: 2026-01-22 00:06:35.7090981 +0000 UTC m=+0.078113695 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.720 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.818 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.818 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.819 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.819 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.880 182939 DEBUG nova.compute.manager [req-46089eb7-0489-485e-9417-ee491463f07e req-160ac555-1687-468e-8686-af536d90bfd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Received event network-changed-289acc47-d17b-40ae-bf63-6bb022c242bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.882 182939 DEBUG nova.compute.manager [req-46089eb7-0489-485e-9417-ee491463f07e req-160ac555-1687-468e-8686-af536d90bfd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Refreshing instance network info cache due to event network-changed-289acc47-d17b-40ae-bf63-6bb022c242bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.882 182939 DEBUG oslo_concurrency.lockutils [req-46089eb7-0489-485e-9417-ee491463f07e req-160ac555-1687-468e-8686-af536d90bfd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-12480564-c85f-4cf7-bf02-29faf18a3117" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.882 182939 DEBUG oslo_concurrency.lockutils [req-46089eb7-0489-485e-9417-ee491463f07e req-160ac555-1687-468e-8686-af536d90bfd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-12480564-c85f-4cf7-bf02-29faf18a3117" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.882 182939 DEBUG nova.network.neutron [req-46089eb7-0489-485e-9417-ee491463f07e req-160ac555-1687-468e-8686-af536d90bfd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Refreshing network info cache for port 289acc47-d17b-40ae-bf63-6bb022c242bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.890 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.949 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:35 compute-0 nova_compute[182935]: 2026-01-22 00:06:35.950 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.007 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.177 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.179 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5544MB free_disk=73.12743759155273GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.179 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.179 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.258 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 12480564-c85f-4cf7-bf02-29faf18a3117 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.259 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.259 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.321 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.336 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.357 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.357 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.681 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.842 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.842 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.870 182939 DEBUG nova.compute.manager [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.977 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.978 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.986 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:06:36 compute-0 nova_compute[182935]: 2026-01-22 00:06:36.987 182939 INFO nova.compute.claims [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.162 182939 DEBUG nova.compute.provider_tree [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.177 182939 DEBUG nova.scheduler.client.report [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.207 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.208 182939 DEBUG nova.compute.manager [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.279 182939 DEBUG nova.compute.manager [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.280 182939 DEBUG nova.network.neutron [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.397 182939 DEBUG nova.network.neutron [req-46089eb7-0489-485e-9417-ee491463f07e req-160ac555-1687-468e-8686-af536d90bfd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Updated VIF entry in instance network info cache for port 289acc47-d17b-40ae-bf63-6bb022c242bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.398 182939 DEBUG nova.network.neutron [req-46089eb7-0489-485e-9417-ee491463f07e req-160ac555-1687-468e-8686-af536d90bfd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Updating instance_info_cache with network_info: [{"id": "289acc47-d17b-40ae-bf63-6bb022c242bb", "address": "fa:16:3e:d3:d4:ee", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap289acc47-d1", "ovs_interfaceid": "289acc47-d17b-40ae-bf63-6bb022c242bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.449 182939 INFO nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.500 182939 DEBUG nova.policy [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.528 182939 DEBUG oslo_concurrency.lockutils [req-46089eb7-0489-485e-9417-ee491463f07e req-160ac555-1687-468e-8686-af536d90bfd5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-12480564-c85f-4cf7-bf02-29faf18a3117" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.568 182939 DEBUG nova.compute.manager [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.974 182939 DEBUG nova.compute.manager [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.975 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.976 182939 INFO nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Creating image(s)
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.976 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "/var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.976 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.977 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:37 compute-0 nova_compute[182935]: 2026-01-22 00:06:37.989 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.062 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.064 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.064 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.075 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.136 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.137 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.190 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.192 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.193 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.259 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.261 182939 DEBUG nova.virt.disk.api [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Checking if we can resize image /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.261 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.359 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.360 182939 DEBUG nova.virt.disk.api [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Cannot resize image /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.361 182939 DEBUG nova.objects.instance [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'migration_context' on Instance uuid 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.416 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.417 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Ensure instance console log exists: /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.417 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.418 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:38 compute-0 nova_compute[182935]: 2026-01-22 00:06:38.418 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:39 compute-0 nova_compute[182935]: 2026-01-22 00:06:39.358 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:39 compute-0 nova_compute[182935]: 2026-01-22 00:06:39.508 182939 DEBUG nova.network.neutron [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Successfully created port: 3bab73ac-3e6b-4687-baeb-3313f2704c0c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:06:39 compute-0 nova_compute[182935]: 2026-01-22 00:06:39.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.127 182939 DEBUG oslo_concurrency.lockutils [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "12480564-c85f-4cf7-bf02-29faf18a3117" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.127 182939 DEBUG oslo_concurrency.lockutils [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.128 182939 DEBUG oslo_concurrency.lockutils [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.128 182939 DEBUG oslo_concurrency.lockutils [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.129 182939 DEBUG oslo_concurrency.lockutils [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.142 182939 INFO nova.compute.manager [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Terminating instance
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.154 182939 DEBUG nova.compute.manager [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:06:40 compute-0 kernel: tap289acc47-d1 (unregistering): left promiscuous mode
Jan 22 00:06:40 compute-0 NetworkManager[55139]: <info>  [1769040400.1834] device (tap289acc47-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:06:40 compute-0 ovn_controller[95047]: 2026-01-22T00:06:40Z|00399|binding|INFO|Releasing lport 289acc47-d17b-40ae-bf63-6bb022c242bb from this chassis (sb_readonly=0)
Jan 22 00:06:40 compute-0 ovn_controller[95047]: 2026-01-22T00:06:40Z|00400|binding|INFO|Setting lport 289acc47-d17b-40ae-bf63-6bb022c242bb down in Southbound
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.200 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:40 compute-0 ovn_controller[95047]: 2026-01-22T00:06:40Z|00401|binding|INFO|Removing iface tap289acc47-d1 ovn-installed in OVS
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.208 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:d4:ee 10.100.0.7'], port_security=['fa:16:3e:d3:d4:ee 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '12480564-c85f-4cf7-bf02-29faf18a3117', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c299d482d37e45169cca3d6f178e8555', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=47fc8aa5-cd00-4c23-8e55-87bda0bbf0d4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=289acc47-d17b-40ae-bf63-6bb022c242bb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.210 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 289acc47-d17b-40ae-bf63-6bb022c242bb in datapath b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 unbound from our chassis
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.211 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.213 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e5659998-f71f-4f33-a586-3e35a5e582b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.213 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 namespace which is not needed anymore
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.222 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:40 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Jan 22 00:06:40 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000005f.scope: Consumed 8.557s CPU time.
Jan 22 00:06:40 compute-0 systemd-machined[154182]: Machine qemu-51-instance-0000005f terminated.
Jan 22 00:06:40 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227676]: [NOTICE]   (227697) : haproxy version is 2.8.14-c23fe91
Jan 22 00:06:40 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227676]: [NOTICE]   (227697) : path to executable is /usr/sbin/haproxy
Jan 22 00:06:40 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227676]: [WARNING]  (227697) : Exiting Master process...
Jan 22 00:06:40 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227676]: [WARNING]  (227697) : Exiting Master process...
Jan 22 00:06:40 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227676]: [ALERT]    (227697) : Current worker (227699) exited with code 143 (Terminated)
Jan 22 00:06:40 compute-0 neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5[227676]: [WARNING]  (227697) : All workers exited. Exiting... (0)
Jan 22 00:06:40 compute-0 systemd[1]: libpod-f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9.scope: Deactivated successfully.
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.382 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:40 compute-0 podman[227793]: 2026-01-22 00:06:40.383392723 +0000 UTC m=+0.050794443 container died f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.387 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9-userdata-shm.mount: Deactivated successfully.
Jan 22 00:06:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-e182719e4bc0b1518a6a98e0d5ce77729f7650886c8c1f894c973718efe7e691-merged.mount: Deactivated successfully.
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.440 182939 INFO nova.virt.libvirt.driver [-] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Instance destroyed successfully.
Jan 22 00:06:40 compute-0 podman[227793]: 2026-01-22 00:06:40.441217886 +0000 UTC m=+0.108619596 container cleanup f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.441 182939 DEBUG nova.objects.instance [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lazy-loading 'resources' on Instance uuid 12480564-c85f-4cf7-bf02-29faf18a3117 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:40 compute-0 systemd[1]: libpod-conmon-f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9.scope: Deactivated successfully.
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.471 182939 DEBUG nova.virt.libvirt.vif [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:06:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-609823566',display_name='tempest-ServerActionsTestOtherA-server-609823566',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-609823566',id=95,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:06:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c299d482d37e45169cca3d6f178e8555',ramdisk_id='',reservation_id='r-qlzsmz8u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-1347085859',owner_user_name='tempest-ServerActionsTestOtherA-1347085859-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:06:32Z,user_data=None,user_id='b4385295f46b45d8803b0c536a989822',uuid=12480564-c85f-4cf7-bf02-29faf18a3117,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "289acc47-d17b-40ae-bf63-6bb022c242bb", "address": "fa:16:3e:d3:d4:ee", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap289acc47-d1", "ovs_interfaceid": "289acc47-d17b-40ae-bf63-6bb022c242bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.471 182939 DEBUG nova.network.os_vif_util [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converting VIF {"id": "289acc47-d17b-40ae-bf63-6bb022c242bb", "address": "fa:16:3e:d3:d4:ee", "network": {"id": "b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1112750792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c299d482d37e45169cca3d6f178e8555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap289acc47-d1", "ovs_interfaceid": "289acc47-d17b-40ae-bf63-6bb022c242bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.472 182939 DEBUG nova.network.os_vif_util [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d3:d4:ee,bridge_name='br-int',has_traffic_filtering=True,id=289acc47-d17b-40ae-bf63-6bb022c242bb,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap289acc47-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.472 182939 DEBUG os_vif [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:d4:ee,bridge_name='br-int',has_traffic_filtering=True,id=289acc47-d17b-40ae-bf63-6bb022c242bb,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap289acc47-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.475 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.475 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap289acc47-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.480 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.483 182939 INFO os_vif [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:d4:ee,bridge_name='br-int',has_traffic_filtering=True,id=289acc47-d17b-40ae-bf63-6bb022c242bb,network=Network(b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap289acc47-d1')
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.483 182939 INFO nova.virt.libvirt.driver [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Deleting instance files /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117_del
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.484 182939 INFO nova.virt.libvirt.driver [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Deletion of /var/lib/nova/instances/12480564-c85f-4cf7-bf02-29faf18a3117_del complete
Jan 22 00:06:40 compute-0 podman[227837]: 2026-01-22 00:06:40.556854344 +0000 UTC m=+0.091535536 container remove f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.563 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ae928a2e-8cac-4dcb-a861-a79c8f82b5ba]: (4, ('Thu Jan 22 12:06:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 (f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9)\nf52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9\nThu Jan 22 12:06:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 (f52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9)\nf52a1562d860a2328198f598161fb50dd2f813d9fc94e82f83f044d93cf68ad9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.565 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[10cae68e-4e72-4d8c-b16e-1b6304e29077]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.566 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb3dacae7-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.568 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:40 compute-0 kernel: tapb3dacae7-b0: left promiscuous mode
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.594 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.597 182939 DEBUG nova.compute.manager [req-192f0f01-7f2a-4239-a9b2-86472e1d370b req-a8083685-ae80-496a-8228-5bc8811e0d66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Received event network-vif-unplugged-289acc47-d17b-40ae-bf63-6bb022c242bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.597 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f0762ecd-16e5-4f20-9b0f-9e8c4d5b191c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.598 182939 DEBUG oslo_concurrency.lockutils [req-192f0f01-7f2a-4239-a9b2-86472e1d370b req-a8083685-ae80-496a-8228-5bc8811e0d66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.598 182939 DEBUG oslo_concurrency.lockutils [req-192f0f01-7f2a-4239-a9b2-86472e1d370b req-a8083685-ae80-496a-8228-5bc8811e0d66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.598 182939 DEBUG oslo_concurrency.lockutils [req-192f0f01-7f2a-4239-a9b2-86472e1d370b req-a8083685-ae80-496a-8228-5bc8811e0d66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.598 182939 DEBUG nova.compute.manager [req-192f0f01-7f2a-4239-a9b2-86472e1d370b req-a8083685-ae80-496a-8228-5bc8811e0d66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] No waiting events found dispatching network-vif-unplugged-289acc47-d17b-40ae-bf63-6bb022c242bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.599 182939 DEBUG nova.compute.manager [req-192f0f01-7f2a-4239-a9b2-86472e1d370b req-a8083685-ae80-496a-8228-5bc8811e0d66 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Received event network-vif-unplugged-289acc47-d17b-40ae-bf63-6bb022c242bb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.606 182939 INFO nova.compute.manager [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.607 182939 DEBUG oslo.service.loopingcall [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.608 182939 DEBUG nova.compute.manager [-] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.609 182939 DEBUG nova.network.neutron [-] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.612 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[08c10964-5156-4b1c-8d0c-a6db2204114b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.613 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2140e0dc-cf8e-4450-ace4-14d2c9875338]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.635 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0a0ddd4e-9005-4b29-af2f-85b9e4b02c91]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489436, 'reachable_time': 24169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227852, 'error': None, 'target': 'ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:40 compute-0 systemd[1]: run-netns-ovnmeta\x2db3dacae7\x2db9cd\x2d426c\x2daa4a\x2d3a6b971c7ee5.mount: Deactivated successfully.
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.640 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b3dacae7-b9cd-426c-aa4a-3a6b971c7ee5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:06:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:40.640 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8cde37-9b3b-4a72-9aac-83abd6ece6e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:40 compute-0 nova_compute[182935]: 2026-01-22 00:06:40.992 182939 DEBUG nova.network.neutron [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Successfully updated port: 3bab73ac-3e6b-4687-baeb-3313f2704c0c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.008 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "refresh_cache-63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.009 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquired lock "refresh_cache-63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.009 182939 DEBUG nova.network.neutron [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.228 182939 DEBUG nova.network.neutron [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.472 182939 DEBUG nova.network.neutron [-] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.498 182939 INFO nova.compute.manager [-] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Took 0.89 seconds to deallocate network for instance.
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.620 182939 DEBUG oslo_concurrency.lockutils [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.621 182939 DEBUG oslo_concurrency.lockutils [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.684 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.690 182939 DEBUG nova.compute.provider_tree [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.708 182939 DEBUG nova.scheduler.client.report [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.741 182939 DEBUG oslo_concurrency.lockutils [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.770 182939 INFO nova.scheduler.client.report [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Deleted allocations for instance 12480564-c85f-4cf7-bf02-29faf18a3117
Jan 22 00:06:41 compute-0 nova_compute[182935]: 2026-01-22 00:06:41.873 182939 DEBUG oslo_concurrency.lockutils [None req-8f95663d-58c3-431d-b862-40e458aca183 b4385295f46b45d8803b0c536a989822 c299d482d37e45169cca3d6f178e8555 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.149 182939 DEBUG nova.network.neutron [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Updating instance_info_cache with network_info: [{"id": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "address": "fa:16:3e:15:a3:c4", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bab73ac-3e", "ovs_interfaceid": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.177 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Releasing lock "refresh_cache-63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.178 182939 DEBUG nova.compute.manager [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Instance network_info: |[{"id": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "address": "fa:16:3e:15:a3:c4", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bab73ac-3e", "ovs_interfaceid": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.181 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Start _get_guest_xml network_info=[{"id": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "address": "fa:16:3e:15:a3:c4", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bab73ac-3e", "ovs_interfaceid": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.187 182939 WARNING nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.192 182939 DEBUG nova.virt.libvirt.host [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.193 182939 DEBUG nova.virt.libvirt.host [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.198 182939 DEBUG nova.virt.libvirt.host [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.198 182939 DEBUG nova.virt.libvirt.host [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.200 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.200 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.201 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.201 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.201 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.202 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.202 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.202 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.203 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.203 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.203 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.204 182939 DEBUG nova.virt.hardware [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.208 182939 DEBUG nova.virt.libvirt.vif [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:06:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1607163862',display_name='tempest-DeleteServersTestJSON-server-1607163862',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1607163862',id=97,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-lh90kdt6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:37Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=63ff0ae4-702a-490d-a4dd-7d5d32cb5eca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "address": "fa:16:3e:15:a3:c4", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bab73ac-3e", "ovs_interfaceid": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.209 182939 DEBUG nova.network.os_vif_util [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "address": "fa:16:3e:15:a3:c4", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bab73ac-3e", "ovs_interfaceid": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.210 182939 DEBUG nova.network.os_vif_util [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:a3:c4,bridge_name='br-int',has_traffic_filtering=True,id=3bab73ac-3e6b-4687-baeb-3313f2704c0c,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bab73ac-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.211 182939 DEBUG nova.objects.instance [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.230 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:06:42 compute-0 nova_compute[182935]:   <uuid>63ff0ae4-702a-490d-a4dd-7d5d32cb5eca</uuid>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   <name>instance-00000061</name>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <nova:name>tempest-DeleteServersTestJSON-server-1607163862</nova:name>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:06:42</nova:creationTime>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:06:42 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:06:42 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:06:42 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:06:42 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:06:42 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:06:42 compute-0 nova_compute[182935]:         <nova:user uuid="74ad1bf274924c52af96aa4c6d431410">tempest-DeleteServersTestJSON-2033458913-project-member</nova:user>
Jan 22 00:06:42 compute-0 nova_compute[182935]:         <nova:project uuid="3822e32efd5647aebf2d79a3dd038bd4">tempest-DeleteServersTestJSON-2033458913</nova:project>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:06:42 compute-0 nova_compute[182935]:         <nova:port uuid="3bab73ac-3e6b-4687-baeb-3313f2704c0c">
Jan 22 00:06:42 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <system>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <entry name="serial">63ff0ae4-702a-490d-a4dd-7d5d32cb5eca</entry>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <entry name="uuid">63ff0ae4-702a-490d-a4dd-7d5d32cb5eca</entry>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     </system>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   <os>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   </os>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   <features>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   </features>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk.config"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:15:a3:c4"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <target dev="tap3bab73ac-3e"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/console.log" append="off"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <video>
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     </video>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:06:42 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:06:42 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:06:42 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:06:42 compute-0 nova_compute[182935]: </domain>
Jan 22 00:06:42 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.232 182939 DEBUG nova.compute.manager [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Preparing to wait for external event network-vif-plugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.232 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.233 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.233 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.234 182939 DEBUG nova.virt.libvirt.vif [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:06:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1607163862',display_name='tempest-DeleteServersTestJSON-server-1607163862',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1607163862',id=97,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-lh90kdt6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:37Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=63ff0ae4-702a-490d-a4dd-7d5d32cb5eca,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "address": "fa:16:3e:15:a3:c4", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bab73ac-3e", "ovs_interfaceid": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.235 182939 DEBUG nova.network.os_vif_util [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "address": "fa:16:3e:15:a3:c4", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bab73ac-3e", "ovs_interfaceid": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.236 182939 DEBUG nova.network.os_vif_util [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:a3:c4,bridge_name='br-int',has_traffic_filtering=True,id=3bab73ac-3e6b-4687-baeb-3313f2704c0c,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bab73ac-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.236 182939 DEBUG os_vif [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:a3:c4,bridge_name='br-int',has_traffic_filtering=True,id=3bab73ac-3e6b-4687-baeb-3313f2704c0c,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bab73ac-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.237 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.238 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.238 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.243 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.244 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3bab73ac-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.244 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3bab73ac-3e, col_values=(('external_ids', {'iface-id': '3bab73ac-3e6b-4687-baeb-3313f2704c0c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:a3:c4', 'vm-uuid': '63ff0ae4-702a-490d-a4dd-7d5d32cb5eca'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.288 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:42 compute-0 NetworkManager[55139]: <info>  [1769040402.2896] manager: (tap3bab73ac-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.292 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.296 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.297 182939 INFO os_vif [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:a3:c4,bridge_name='br-int',has_traffic_filtering=True,id=3bab73ac-3e6b-4687-baeb-3313f2704c0c,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bab73ac-3e')
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.361 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.361 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.361 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No VIF found with MAC fa:16:3e:15:a3:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.362 182939 INFO nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Using config drive
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.700 182939 DEBUG nova.compute.manager [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Received event network-vif-plugged-289acc47-d17b-40ae-bf63-6bb022c242bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.700 182939 DEBUG oslo_concurrency.lockutils [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.701 182939 DEBUG oslo_concurrency.lockutils [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.701 182939 DEBUG oslo_concurrency.lockutils [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "12480564-c85f-4cf7-bf02-29faf18a3117-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.701 182939 DEBUG nova.compute.manager [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] No waiting events found dispatching network-vif-plugged-289acc47-d17b-40ae-bf63-6bb022c242bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.701 182939 WARNING nova.compute.manager [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Received unexpected event network-vif-plugged-289acc47-d17b-40ae-bf63-6bb022c242bb for instance with vm_state deleted and task_state None.
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.701 182939 DEBUG nova.compute.manager [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Received event network-changed-3bab73ac-3e6b-4687-baeb-3313f2704c0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.702 182939 DEBUG nova.compute.manager [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Refreshing instance network info cache due to event network-changed-3bab73ac-3e6b-4687-baeb-3313f2704c0c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.702 182939 DEBUG oslo_concurrency.lockutils [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.702 182939 DEBUG oslo_concurrency.lockutils [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.702 182939 DEBUG nova.network.neutron [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Refreshing network info cache for port 3bab73ac-3e6b-4687-baeb-3313f2704c0c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.991 182939 INFO nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Creating config drive at /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk.config
Jan 22 00:06:42 compute-0 nova_compute[182935]: 2026-01-22 00:06:42.997 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_ubaymz6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.128 182939 DEBUG oslo_concurrency.processutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_ubaymz6" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:43 compute-0 kernel: tap3bab73ac-3e: entered promiscuous mode
Jan 22 00:06:43 compute-0 ovn_controller[95047]: 2026-01-22T00:06:43Z|00402|binding|INFO|Claiming lport 3bab73ac-3e6b-4687-baeb-3313f2704c0c for this chassis.
Jan 22 00:06:43 compute-0 ovn_controller[95047]: 2026-01-22T00:06:43Z|00403|binding|INFO|3bab73ac-3e6b-4687-baeb-3313f2704c0c: Claiming fa:16:3e:15:a3:c4 10.100.0.11
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.226 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:43 compute-0 NetworkManager[55139]: <info>  [1769040403.2291] manager: (tap3bab73ac-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/185)
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.239 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:a3:c4 10.100.0.11'], port_security=['fa:16:3e:15:a3:c4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '63ff0ae4-702a-490d-a4dd-7d5d32cb5eca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=3bab73ac-3e6b-4687-baeb-3313f2704c0c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.241 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 3bab73ac-3e6b-4687-baeb-3313f2704c0c in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e bound to our chassis
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.242 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:06:43 compute-0 ovn_controller[95047]: 2026-01-22T00:06:43Z|00404|binding|INFO|Setting lport 3bab73ac-3e6b-4687-baeb-3313f2704c0c ovn-installed in OVS
Jan 22 00:06:43 compute-0 ovn_controller[95047]: 2026-01-22T00:06:43Z|00405|binding|INFO|Setting lport 3bab73ac-3e6b-4687-baeb-3313f2704c0c up in Southbound
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.246 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.247 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:43 compute-0 systemd-udevd[227872]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.255 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bd21ca59-b66f-4469-804a-6d4334825af1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.256 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd94993bc-71 in ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.258 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd94993bc-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.258 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cbd56fa1-3852-4a81-867d-67059f8c93da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.260 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[742c6cc7-b569-4de8-865b-13ebbda11052]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 NetworkManager[55139]: <info>  [1769040403.2671] device (tap3bab73ac-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:06:43 compute-0 NetworkManager[55139]: <info>  [1769040403.2678] device (tap3bab73ac-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:06:43 compute-0 systemd-machined[154182]: New machine qemu-52-instance-00000061.
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.272 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[55d0aa54-b2e5-4999-9759-5946010cdd5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 systemd[1]: Started Virtual Machine qemu-52-instance-00000061.
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.295 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5e692ca4-0a81-489f-9adc-1e185238cbd7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.330 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[af512fa0-c040-4929-a36c-06552416e3b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.335 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e9aaaa-80a0-40e4-acea-e8a5277b7f8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 NetworkManager[55139]: <info>  [1769040403.3386] manager: (tapd94993bc-70): new Veth device (/org/freedesktop/NetworkManager/Devices/186)
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.378 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[71c6566b-54bd-4a5a-abad-c7b4e57ac386]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.384 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[ee538611-6115-4a63-a795-8bae073f9a65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 NetworkManager[55139]: <info>  [1769040403.4099] device (tapd94993bc-70): carrier: link connected
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.413 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[3f9cba22-d763-4578-a8ef-dc9acc50e900]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.431 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[73928021-3c58-4863-a9d4-f6529ec26ad6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490614, 'reachable_time': 43375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227906, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.448 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[822d815c-7c87-4d5d-ac9a-1bd59befd6a6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:eecd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490614, 'tstamp': 490614}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227907, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.468 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[37b58f6e-2912-44a1-a379-4e2e0a252bbc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 127], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490614, 'reachable_time': 43375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227908, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.503 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf9a385-ce6a-49f8-b333-2d0087b996c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.577 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed453c2-2c4b-4709-ac10-be4bb865dfb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.578 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.579 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.579 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd94993bc-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.580 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:43 compute-0 kernel: tapd94993bc-70: entered promiscuous mode
Jan 22 00:06:43 compute-0 NetworkManager[55139]: <info>  [1769040403.5827] manager: (tapd94993bc-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.583 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.586 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd94993bc-70, col_values=(('external_ids', {'iface-id': 'd921ee25-8f8a-4375-9839-6c54ab328e88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.587 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:43 compute-0 ovn_controller[95047]: 2026-01-22T00:06:43Z|00406|binding|INFO|Releasing lport d921ee25-8f8a-4375-9839-6c54ab328e88 from this chassis (sb_readonly=0)
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.599 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.602 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.604 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5baa22e1-916f-41e7-ae61-71568dcf5a9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.605 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:06:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:43.606 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'env', 'PROCESS_TAG=haproxy-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d94993bc-77ac-42d2-88cb-3b0110dff29e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.616 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040403.6152155, 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.616 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] VM Started (Lifecycle Event)
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.691 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.699 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040403.616021, 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.700 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] VM Paused (Lifecycle Event)
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.768 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.771 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:06:43 compute-0 nova_compute[182935]: 2026-01-22 00:06:43.824 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:06:44 compute-0 podman[227947]: 2026-01-22 00:06:44.026627768 +0000 UTC m=+0.051757476 container create 4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 00:06:44 compute-0 systemd[1]: Started libpod-conmon-4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054.scope.
Jan 22 00:06:44 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:06:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9877462c20d9a20828044401b499faefa511280d27ce40d89808b6e3b2690cb0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:06:44 compute-0 podman[227947]: 2026-01-22 00:06:44.001664213 +0000 UTC m=+0.026793941 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:06:44 compute-0 podman[227947]: 2026-01-22 00:06:44.108748539 +0000 UTC m=+0.133878437 container init 4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:06:44 compute-0 podman[227947]: 2026-01-22 00:06:44.11447582 +0000 UTC m=+0.139605528 container start 4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:06:44 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[227963]: [NOTICE]   (227967) : New worker (227969) forked
Jan 22 00:06:44 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[227963]: [NOTICE]   (227967) : Loading success.
Jan 22 00:06:44 compute-0 nova_compute[182935]: 2026-01-22 00:06:44.674 182939 DEBUG nova.network.neutron [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Updated VIF entry in instance network info cache for port 3bab73ac-3e6b-4687-baeb-3313f2704c0c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:06:44 compute-0 nova_compute[182935]: 2026-01-22 00:06:44.674 182939 DEBUG nova.network.neutron [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Updating instance_info_cache with network_info: [{"id": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "address": "fa:16:3e:15:a3:c4", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bab73ac-3e", "ovs_interfaceid": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:44 compute-0 nova_compute[182935]: 2026-01-22 00:06:44.727 182939 DEBUG oslo_concurrency.lockutils [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:06:44 compute-0 nova_compute[182935]: 2026-01-22 00:06:44.727 182939 DEBUG nova.compute.manager [req-962ea686-3fec-4deb-ad04-cede837083c1 req-6f54e322-27ce-4946-9cc9-975e70206793 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Received event network-vif-deleted-289acc47-d17b-40ae-bf63-6bb022c242bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.243 182939 DEBUG nova.compute.manager [req-a844f9f0-8f0b-49d3-ae0c-29498b340a7b req-697f575e-918d-4163-aa66-ef3225a1e19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Received event network-vif-plugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.244 182939 DEBUG oslo_concurrency.lockutils [req-a844f9f0-8f0b-49d3-ae0c-29498b340a7b req-697f575e-918d-4163-aa66-ef3225a1e19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.244 182939 DEBUG oslo_concurrency.lockutils [req-a844f9f0-8f0b-49d3-ae0c-29498b340a7b req-697f575e-918d-4163-aa66-ef3225a1e19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.244 182939 DEBUG oslo_concurrency.lockutils [req-a844f9f0-8f0b-49d3-ae0c-29498b340a7b req-697f575e-918d-4163-aa66-ef3225a1e19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.245 182939 DEBUG nova.compute.manager [req-a844f9f0-8f0b-49d3-ae0c-29498b340a7b req-697f575e-918d-4163-aa66-ef3225a1e19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Processing event network-vif-plugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.245 182939 DEBUG nova.compute.manager [req-a844f9f0-8f0b-49d3-ae0c-29498b340a7b req-697f575e-918d-4163-aa66-ef3225a1e19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Received event network-vif-plugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.245 182939 DEBUG oslo_concurrency.lockutils [req-a844f9f0-8f0b-49d3-ae0c-29498b340a7b req-697f575e-918d-4163-aa66-ef3225a1e19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.245 182939 DEBUG oslo_concurrency.lockutils [req-a844f9f0-8f0b-49d3-ae0c-29498b340a7b req-697f575e-918d-4163-aa66-ef3225a1e19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.245 182939 DEBUG oslo_concurrency.lockutils [req-a844f9f0-8f0b-49d3-ae0c-29498b340a7b req-697f575e-918d-4163-aa66-ef3225a1e19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.246 182939 DEBUG nova.compute.manager [req-a844f9f0-8f0b-49d3-ae0c-29498b340a7b req-697f575e-918d-4163-aa66-ef3225a1e19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] No waiting events found dispatching network-vif-plugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.246 182939 WARNING nova.compute.manager [req-a844f9f0-8f0b-49d3-ae0c-29498b340a7b req-697f575e-918d-4163-aa66-ef3225a1e19a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Received unexpected event network-vif-plugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c for instance with vm_state building and task_state spawning.
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.247 182939 DEBUG nova.compute.manager [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.250 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040405.250767, 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.251 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] VM Resumed (Lifecycle Event)
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.253 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.257 182939 INFO nova.virt.libvirt.driver [-] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Instance spawned successfully.
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.259 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.279 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.287 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.300 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.301 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.302 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.302 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.302 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.303 182939 DEBUG nova.virt.libvirt.driver [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.308 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.419 182939 INFO nova.compute.manager [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Took 7.44 seconds to spawn the instance on the hypervisor.
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.420 182939 DEBUG nova.compute.manager [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.600 182939 INFO nova.compute.manager [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Took 8.65 seconds to build instance.
Jan 22 00:06:45 compute-0 nova_compute[182935]: 2026-01-22 00:06:45.627 182939 DEBUG oslo_concurrency.lockutils [None req-8f09b0fe-1504-47d0-9b5a-2db50aaa6c33 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:46 compute-0 nova_compute[182935]: 2026-01-22 00:06:46.686 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:47 compute-0 nova_compute[182935]: 2026-01-22 00:06:47.289 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:47 compute-0 nova_compute[182935]: 2026-01-22 00:06:47.889 182939 DEBUG oslo_concurrency.lockutils [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:47 compute-0 nova_compute[182935]: 2026-01-22 00:06:47.890 182939 DEBUG oslo_concurrency.lockutils [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:47 compute-0 nova_compute[182935]: 2026-01-22 00:06:47.891 182939 DEBUG oslo_concurrency.lockutils [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:47 compute-0 nova_compute[182935]: 2026-01-22 00:06:47.891 182939 DEBUG oslo_concurrency.lockutils [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:47 compute-0 nova_compute[182935]: 2026-01-22 00:06:47.891 182939 DEBUG oslo_concurrency.lockutils [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:47 compute-0 nova_compute[182935]: 2026-01-22 00:06:47.904 182939 INFO nova.compute.manager [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Terminating instance
Jan 22 00:06:47 compute-0 nova_compute[182935]: 2026-01-22 00:06:47.916 182939 DEBUG nova.compute.manager [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:06:47 compute-0 kernel: tap3bab73ac-3e (unregistering): left promiscuous mode
Jan 22 00:06:47 compute-0 NetworkManager[55139]: <info>  [1769040407.9476] device (tap3bab73ac-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:06:47 compute-0 ovn_controller[95047]: 2026-01-22T00:06:47Z|00407|binding|INFO|Releasing lport 3bab73ac-3e6b-4687-baeb-3313f2704c0c from this chassis (sb_readonly=0)
Jan 22 00:06:47 compute-0 ovn_controller[95047]: 2026-01-22T00:06:47Z|00408|binding|INFO|Setting lport 3bab73ac-3e6b-4687-baeb-3313f2704c0c down in Southbound
Jan 22 00:06:47 compute-0 nova_compute[182935]: 2026-01-22 00:06:47.964 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:47 compute-0 ovn_controller[95047]: 2026-01-22T00:06:47Z|00409|binding|INFO|Removing iface tap3bab73ac-3e ovn-installed in OVS
Jan 22 00:06:47 compute-0 nova_compute[182935]: 2026-01-22 00:06:47.968 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:47.976 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:a3:c4 10.100.0.11'], port_security=['fa:16:3e:15:a3:c4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '63ff0ae4-702a-490d-a4dd-7d5d32cb5eca', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=3bab73ac-3e6b-4687-baeb-3313f2704c0c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:06:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:47.978 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 3bab73ac-3e6b-4687-baeb-3313f2704c0c in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e unbound from our chassis
Jan 22 00:06:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:47.981 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d94993bc-77ac-42d2-88cb-3b0110dff29e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:06:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:47.983 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a72dca-bc4c-426d-954c-b0524b04e617]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:47.984 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace which is not needed anymore
Jan 22 00:06:47 compute-0 nova_compute[182935]: 2026-01-22 00:06:47.988 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:48 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000061.scope: Deactivated successfully.
Jan 22 00:06:48 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000061.scope: Consumed 3.032s CPU time.
Jan 22 00:06:48 compute-0 systemd-machined[154182]: Machine qemu-52-instance-00000061 terminated.
Jan 22 00:06:48 compute-0 kernel: tap3bab73ac-3e: entered promiscuous mode
Jan 22 00:06:48 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[227963]: [NOTICE]   (227967) : haproxy version is 2.8.14-c23fe91
Jan 22 00:06:48 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[227963]: [NOTICE]   (227967) : path to executable is /usr/sbin/haproxy
Jan 22 00:06:48 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[227963]: [WARNING]  (227967) : Exiting Master process...
Jan 22 00:06:48 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[227963]: [ALERT]    (227967) : Current worker (227969) exited with code 143 (Terminated)
Jan 22 00:06:48 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[227963]: [WARNING]  (227967) : All workers exited. Exiting... (0)
Jan 22 00:06:48 compute-0 kernel: tap3bab73ac-3e (unregistering): left promiscuous mode
Jan 22 00:06:48 compute-0 NetworkManager[55139]: <info>  [1769040408.1503] manager: (tap3bab73ac-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Jan 22 00:06:48 compute-0 systemd[1]: libpod-4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054.scope: Deactivated successfully.
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.156 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:48 compute-0 podman[228003]: 2026-01-22 00:06:48.158192187 +0000 UTC m=+0.058418040 container died 4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:06:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054-userdata-shm.mount: Deactivated successfully.
Jan 22 00:06:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-9877462c20d9a20828044401b499faefa511280d27ce40d89808b6e3b2690cb0-merged.mount: Deactivated successfully.
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.209 182939 INFO nova.virt.libvirt.driver [-] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Instance destroyed successfully.
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.210 182939 DEBUG nova.objects.instance [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'resources' on Instance uuid 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:48 compute-0 podman[228003]: 2026-01-22 00:06:48.214573815 +0000 UTC m=+0.114799668 container cleanup 4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:06:48 compute-0 systemd[1]: libpod-conmon-4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054.scope: Deactivated successfully.
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.231 182939 DEBUG nova.virt.libvirt.vif [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:06:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1607163862',display_name='tempest-DeleteServersTestJSON-server-1607163862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1607163862',id=97,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:06:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-lh90kdt6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:06:45Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=63ff0ae4-702a-490d-a4dd-7d5d32cb5eca,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "address": "fa:16:3e:15:a3:c4", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bab73ac-3e", "ovs_interfaceid": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.232 182939 DEBUG nova.network.os_vif_util [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "address": "fa:16:3e:15:a3:c4", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3bab73ac-3e", "ovs_interfaceid": "3bab73ac-3e6b-4687-baeb-3313f2704c0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.233 182939 DEBUG nova.network.os_vif_util [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:a3:c4,bridge_name='br-int',has_traffic_filtering=True,id=3bab73ac-3e6b-4687-baeb-3313f2704c0c,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bab73ac-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.233 182939 DEBUG os_vif [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:a3:c4,bridge_name='br-int',has_traffic_filtering=True,id=3bab73ac-3e6b-4687-baeb-3313f2704c0c,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bab73ac-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.235 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.235 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3bab73ac-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.283 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.285 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.289 182939 INFO os_vif [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:a3:c4,bridge_name='br-int',has_traffic_filtering=True,id=3bab73ac-3e6b-4687-baeb-3313f2704c0c,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3bab73ac-3e')
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.290 182939 INFO nova.virt.libvirt.driver [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Deleting instance files /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca_del
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.291 182939 INFO nova.virt.libvirt.driver [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Deletion of /var/lib/nova/instances/63ff0ae4-702a-490d-a4dd-7d5d32cb5eca_del complete
Jan 22 00:06:48 compute-0 podman[228044]: 2026-01-22 00:06:48.301886034 +0000 UTC m=+0.056252576 container remove 4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:06:48 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:48.309 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[76558f71-6075-432a-97fa-bbd6631aef84]: (4, ('Thu Jan 22 12:06:48 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054)\n4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054\nThu Jan 22 12:06:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054)\n4eb0cd456c54f5f93a31557610c68d52347cebc806528592cc92e1a544c04054\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:48 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:48.311 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1aa19729-9e62-40c0-bac6-c701e9e3a7a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:48 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:48.312 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:06:48 compute-0 kernel: tapd94993bc-70: left promiscuous mode
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.314 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.337 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:48 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:48.340 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[289fdc6d-7e5a-429a-9248-a85ecf441463]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:48 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:48.366 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ba98c6af-1b0a-4abb-88d5-66dcc50741fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:48 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:48.368 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[685c1807-5d62-4e9a-96a4-85f774551846]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.375 182939 INFO nova.compute.manager [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Took 0.46 seconds to destroy the instance on the hypervisor.
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.376 182939 DEBUG oslo.service.loopingcall [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.377 182939 DEBUG nova.compute.manager [-] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:06:48 compute-0 nova_compute[182935]: 2026-01-22 00:06:48.377 182939 DEBUG nova.network.neutron [-] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:06:48 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:48.386 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e81873e6-643d-4600-955a-a7de2450b074]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490606, 'reachable_time': 40960, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228059, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:48 compute-0 systemd[1]: run-netns-ovnmeta\x2dd94993bc\x2d77ac\x2d42d2\x2d88cb\x2d3b0110dff29e.mount: Deactivated successfully.
Jan 22 00:06:48 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:48.392 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:06:48 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:06:48.392 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[83f3ba2d-8f17-4cb8-a69b-7b398b75a2e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.011 182939 DEBUG nova.network.neutron [-] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.035 182939 INFO nova.compute.manager [-] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Took 0.66 seconds to deallocate network for instance.
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.097 182939 DEBUG nova.compute.manager [req-ce3f023a-8150-40ed-9c83-6712afab5756 req-43dd58a2-e3e0-4808-8a84-ce58623522b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Received event network-vif-deleted-3bab73ac-3e6b-4687-baeb-3313f2704c0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.122 182939 DEBUG oslo_concurrency.lockutils [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.123 182939 DEBUG oslo_concurrency.lockutils [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.195 182939 DEBUG nova.compute.provider_tree [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.217 182939 DEBUG nova.scheduler.client.report [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.248 182939 DEBUG oslo_concurrency.lockutils [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.295 182939 INFO nova.scheduler.client.report [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Deleted allocations for instance 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.397 182939 DEBUG oslo_concurrency.lockutils [None req-a9d7e275-3237-4aa3-ab43-d5af6ae0bfbb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.507s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.480 182939 DEBUG nova.compute.manager [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Received event network-vif-unplugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.480 182939 DEBUG oslo_concurrency.lockutils [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.481 182939 DEBUG oslo_concurrency.lockutils [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.481 182939 DEBUG oslo_concurrency.lockutils [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.481 182939 DEBUG nova.compute.manager [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] No waiting events found dispatching network-vif-unplugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.481 182939 WARNING nova.compute.manager [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Received unexpected event network-vif-unplugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c for instance with vm_state deleted and task_state None.
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.482 182939 DEBUG nova.compute.manager [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Received event network-vif-plugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.482 182939 DEBUG oslo_concurrency.lockutils [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.482 182939 DEBUG oslo_concurrency.lockutils [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.482 182939 DEBUG oslo_concurrency.lockutils [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "63ff0ae4-702a-490d-a4dd-7d5d32cb5eca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.482 182939 DEBUG nova.compute.manager [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] No waiting events found dispatching network-vif-plugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:06:49 compute-0 nova_compute[182935]: 2026-01-22 00:06:49.483 182939 WARNING nova.compute.manager [req-3e335fae-b4f2-4492-8292-bd2faa3bd0d2 req-427babd5-2e15-4830-bd70-579827dcf8d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Received unexpected event network-vif-plugged-3bab73ac-3e6b-4687-baeb-3313f2704c0c for instance with vm_state deleted and task_state None.
Jan 22 00:06:50 compute-0 podman[228061]: 2026-01-22 00:06:50.698577436 +0000 UTC m=+0.063412202 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:06:50 compute-0 podman[228060]: 2026-01-22 00:06:50.746922546 +0000 UTC m=+0.114932121 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 00:06:50 compute-0 nova_compute[182935]: 2026-01-22 00:06:50.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:06:50 compute-0 nova_compute[182935]: 2026-01-22 00:06:50.965 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:51 compute-0 nova_compute[182935]: 2026-01-22 00:06:51.185 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:51 compute-0 nova_compute[182935]: 2026-01-22 00:06:51.687 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:53 compute-0 nova_compute[182935]: 2026-01-22 00:06:53.283 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:55 compute-0 nova_compute[182935]: 2026-01-22 00:06:55.439 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040400.4378974, 12480564-c85f-4cf7-bf02-29faf18a3117 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:55 compute-0 nova_compute[182935]: 2026-01-22 00:06:55.440 182939 INFO nova.compute.manager [-] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] VM Stopped (Lifecycle Event)
Jan 22 00:06:55 compute-0 nova_compute[182935]: 2026-01-22 00:06:55.468 182939 DEBUG nova.compute.manager [None req-c9fea59b-a302-49e8-958b-0c292ff6bd33 - - - - - -] [instance: 12480564-c85f-4cf7-bf02-29faf18a3117] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.155 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.156 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.177 182939 DEBUG nova.compute.manager [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.280 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.281 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.289 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.289 182939 INFO nova.compute.claims [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.472 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Acquiring lock "7f386293-dbac-4fc9-b940-199f991abcc4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.473 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "7f386293-dbac-4fc9-b940-199f991abcc4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.480 182939 DEBUG nova.compute.provider_tree [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.500 182939 DEBUG nova.scheduler.client.report [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.509 182939 DEBUG nova.compute.manager [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.551 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.553 182939 DEBUG nova.compute.manager [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.622 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.623 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.635 182939 DEBUG nova.compute.manager [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.636 182939 DEBUG nova.network.neutron [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.646 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.647 182939 INFO nova.compute.claims [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.654 182939 INFO nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.691 182939 DEBUG nova.compute.manager [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.694 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.820 182939 DEBUG nova.compute.provider_tree [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.834 182939 DEBUG nova.compute.manager [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.836 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.837 182939 INFO nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Creating image(s)
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.838 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "/var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.838 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.840 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.853 182939 DEBUG nova.scheduler.client.report [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.857 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.881 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.882 182939 DEBUG nova.compute.manager [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.920 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.921 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.922 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.938 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.970 182939 DEBUG nova.compute.manager [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 22 00:06:56 compute-0 nova_compute[182935]: 2026-01-22 00:06:56.989 182939 INFO nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.019 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.020 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.042 182939 DEBUG nova.policy [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.047 182939 DEBUG nova.compute.manager [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.061 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.061 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.062 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.134 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.135 182939 DEBUG nova.virt.disk.api [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Checking if we can resize image /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.135 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.194 182939 DEBUG nova.compute.manager [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.196 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.197 182939 INFO nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Creating image(s)
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.198 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Acquiring lock "/var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.198 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "/var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.199 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "/var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.217 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.218 182939 DEBUG nova.virt.disk.api [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Cannot resize image /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.219 182939 DEBUG nova.objects.instance [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'migration_context' on Instance uuid 70d927e7-875a-426f-a8ee-8e784c4fc8eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.221 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.262 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.263 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Ensure instance console log exists: /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.263 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.264 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.264 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.293 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.295 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.296 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.320 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.390 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.392 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.453 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk 1073741824" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.455 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.455 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.532 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.534 182939 DEBUG nova.virt.disk.api [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Checking if we can resize image /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.534 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.593 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.594 182939 DEBUG nova.virt.disk.api [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Cannot resize image /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.595 182939 DEBUG nova.objects.instance [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lazy-loading 'migration_context' on Instance uuid 7f386293-dbac-4fc9-b940-199f991abcc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.609 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.609 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Ensure instance console log exists: /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.610 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.610 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.610 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.612 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.617 182939 WARNING nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.627 182939 DEBUG nova.virt.libvirt.host [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.628 182939 DEBUG nova.virt.libvirt.host [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.635 182939 DEBUG nova.virt.libvirt.host [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.636 182939 DEBUG nova.virt.libvirt.host [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.637 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.638 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.638 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.638 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.639 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.639 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.639 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.639 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.640 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.640 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.640 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.640 182939 DEBUG nova.virt.hardware [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.645 182939 DEBUG nova.objects.instance [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f386293-dbac-4fc9-b940-199f991abcc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.660 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:06:57 compute-0 nova_compute[182935]:   <uuid>7f386293-dbac-4fc9-b940-199f991abcc4</uuid>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   <name>instance-00000064</name>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersAaction247Test-server-757037287</nova:name>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:06:57</nova:creationTime>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:06:57 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:06:57 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:06:57 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:06:57 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:06:57 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:06:57 compute-0 nova_compute[182935]:         <nova:user uuid="b724e73b6eb24a878d6ccd81fdf46a83">tempest-ServersAaction247Test-1012348261-project-member</nova:user>
Jan 22 00:06:57 compute-0 nova_compute[182935]:         <nova:project uuid="34a48bec9aea4687b899bede2a019a85">tempest-ServersAaction247Test-1012348261</nova:project>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <system>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <entry name="serial">7f386293-dbac-4fc9-b940-199f991abcc4</entry>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <entry name="uuid">7f386293-dbac-4fc9-b940-199f991abcc4</entry>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     </system>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   <os>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   </os>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   <features>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   </features>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk.config"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/console.log" append="off"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <video>
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     </video>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:06:57 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:06:57 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:06:57 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:06:57 compute-0 nova_compute[182935]: </domain>
Jan 22 00:06:57 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.729 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.730 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.731 182939 INFO nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Using config drive
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.977 182939 INFO nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Creating config drive at /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk.config
Jan 22 00:06:57 compute-0 nova_compute[182935]: 2026-01-22 00:06:57.986 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcmftg0ae execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.119 182939 DEBUG oslo_concurrency.processutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcmftg0ae" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.125 182939 DEBUG nova.network.neutron [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Successfully created port: c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:06:58 compute-0 systemd-machined[154182]: New machine qemu-53-instance-00000064.
Jan 22 00:06:58 compute-0 systemd[1]: Started Virtual Machine qemu-53-instance-00000064.
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.286 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:06:58 compute-0 podman[228152]: 2026-01-22 00:06:58.305044855 +0000 UTC m=+0.090136961 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.849 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040418.8485417, 7f386293-dbac-4fc9-b940-199f991abcc4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.852 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] VM Resumed (Lifecycle Event)
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.855 182939 DEBUG nova.compute.manager [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.856 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.860 182939 INFO nova.virt.libvirt.driver [-] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Instance spawned successfully.
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.860 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.880 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.888 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.892 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.892 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.893 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.893 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.894 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.894 182939 DEBUG nova.virt.libvirt.driver [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.924 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.924 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040418.8520353, 7f386293-dbac-4fc9-b940-199f991abcc4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.924 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] VM Started (Lifecycle Event)
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.961 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:58 compute-0 nova_compute[182935]: 2026-01-22 00:06:58.965 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.003 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.013 182939 INFO nova.compute.manager [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Took 1.82 seconds to spawn the instance on the hypervisor.
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.014 182939 DEBUG nova.compute.manager [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.035 182939 DEBUG nova.network.neutron [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Successfully updated port: c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.060 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "refresh_cache-70d927e7-875a-426f-a8ee-8e784c4fc8eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.060 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquired lock "refresh_cache-70d927e7-875a-426f-a8ee-8e784c4fc8eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.061 182939 DEBUG nova.network.neutron [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.116 182939 INFO nova.compute.manager [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Took 2.53 seconds to build instance.
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.138 182939 DEBUG oslo_concurrency.lockutils [None req-313b57f1-4556-4bbb-b434-4c1a3843b9da b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "7f386293-dbac-4fc9-b940-199f991abcc4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.150 182939 DEBUG nova.compute.manager [req-d5aef565-579e-4e9f-9156-b7219cc91ca3 req-001eaed9-a12c-4bf0-9b80-92fecdf26d4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Received event network-changed-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.151 182939 DEBUG nova.compute.manager [req-d5aef565-579e-4e9f-9156-b7219cc91ca3 req-001eaed9-a12c-4bf0-9b80-92fecdf26d4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Refreshing instance network info cache due to event network-changed-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.152 182939 DEBUG oslo_concurrency.lockutils [req-d5aef565-579e-4e9f-9156-b7219cc91ca3 req-001eaed9-a12c-4bf0-9b80-92fecdf26d4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-70d927e7-875a-426f-a8ee-8e784c4fc8eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:06:59 compute-0 nova_compute[182935]: 2026-01-22 00:06:59.226 182939 DEBUG nova.network.neutron [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.403 182939 DEBUG nova.network.neutron [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Updating instance_info_cache with network_info: [{"id": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "address": "fa:16:3e:ea:83:62", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc893135f-e3", "ovs_interfaceid": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.430 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Releasing lock "refresh_cache-70d927e7-875a-426f-a8ee-8e784c4fc8eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.430 182939 DEBUG nova.compute.manager [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Instance network_info: |[{"id": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "address": "fa:16:3e:ea:83:62", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc893135f-e3", "ovs_interfaceid": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.431 182939 DEBUG oslo_concurrency.lockutils [req-d5aef565-579e-4e9f-9156-b7219cc91ca3 req-001eaed9-a12c-4bf0-9b80-92fecdf26d4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-70d927e7-875a-426f-a8ee-8e784c4fc8eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.432 182939 DEBUG nova.network.neutron [req-d5aef565-579e-4e9f-9156-b7219cc91ca3 req-001eaed9-a12c-4bf0-9b80-92fecdf26d4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Refreshing network info cache for port c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.437 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Start _get_guest_xml network_info=[{"id": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "address": "fa:16:3e:ea:83:62", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc893135f-e3", "ovs_interfaceid": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.445 182939 WARNING nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.458 182939 DEBUG nova.virt.libvirt.host [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.459 182939 DEBUG nova.virt.libvirt.host [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.465 182939 DEBUG nova.virt.libvirt.host [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.466 182939 DEBUG nova.virt.libvirt.host [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.468 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.468 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.469 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.470 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.470 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.471 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.471 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.472 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.472 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.473 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.473 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.474 182939 DEBUG nova.virt.hardware [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.480 182939 DEBUG nova.virt.libvirt.vif [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:06:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1064039032',display_name='tempest-DeleteServersTestJSON-server-1064039032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1064039032',id=99,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-5whcwyck',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-
2033458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:56Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=70d927e7-875a-426f-a8ee-8e784c4fc8eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "address": "fa:16:3e:ea:83:62", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc893135f-e3", "ovs_interfaceid": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.481 182939 DEBUG nova.network.os_vif_util [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "address": "fa:16:3e:ea:83:62", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc893135f-e3", "ovs_interfaceid": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.482 182939 DEBUG nova.network.os_vif_util [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:83:62,bridge_name='br-int',has_traffic_filtering=True,id=c893135f-e355-4d0e-abdf-c6a8ca3cf2d5,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc893135f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.484 182939 DEBUG nova.objects.instance [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 70d927e7-875a-426f-a8ee-8e784c4fc8eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.504 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:07:00 compute-0 nova_compute[182935]:   <uuid>70d927e7-875a-426f-a8ee-8e784c4fc8eb</uuid>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   <name>instance-00000063</name>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <nova:name>tempest-DeleteServersTestJSON-server-1064039032</nova:name>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:07:00</nova:creationTime>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:07:00 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:07:00 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:07:00 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:07:00 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:07:00 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:07:00 compute-0 nova_compute[182935]:         <nova:user uuid="74ad1bf274924c52af96aa4c6d431410">tempest-DeleteServersTestJSON-2033458913-project-member</nova:user>
Jan 22 00:07:00 compute-0 nova_compute[182935]:         <nova:project uuid="3822e32efd5647aebf2d79a3dd038bd4">tempest-DeleteServersTestJSON-2033458913</nova:project>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:07:00 compute-0 nova_compute[182935]:         <nova:port uuid="c893135f-e355-4d0e-abdf-c6a8ca3cf2d5">
Jan 22 00:07:00 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <system>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <entry name="serial">70d927e7-875a-426f-a8ee-8e784c4fc8eb</entry>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <entry name="uuid">70d927e7-875a-426f-a8ee-8e784c4fc8eb</entry>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     </system>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   <os>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   </os>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   <features>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   </features>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk.config"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:ea:83:62"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <target dev="tapc893135f-e3"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/console.log" append="off"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <video>
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     </video>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:07:00 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:07:00 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:07:00 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:07:00 compute-0 nova_compute[182935]: </domain>
Jan 22 00:07:00 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.506 182939 DEBUG nova.compute.manager [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Preparing to wait for external event network-vif-plugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.507 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.507 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.508 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.509 182939 DEBUG nova.virt.libvirt.vif [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:06:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1064039032',display_name='tempest-DeleteServersTestJSON-server-1064039032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1064039032',id=99,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-5whcwyck',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:06:56Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=70d927e7-875a-426f-a8ee-8e784c4fc8eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "address": "fa:16:3e:ea:83:62", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc893135f-e3", "ovs_interfaceid": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.509 182939 DEBUG nova.network.os_vif_util [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "address": "fa:16:3e:ea:83:62", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc893135f-e3", "ovs_interfaceid": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.510 182939 DEBUG nova.network.os_vif_util [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:83:62,bridge_name='br-int',has_traffic_filtering=True,id=c893135f-e355-4d0e-abdf-c6a8ca3cf2d5,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc893135f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.511 182939 DEBUG os_vif [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:83:62,bridge_name='br-int',has_traffic_filtering=True,id=c893135f-e355-4d0e-abdf-c6a8ca3cf2d5,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc893135f-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.512 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.512 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.513 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.519 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.519 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc893135f-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.520 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc893135f-e3, col_values=(('external_ids', {'iface-id': 'c893135f-e355-4d0e-abdf-c6a8ca3cf2d5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:83:62', 'vm-uuid': '70d927e7-875a-426f-a8ee-8e784c4fc8eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:00 compute-0 NetworkManager[55139]: <info>  [1769040420.5234] manager: (tapc893135f-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/189)
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.522 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.526 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.531 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.533 182939 INFO os_vif [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:83:62,bridge_name='br-int',has_traffic_filtering=True,id=c893135f-e355-4d0e-abdf-c6a8ca3cf2d5,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc893135f-e3')
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.624 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.625 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.625 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No VIF found with MAC fa:16:3e:ea:83:62, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.626 182939 INFO nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Using config drive
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.637 182939 DEBUG nova.compute.manager [None req-2b128324-d041-4da9-8379-12b99d492028 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.756 182939 INFO nova.compute.manager [None req-2b128324-d041-4da9-8379-12b99d492028 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] instance snapshotting
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.757 182939 DEBUG nova.objects.instance [None req-2b128324-d041-4da9-8379-12b99d492028 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lazy-loading 'flavor' on Instance uuid 7f386293-dbac-4fc9-b940-199f991abcc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.853 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Acquiring lock "7f386293-dbac-4fc9-b940-199f991abcc4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.853 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "7f386293-dbac-4fc9-b940-199f991abcc4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.854 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Acquiring lock "7f386293-dbac-4fc9-b940-199f991abcc4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.854 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "7f386293-dbac-4fc9-b940-199f991abcc4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.854 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "7f386293-dbac-4fc9-b940-199f991abcc4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.865 182939 INFO nova.compute.manager [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Terminating instance
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.881 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Acquiring lock "refresh_cache-7f386293-dbac-4fc9-b940-199f991abcc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.881 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Acquired lock "refresh_cache-7f386293-dbac-4fc9-b940-199f991abcc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:00 compute-0 nova_compute[182935]: 2026-01-22 00:07:00.881 182939 DEBUG nova.network.neutron [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.066 182939 DEBUG nova.network.neutron [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.087 182939 INFO nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Creating config drive at /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk.config
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.091 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1ppxkun0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.122 182939 INFO nova.virt.libvirt.driver [None req-2b128324-d041-4da9-8379-12b99d492028 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Beginning live snapshot process
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.196 182939 DEBUG nova.compute.manager [None req-2b128324-d041-4da9-8379-12b99d492028 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Instance disappeared during snapshot _snapshot_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:4390
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.218 182939 DEBUG oslo_concurrency.processutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1ppxkun0" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:01 compute-0 kernel: tapc893135f-e3: entered promiscuous mode
Jan 22 00:07:01 compute-0 systemd-udevd[228193]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:07:01 compute-0 NetworkManager[55139]: <info>  [1769040421.2801] manager: (tapc893135f-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/190)
Jan 22 00:07:01 compute-0 ovn_controller[95047]: 2026-01-22T00:07:01Z|00410|binding|INFO|Claiming lport c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 for this chassis.
Jan 22 00:07:01 compute-0 ovn_controller[95047]: 2026-01-22T00:07:01Z|00411|binding|INFO|c893135f-e355-4d0e-abdf-c6a8ca3cf2d5: Claiming fa:16:3e:ea:83:62 10.100.0.11
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.286 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.292 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:83:62 10.100.0.11'], port_security=['fa:16:3e:ea:83:62 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '70d927e7-875a-426f-a8ee-8e784c4fc8eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c893135f-e355-4d0e-abdf-c6a8ca3cf2d5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.293 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e bound to our chassis
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.294 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:07:01 compute-0 NetworkManager[55139]: <info>  [1769040421.2978] device (tapc893135f-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:07:01 compute-0 NetworkManager[55139]: <info>  [1769040421.2993] device (tapc893135f-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.311 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2ec18d9e-0251-46d8-b141-27a53c7e175c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.312 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd94993bc-71 in ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.314 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd94993bc-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.314 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6cdbfe-0cce-4812-81de-f27734e1ed27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.315 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0b4c90-970d-4429-925a-d33df5c4d417]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.325 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[b356503b-cfdf-4ad8-8e52-f476d59351bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 systemd-machined[154182]: New machine qemu-54-instance-00000063.
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.358 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.360 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e7736e1d-34ad-42ef-8b35-1560ba34bd9e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 systemd[1]: Started Virtual Machine qemu-54-instance-00000063.
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.365 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:01 compute-0 ovn_controller[95047]: 2026-01-22T00:07:01Z|00412|binding|INFO|Setting lport c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 ovn-installed in OVS
Jan 22 00:07:01 compute-0 ovn_controller[95047]: 2026-01-22T00:07:01Z|00413|binding|INFO|Setting lport c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 up in Southbound
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.369 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.390 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[aae0dee2-fdca-4077-bb19-c0a21aacce18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.397 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[989c6086-58b6-457a-ab23-32c665c8d402]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 NetworkManager[55139]: <info>  [1769040421.3988] manager: (tapd94993bc-70): new Veth device (/org/freedesktop/NetworkManager/Devices/191)
Jan 22 00:07:01 compute-0 systemd-udevd[228213]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.425 182939 DEBUG nova.network.neutron [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.433 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[03a52da1-e8c7-4ed9-a69b-69e82a104813]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.442 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Releasing lock "refresh_cache-7f386293-dbac-4fc9-b940-199f991abcc4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.443 182939 DEBUG nova.compute.manager [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.444 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[2cfdf473-d246-4f34-ba6b-a163bb142016]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 NetworkManager[55139]: <info>  [1769040421.4693] device (tapd94993bc-70): carrier: link connected
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.476 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7733f2db-589f-4bee-809b-964cdff78ad5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000064.scope: Deactivated successfully.
Jan 22 00:07:01 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000064.scope: Consumed 3.270s CPU time.
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.495 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[77b2de5f-bc0e-4258-b109-6d63abaf10fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492420, 'reachable_time': 25145, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228249, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 systemd-machined[154182]: Machine qemu-53-instance-00000064 terminated.
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.515 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9019138e-91d3-423e-82ed-ed90d16e6e0b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:eecd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492420, 'tstamp': 492420}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228250, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.537 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ae8455f9-8a5f-4e2c-bc59-f02da7c47bdc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492420, 'reachable_time': 25145, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228251, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.569 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[69c49ff7-1fc5-4d6f-a4bd-19a0d3bafbc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.637 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5101ee6b-8b96-407e-8052-3b723bd7e18a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.639 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.640 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.640 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd94993bc-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:01 compute-0 NetworkManager[55139]: <info>  [1769040421.6428] manager: (tapd94993bc-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/192)
Jan 22 00:07:01 compute-0 kernel: tapd94993bc-70: entered promiscuous mode
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.643 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.645 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd94993bc-70, col_values=(('external_ids', {'iface-id': 'd921ee25-8f8a-4375-9839-6c54ab328e88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:01 compute-0 ovn_controller[95047]: 2026-01-22T00:07:01Z|00414|binding|INFO|Releasing lport d921ee25-8f8a-4375-9839-6c54ab328e88 from this chassis (sb_readonly=0)
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.660 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.661 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[08139725-c04b-4532-8418-89917562913a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.662 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:07:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:01.662 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'env', 'PROCESS_TAG=haproxy-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d94993bc-77ac-42d2-88cb-3b0110dff29e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.664 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.692 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.708 182939 INFO nova.virt.libvirt.driver [-] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Instance destroyed successfully.
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.709 182939 DEBUG nova.objects.instance [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lazy-loading 'resources' on Instance uuid 7f386293-dbac-4fc9-b940-199f991abcc4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.728 182939 DEBUG nova.compute.manager [req-6167fd31-ec78-4f27-b949-01fb53bf864a req-f0e692a4-1d9c-4c93-8644-e6adb241f471 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Received event network-vif-plugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.729 182939 DEBUG oslo_concurrency.lockutils [req-6167fd31-ec78-4f27-b949-01fb53bf864a req-f0e692a4-1d9c-4c93-8644-e6adb241f471 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.729 182939 DEBUG oslo_concurrency.lockutils [req-6167fd31-ec78-4f27-b949-01fb53bf864a req-f0e692a4-1d9c-4c93-8644-e6adb241f471 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.729 182939 DEBUG oslo_concurrency.lockutils [req-6167fd31-ec78-4f27-b949-01fb53bf864a req-f0e692a4-1d9c-4c93-8644-e6adb241f471 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.730 182939 DEBUG nova.compute.manager [req-6167fd31-ec78-4f27-b949-01fb53bf864a req-f0e692a4-1d9c-4c93-8644-e6adb241f471 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Processing event network-vif-plugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.730 182939 INFO nova.virt.libvirt.driver [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Deleting instance files /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4_del
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.731 182939 INFO nova.virt.libvirt.driver [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Deletion of /var/lib/nova/instances/7f386293-dbac-4fc9-b940-199f991abcc4_del complete
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.812 182939 INFO nova.compute.manager [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.814 182939 DEBUG oslo.service.loopingcall [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.814 182939 DEBUG nova.compute.manager [-] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.815 182939 DEBUG nova.network.neutron [-] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.828 182939 DEBUG nova.compute.manager [None req-2b128324-d041-4da9-8379-12b99d492028 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Found 0 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.884 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040421.8831053, 70d927e7-875a-426f-a8ee-8e784c4fc8eb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.884 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] VM Started (Lifecycle Event)
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.886 182939 DEBUG nova.compute.manager [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.896 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.902 182939 INFO nova.virt.libvirt.driver [-] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Instance spawned successfully.
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.902 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.914 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.918 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.927 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.927 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.928 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.928 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.929 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.929 182939 DEBUG nova.virt.libvirt.driver [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.937 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.938 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040421.883252, 70d927e7-875a-426f-a8ee-8e784c4fc8eb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.938 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] VM Paused (Lifecycle Event)
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.957 182939 DEBUG nova.network.neutron [-] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.968 182939 DEBUG nova.network.neutron [req-d5aef565-579e-4e9f-9156-b7219cc91ca3 req-001eaed9-a12c-4bf0-9b80-92fecdf26d4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Updated VIF entry in instance network info cache for port c893135f-e355-4d0e-abdf-c6a8ca3cf2d5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.968 182939 DEBUG nova.network.neutron [req-d5aef565-579e-4e9f-9156-b7219cc91ca3 req-001eaed9-a12c-4bf0-9b80-92fecdf26d4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Updating instance_info_cache with network_info: [{"id": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "address": "fa:16:3e:ea:83:62", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc893135f-e3", "ovs_interfaceid": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.985 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.989 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040421.895776, 70d927e7-875a-426f-a8ee-8e784c4fc8eb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.990 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] VM Resumed (Lifecycle Event)
Jan 22 00:07:01 compute-0 nova_compute[182935]: 2026-01-22 00:07:01.995 182939 DEBUG nova.network.neutron [-] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.000 182939 DEBUG oslo_concurrency.lockutils [req-d5aef565-579e-4e9f-9156-b7219cc91ca3 req-001eaed9-a12c-4bf0-9b80-92fecdf26d4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-70d927e7-875a-426f-a8ee-8e784c4fc8eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.015 182939 INFO nova.compute.manager [-] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Took 0.20 seconds to deallocate network for instance.
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.016 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.022 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:07:02 compute-0 podman[228298]: 2026-01-22 00:07:02.051709697 +0000 UTC m=+0.053199741 container create 77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.093 182939 INFO nova.compute.manager [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Took 5.26 seconds to spawn the instance on the hypervisor.
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.094 182939 DEBUG nova.compute.manager [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:02 compute-0 systemd[1]: Started libpod-conmon-77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794.scope.
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.097 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:07:02 compute-0 podman[228298]: 2026-01-22 00:07:02.022816845 +0000 UTC m=+0.024306909 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:07:02 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:07:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52f5bee79e5323aa5a2f05dfa7ca2206649ee05b736080df6fbc684bc60946f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:07:02 compute-0 podman[228298]: 2026-01-22 00:07:02.140112063 +0000 UTC m=+0.141602127 container init 77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:07:02 compute-0 podman[228298]: 2026-01-22 00:07:02.144964192 +0000 UTC m=+0.146454236 container start 77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.179 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.180 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:02 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228313]: [NOTICE]   (228317) : New worker (228319) forked
Jan 22 00:07:02 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228313]: [NOTICE]   (228317) : Loading success.
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.195 182939 INFO nova.compute.manager [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Took 5.95 seconds to build instance.
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.214 182939 DEBUG oslo_concurrency.lockutils [None req-d88fc92b-0b74-4985-9b12-d76007f4b994 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.267 182939 DEBUG nova.compute.provider_tree [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.281 182939 DEBUG nova.scheduler.client.report [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.308 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.334 182939 INFO nova.scheduler.client.report [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Deleted allocations for instance 7f386293-dbac-4fc9-b940-199f991abcc4
Jan 22 00:07:02 compute-0 nova_compute[182935]: 2026-01-22 00:07:02.404 182939 DEBUG oslo_concurrency.lockutils [None req-619a4a3a-11db-4d8e-ba90-534e490ab311 b724e73b6eb24a878d6ccd81fdf46a83 34a48bec9aea4687b899bede2a019a85 - - default default] Lock "7f386293-dbac-4fc9-b940-199f991abcc4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:02 compute-0 podman[228328]: 2026-01-22 00:07:02.68260169 +0000 UTC m=+0.049829957 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:07:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:03.201 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:03.202 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:03.203 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:03 compute-0 nova_compute[182935]: 2026-01-22 00:07:03.206 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040408.2053063, 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:03 compute-0 nova_compute[182935]: 2026-01-22 00:07:03.206 182939 INFO nova.compute.manager [-] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] VM Stopped (Lifecycle Event)
Jan 22 00:07:03 compute-0 nova_compute[182935]: 2026-01-22 00:07:03.227 182939 DEBUG nova.compute.manager [None req-fd7acb6c-9350-41ac-8ef3-87ead59d1bac - - - - - -] [instance: 63ff0ae4-702a-490d-a4dd-7d5d32cb5eca] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:03 compute-0 nova_compute[182935]: 2026-01-22 00:07:03.970 182939 DEBUG nova.compute.manager [req-3b8ad09a-2c5d-495b-ba28-187158ffd1dc req-d3cff6eb-e784-4b42-b57a-8e78425d7c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Received event network-vif-plugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:03 compute-0 nova_compute[182935]: 2026-01-22 00:07:03.971 182939 DEBUG oslo_concurrency.lockutils [req-3b8ad09a-2c5d-495b-ba28-187158ffd1dc req-d3cff6eb-e784-4b42-b57a-8e78425d7c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:03 compute-0 nova_compute[182935]: 2026-01-22 00:07:03.971 182939 DEBUG oslo_concurrency.lockutils [req-3b8ad09a-2c5d-495b-ba28-187158ffd1dc req-d3cff6eb-e784-4b42-b57a-8e78425d7c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:03 compute-0 nova_compute[182935]: 2026-01-22 00:07:03.971 182939 DEBUG oslo_concurrency.lockutils [req-3b8ad09a-2c5d-495b-ba28-187158ffd1dc req-d3cff6eb-e784-4b42-b57a-8e78425d7c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:03 compute-0 nova_compute[182935]: 2026-01-22 00:07:03.971 182939 DEBUG nova.compute.manager [req-3b8ad09a-2c5d-495b-ba28-187158ffd1dc req-d3cff6eb-e784-4b42-b57a-8e78425d7c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] No waiting events found dispatching network-vif-plugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:03 compute-0 nova_compute[182935]: 2026-01-22 00:07:03.971 182939 WARNING nova.compute.manager [req-3b8ad09a-2c5d-495b-ba28-187158ffd1dc req-d3cff6eb-e784-4b42-b57a-8e78425d7c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Received unexpected event network-vif-plugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 for instance with vm_state active and task_state None.
Jan 22 00:07:05 compute-0 nova_compute[182935]: 2026-01-22 00:07:05.522 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:06 compute-0 nova_compute[182935]: 2026-01-22 00:07:06.319 182939 INFO nova.compute.manager [None req-9425114f-566f-46b4-b41e-507d3c8e27b3 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Pausing
Jan 22 00:07:06 compute-0 nova_compute[182935]: 2026-01-22 00:07:06.321 182939 DEBUG nova.objects.instance [None req-9425114f-566f-46b4-b41e-507d3c8e27b3 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'flavor' on Instance uuid 70d927e7-875a-426f-a8ee-8e784c4fc8eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:06 compute-0 nova_compute[182935]: 2026-01-22 00:07:06.364 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040426.3642719, 70d927e7-875a-426f-a8ee-8e784c4fc8eb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:06 compute-0 nova_compute[182935]: 2026-01-22 00:07:06.364 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] VM Paused (Lifecycle Event)
Jan 22 00:07:06 compute-0 nova_compute[182935]: 2026-01-22 00:07:06.366 182939 DEBUG nova.compute.manager [None req-9425114f-566f-46b4-b41e-507d3c8e27b3 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:06 compute-0 nova_compute[182935]: 2026-01-22 00:07:06.391 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:06 compute-0 nova_compute[182935]: 2026-01-22 00:07:06.396 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:07:06 compute-0 nova_compute[182935]: 2026-01-22 00:07:06.425 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] During sync_power_state the instance has a pending task (pausing). Skip.
Jan 22 00:07:06 compute-0 nova_compute[182935]: 2026-01-22 00:07:06.696 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:06 compute-0 podman[228349]: 2026-01-22 00:07:06.705535764 +0000 UTC m=+0.069147163 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 00:07:06 compute-0 podman[228348]: 2026-01-22 00:07:06.705693378 +0000 UTC m=+0.075624823 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.200 182939 DEBUG oslo_concurrency.lockutils [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.200 182939 DEBUG oslo_concurrency.lockutils [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.200 182939 DEBUG oslo_concurrency.lockutils [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.200 182939 DEBUG oslo_concurrency.lockutils [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.201 182939 DEBUG oslo_concurrency.lockutils [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.210 182939 INFO nova.compute.manager [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Terminating instance
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.219 182939 DEBUG nova.compute.manager [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:07:10 compute-0 kernel: tapc893135f-e3 (unregistering): left promiscuous mode
Jan 22 00:07:10 compute-0 NetworkManager[55139]: <info>  [1769040430.2412] device (tapc893135f-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.251 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:10 compute-0 ovn_controller[95047]: 2026-01-22T00:07:10Z|00415|binding|INFO|Releasing lport c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 from this chassis (sb_readonly=0)
Jan 22 00:07:10 compute-0 ovn_controller[95047]: 2026-01-22T00:07:10Z|00416|binding|INFO|Setting lport c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 down in Southbound
Jan 22 00:07:10 compute-0 ovn_controller[95047]: 2026-01-22T00:07:10Z|00417|binding|INFO|Removing iface tapc893135f-e3 ovn-installed in OVS
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.256 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.265 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:83:62 10.100.0.11'], port_security=['fa:16:3e:ea:83:62 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '70d927e7-875a-426f-a8ee-8e784c4fc8eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c893135f-e355-4d0e-abdf-c6a8ca3cf2d5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.269 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e unbound from our chassis
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.269 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.270 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d94993bc-77ac-42d2-88cb-3b0110dff29e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.271 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ba07f59f-df12-4a5b-b97c-f07f4b793fa2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.272 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace which is not needed anymore
Jan 22 00:07:10 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000063.scope: Deactivated successfully.
Jan 22 00:07:10 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000063.scope: Consumed 5.012s CPU time.
Jan 22 00:07:10 compute-0 systemd-machined[154182]: Machine qemu-54-instance-00000063 terminated.
Jan 22 00:07:10 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228313]: [NOTICE]   (228317) : haproxy version is 2.8.14-c23fe91
Jan 22 00:07:10 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228313]: [NOTICE]   (228317) : path to executable is /usr/sbin/haproxy
Jan 22 00:07:10 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228313]: [WARNING]  (228317) : Exiting Master process...
Jan 22 00:07:10 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228313]: [ALERT]    (228317) : Current worker (228319) exited with code 143 (Terminated)
Jan 22 00:07:10 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228313]: [WARNING]  (228317) : All workers exited. Exiting... (0)
Jan 22 00:07:10 compute-0 systemd[1]: libpod-77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794.scope: Deactivated successfully.
Jan 22 00:07:10 compute-0 podman[228408]: 2026-01-22 00:07:10.403826554 +0000 UTC m=+0.047391768 container died 77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:07:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794-userdata-shm.mount: Deactivated successfully.
Jan 22 00:07:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-52f5bee79e5323aa5a2f05dfa7ca2206649ee05b736080df6fbc684bc60946f2-merged.mount: Deactivated successfully.
Jan 22 00:07:10 compute-0 podman[228408]: 2026-01-22 00:07:10.440359954 +0000 UTC m=+0.083925158 container cleanup 77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.446 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.453 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:10 compute-0 systemd[1]: libpod-conmon-77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794.scope: Deactivated successfully.
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.493 182939 INFO nova.virt.libvirt.driver [-] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Instance destroyed successfully.
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.494 182939 DEBUG nova.objects.instance [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'resources' on Instance uuid 70d927e7-875a-426f-a8ee-8e784c4fc8eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:10 compute-0 podman[228442]: 2026-01-22 00:07:10.507932927 +0000 UTC m=+0.047141962 container remove 77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.512 182939 DEBUG nova.virt.libvirt.vif [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:06:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1064039032',display_name='tempest-DeleteServersTestJSON-server-1064039032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1064039032',id=99,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-5whcwyck',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:07:06Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=70d927e7-875a-426f-a8ee-8e784c4fc8eb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "address": "fa:16:3e:ea:83:62", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc893135f-e3", "ovs_interfaceid": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.513 182939 DEBUG nova.network.os_vif_util [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "address": "fa:16:3e:ea:83:62", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc893135f-e3", "ovs_interfaceid": "c893135f-e355-4d0e-abdf-c6a8ca3cf2d5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.512 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[88a573ae-d7c2-4722-9330-27e61e27fa2f]: (4, ('Thu Jan 22 12:07:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794)\n77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794\nThu Jan 22 12:07:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794)\n77e9e6f79d47081238ebe5e8fcf8cbc12a4b99d6b753394c210798fb42f65794\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.513 182939 DEBUG nova.network.os_vif_util [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:83:62,bridge_name='br-int',has_traffic_filtering=True,id=c893135f-e355-4d0e-abdf-c6a8ca3cf2d5,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc893135f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.514 182939 DEBUG os_vif [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:83:62,bridge_name='br-int',has_traffic_filtering=True,id=c893135f-e355-4d0e-abdf-c6a8ca3cf2d5,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc893135f-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.514 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[df81c5d0-11e4-4aa5-a85d-02049b916ced]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.515 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.515 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.516 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc893135f-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.517 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:10 compute-0 kernel: tapd94993bc-70: left promiscuous mode
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.520 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.529 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.531 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f528d220-7726-42e1-adde-204e84f24685]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.532 182939 INFO os_vif [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:83:62,bridge_name='br-int',has_traffic_filtering=True,id=c893135f-e355-4d0e-abdf-c6a8ca3cf2d5,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc893135f-e3')
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.533 182939 INFO nova.virt.libvirt.driver [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Deleting instance files /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb_del
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.533 182939 INFO nova.virt.libvirt.driver [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Deletion of /var/lib/nova/instances/70d927e7-875a-426f-a8ee-8e784c4fc8eb_del complete
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.553 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[679eb7d5-e647-4642-9002-330f1497fe15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.554 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c9f8c296-c06e-489f-a954-8de8473f25f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.572 182939 DEBUG nova.compute.manager [req-a153cd85-a7c1-4238-a53d-1a1a3583733a req-42c8205d-d9b6-4c43-87a1-063de0e1f369 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Received event network-vif-unplugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.572 182939 DEBUG oslo_concurrency.lockutils [req-a153cd85-a7c1-4238-a53d-1a1a3583733a req-42c8205d-d9b6-4c43-87a1-063de0e1f369 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.573 182939 DEBUG oslo_concurrency.lockutils [req-a153cd85-a7c1-4238-a53d-1a1a3583733a req-42c8205d-d9b6-4c43-87a1-063de0e1f369 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.573 182939 DEBUG oslo_concurrency.lockutils [req-a153cd85-a7c1-4238-a53d-1a1a3583733a req-42c8205d-d9b6-4c43-87a1-063de0e1f369 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.573 182939 DEBUG nova.compute.manager [req-a153cd85-a7c1-4238-a53d-1a1a3583733a req-42c8205d-d9b6-4c43-87a1-063de0e1f369 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] No waiting events found dispatching network-vif-unplugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.573 182939 DEBUG nova.compute.manager [req-a153cd85-a7c1-4238-a53d-1a1a3583733a req-42c8205d-d9b6-4c43-87a1-063de0e1f369 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Received event network-vif-unplugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.573 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b70304f3-8bbb-4d9e-b30d-89d73eaa35d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492412, 'reachable_time': 18741, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228466, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.576 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:07:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:10.576 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[08efcae1-2428-409d-bb51-2ab7125c6383]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:10 compute-0 systemd[1]: run-netns-ovnmeta\x2dd94993bc\x2d77ac\x2d42d2\x2d88cb\x2d3b0110dff29e.mount: Deactivated successfully.
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.636 182939 INFO nova.compute.manager [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.637 182939 DEBUG oslo.service.loopingcall [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.637 182939 DEBUG nova.compute.manager [-] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:07:10 compute-0 nova_compute[182935]: 2026-01-22 00:07:10.638 182939 DEBUG nova.network.neutron [-] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:07:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:11.091 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:07:11 compute-0 nova_compute[182935]: 2026-01-22 00:07:11.091 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:11.093 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:07:11 compute-0 nova_compute[182935]: 2026-01-22 00:07:11.700 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:12 compute-0 nova_compute[182935]: 2026-01-22 00:07:12.151 182939 DEBUG nova.network.neutron [-] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:12 compute-0 nova_compute[182935]: 2026-01-22 00:07:12.187 182939 INFO nova.compute.manager [-] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Took 1.55 seconds to deallocate network for instance.
Jan 22 00:07:12 compute-0 nova_compute[182935]: 2026-01-22 00:07:12.230 182939 DEBUG nova.compute.manager [req-4861a084-eb67-4100-9bd1-c564fb55b030 req-1601b422-6c26-46c8-9090-362ff02d79e1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Received event network-vif-deleted-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:12 compute-0 nova_compute[182935]: 2026-01-22 00:07:12.277 182939 DEBUG oslo_concurrency.lockutils [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:12 compute-0 nova_compute[182935]: 2026-01-22 00:07:12.278 182939 DEBUG oslo_concurrency.lockutils [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:12 compute-0 nova_compute[182935]: 2026-01-22 00:07:12.332 182939 DEBUG nova.compute.provider_tree [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:07:12 compute-0 nova_compute[182935]: 2026-01-22 00:07:12.348 182939 DEBUG nova.scheduler.client.report [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:07:12 compute-0 nova_compute[182935]: 2026-01-22 00:07:12.368 182939 DEBUG oslo_concurrency.lockutils [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:12 compute-0 nova_compute[182935]: 2026-01-22 00:07:12.394 182939 INFO nova.scheduler.client.report [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Deleted allocations for instance 70d927e7-875a-426f-a8ee-8e784c4fc8eb
Jan 22 00:07:12 compute-0 nova_compute[182935]: 2026-01-22 00:07:12.476 182939 DEBUG oslo_concurrency.lockutils [None req-06d1ea17-3803-4034-bd9f-746c1b6275ea 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:13 compute-0 nova_compute[182935]: 2026-01-22 00:07:13.160 182939 DEBUG nova.compute.manager [req-cfac9d46-b58b-4ddb-9b5c-620b2fab75a3 req-07dba090-b147-4408-99d3-45bff1264540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Received event network-vif-plugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:13 compute-0 nova_compute[182935]: 2026-01-22 00:07:13.160 182939 DEBUG oslo_concurrency.lockutils [req-cfac9d46-b58b-4ddb-9b5c-620b2fab75a3 req-07dba090-b147-4408-99d3-45bff1264540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:13 compute-0 nova_compute[182935]: 2026-01-22 00:07:13.161 182939 DEBUG oslo_concurrency.lockutils [req-cfac9d46-b58b-4ddb-9b5c-620b2fab75a3 req-07dba090-b147-4408-99d3-45bff1264540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:13 compute-0 nova_compute[182935]: 2026-01-22 00:07:13.161 182939 DEBUG oslo_concurrency.lockutils [req-cfac9d46-b58b-4ddb-9b5c-620b2fab75a3 req-07dba090-b147-4408-99d3-45bff1264540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "70d927e7-875a-426f-a8ee-8e784c4fc8eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:13 compute-0 nova_compute[182935]: 2026-01-22 00:07:13.161 182939 DEBUG nova.compute.manager [req-cfac9d46-b58b-4ddb-9b5c-620b2fab75a3 req-07dba090-b147-4408-99d3-45bff1264540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] No waiting events found dispatching network-vif-plugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:13 compute-0 nova_compute[182935]: 2026-01-22 00:07:13.161 182939 WARNING nova.compute.manager [req-cfac9d46-b58b-4ddb-9b5c-620b2fab75a3 req-07dba090-b147-4408-99d3-45bff1264540 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Received unexpected event network-vif-plugged-c893135f-e355-4d0e-abdf-c6a8ca3cf2d5 for instance with vm_state deleted and task_state None.
Jan 22 00:07:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:14.095 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:15 compute-0 nova_compute[182935]: 2026-01-22 00:07:15.517 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:15 compute-0 sshd-session[228467]: Invalid user svn from 188.166.69.60 port 51940
Jan 22 00:07:15 compute-0 sshd-session[228467]: Connection closed by invalid user svn 188.166.69.60 port 51940 [preauth]
Jan 22 00:07:16 compute-0 nova_compute[182935]: 2026-01-22 00:07:16.702 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:16 compute-0 nova_compute[182935]: 2026-01-22 00:07:16.704 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040421.701697, 7f386293-dbac-4fc9-b940-199f991abcc4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:16 compute-0 nova_compute[182935]: 2026-01-22 00:07:16.704 182939 INFO nova.compute.manager [-] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] VM Stopped (Lifecycle Event)
Jan 22 00:07:16 compute-0 nova_compute[182935]: 2026-01-22 00:07:16.779 182939 DEBUG nova.compute.manager [None req-18a445d7-914e-45ca-9be7-d66731e934f8 - - - - - -] [instance: 7f386293-dbac-4fc9-b940-199f991abcc4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:16 compute-0 nova_compute[182935]: 2026-01-22 00:07:16.970 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "59441413-f484-464f-b5e2-f8d3aeb80f83" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:16 compute-0 nova_compute[182935]: 2026-01-22 00:07:16.971 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:16 compute-0 nova_compute[182935]: 2026-01-22 00:07:16.990 182939 DEBUG nova.compute.manager [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.101 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.102 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.109 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.109 182939 INFO nova.compute.claims [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.265 182939 DEBUG nova.compute.provider_tree [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.291 182939 DEBUG nova.scheduler.client.report [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.319 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.320 182939 DEBUG nova.compute.manager [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.398 182939 DEBUG nova.compute.manager [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.398 182939 DEBUG nova.network.neutron [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.437 182939 INFO nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.468 182939 DEBUG nova.compute.manager [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.578 182939 DEBUG nova.compute.manager [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.579 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.580 182939 INFO nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Creating image(s)
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.580 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "/var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.581 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.582 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.593 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.666 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.668 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.669 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.684 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.760 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.761 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.807 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.808 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.809 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.869 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.871 182939 DEBUG nova.virt.disk.api [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Checking if we can resize image /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.871 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.940 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.942 182939 DEBUG nova.virt.disk.api [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Cannot resize image /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.942 182939 DEBUG nova.objects.instance [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'migration_context' on Instance uuid 59441413-f484-464f-b5e2-f8d3aeb80f83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.963 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.963 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Ensure instance console log exists: /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.964 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.964 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:17 compute-0 nova_compute[182935]: 2026-01-22 00:07:17.964 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:18 compute-0 nova_compute[182935]: 2026-01-22 00:07:18.047 182939 DEBUG nova.policy [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:07:20 compute-0 nova_compute[182935]: 2026-01-22 00:07:20.333 182939 DEBUG nova.network.neutron [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Successfully created port: 65f4805a-47f5-4285-af8b-236f66964a00 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:07:20 compute-0 nova_compute[182935]: 2026-01-22 00:07:20.518 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:21 compute-0 nova_compute[182935]: 2026-01-22 00:07:21.304 182939 DEBUG nova.network.neutron [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Successfully updated port: 65f4805a-47f5-4285-af8b-236f66964a00 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:07:21 compute-0 nova_compute[182935]: 2026-01-22 00:07:21.317 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:21 compute-0 nova_compute[182935]: 2026-01-22 00:07:21.318 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquired lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:21 compute-0 nova_compute[182935]: 2026-01-22 00:07:21.318 182939 DEBUG nova.network.neutron [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:07:21 compute-0 nova_compute[182935]: 2026-01-22 00:07:21.477 182939 DEBUG nova.compute.manager [req-39a3cf02-705c-497a-be92-91d230443337 req-b2d0fa31-b8d2-4bcd-b12b-329b5ee797bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Received event network-changed-65f4805a-47f5-4285-af8b-236f66964a00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:21 compute-0 nova_compute[182935]: 2026-01-22 00:07:21.478 182939 DEBUG nova.compute.manager [req-39a3cf02-705c-497a-be92-91d230443337 req-b2d0fa31-b8d2-4bcd-b12b-329b5ee797bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Refreshing instance network info cache due to event network-changed-65f4805a-47f5-4285-af8b-236f66964a00. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:07:21 compute-0 nova_compute[182935]: 2026-01-22 00:07:21.478 182939 DEBUG oslo_concurrency.lockutils [req-39a3cf02-705c-497a-be92-91d230443337 req-b2d0fa31-b8d2-4bcd-b12b-329b5ee797bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:21 compute-0 nova_compute[182935]: 2026-01-22 00:07:21.689 182939 DEBUG nova.network.neutron [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:07:21 compute-0 nova_compute[182935]: 2026-01-22 00:07:21.704 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:21 compute-0 podman[228485]: 2026-01-22 00:07:21.712076276 +0000 UTC m=+0.069307812 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:07:21 compute-0 podman[228484]: 2026-01-22 00:07:21.747241519 +0000 UTC m=+0.115877499 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:07:21 compute-0 nova_compute[182935]: 2026-01-22 00:07:21.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.021 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.021 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.041 182939 DEBUG nova.compute.manager [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.148 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.148 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.157 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.158 182939 INFO nova.compute.claims [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.381 182939 DEBUG nova.compute.provider_tree [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.399 182939 DEBUG nova.scheduler.client.report [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.424 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.425 182939 DEBUG nova.compute.manager [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.512 182939 DEBUG nova.compute.manager [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.512 182939 DEBUG nova.network.neutron [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.530 182939 INFO nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.546 182939 DEBUG nova.network.neutron [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Updating instance_info_cache with network_info: [{"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.557 182939 DEBUG nova.compute.manager [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.565 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Releasing lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.565 182939 DEBUG nova.compute.manager [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Instance network_info: |[{"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.566 182939 DEBUG oslo_concurrency.lockutils [req-39a3cf02-705c-497a-be92-91d230443337 req-b2d0fa31-b8d2-4bcd-b12b-329b5ee797bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.566 182939 DEBUG nova.network.neutron [req-39a3cf02-705c-497a-be92-91d230443337 req-b2d0fa31-b8d2-4bcd-b12b-329b5ee797bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Refreshing network info cache for port 65f4805a-47f5-4285-af8b-236f66964a00 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.568 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Start _get_guest_xml network_info=[{"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.573 182939 WARNING nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.578 182939 DEBUG nova.virt.libvirt.host [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.578 182939 DEBUG nova.virt.libvirt.host [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.581 182939 DEBUG nova.virt.libvirt.host [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.581 182939 DEBUG nova.virt.libvirt.host [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.582 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.583 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.583 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.583 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.583 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.584 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.584 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.584 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.584 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.584 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.585 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.585 182939 DEBUG nova.virt.hardware [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.588 182939 DEBUG nova.virt.libvirt.vif [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:07:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-181402284',display_name='tempest-DeleteServersTestJSON-server-181402284',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-181402284',id=101,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-qws46uci',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:07:17Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=59441413-f484-464f-b5e2-f8d3aeb80f83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.588 182939 DEBUG nova.network.os_vif_util [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.589 182939 DEBUG nova.network.os_vif_util [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:97:19,bridge_name='br-int',has_traffic_filtering=True,id=65f4805a-47f5-4285-af8b-236f66964a00,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f4805a-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.590 182939 DEBUG nova.objects.instance [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 59441413-f484-464f-b5e2-f8d3aeb80f83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.619 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:07:22 compute-0 nova_compute[182935]:   <uuid>59441413-f484-464f-b5e2-f8d3aeb80f83</uuid>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   <name>instance-00000065</name>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <nova:name>tempest-DeleteServersTestJSON-server-181402284</nova:name>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:07:22</nova:creationTime>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:07:22 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:07:22 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:07:22 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:07:22 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:07:22 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:07:22 compute-0 nova_compute[182935]:         <nova:user uuid="74ad1bf274924c52af96aa4c6d431410">tempest-DeleteServersTestJSON-2033458913-project-member</nova:user>
Jan 22 00:07:22 compute-0 nova_compute[182935]:         <nova:project uuid="3822e32efd5647aebf2d79a3dd038bd4">tempest-DeleteServersTestJSON-2033458913</nova:project>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:07:22 compute-0 nova_compute[182935]:         <nova:port uuid="65f4805a-47f5-4285-af8b-236f66964a00">
Jan 22 00:07:22 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <system>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <entry name="serial">59441413-f484-464f-b5e2-f8d3aeb80f83</entry>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <entry name="uuid">59441413-f484-464f-b5e2-f8d3aeb80f83</entry>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     </system>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   <os>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   </os>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   <features>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   </features>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk.config"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:0e:97:19"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <target dev="tap65f4805a-47"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/console.log" append="off"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <video>
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     </video>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:07:22 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:07:22 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:07:22 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:07:22 compute-0 nova_compute[182935]: </domain>
Jan 22 00:07:22 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.620 182939 DEBUG nova.compute.manager [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Preparing to wait for external event network-vif-plugged-65f4805a-47f5-4285-af8b-236f66964a00 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.620 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.621 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.621 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.622 182939 DEBUG nova.virt.libvirt.vif [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:07:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-181402284',display_name='tempest-DeleteServersTestJSON-server-181402284',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-181402284',id=101,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-qws46uci',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:07:17Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=59441413-f484-464f-b5e2-f8d3aeb80f83,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.622 182939 DEBUG nova.network.os_vif_util [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.622 182939 DEBUG nova.network.os_vif_util [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:97:19,bridge_name='br-int',has_traffic_filtering=True,id=65f4805a-47f5-4285-af8b-236f66964a00,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f4805a-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.623 182939 DEBUG os_vif [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:97:19,bridge_name='br-int',has_traffic_filtering=True,id=65f4805a-47f5-4285-af8b-236f66964a00,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f4805a-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.623 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.624 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.624 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.627 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.627 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65f4805a-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.627 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap65f4805a-47, col_values=(('external_ids', {'iface-id': '65f4805a-47f5-4285-af8b-236f66964a00', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:97:19', 'vm-uuid': '59441413-f484-464f-b5e2-f8d3aeb80f83'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.629 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:22 compute-0 NetworkManager[55139]: <info>  [1769040442.6302] manager: (tap65f4805a-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.631 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.639 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.641 182939 INFO os_vif [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:97:19,bridge_name='br-int',has_traffic_filtering=True,id=65f4805a-47f5-4285-af8b-236f66964a00,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f4805a-47')
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.704 182939 DEBUG nova.compute.manager [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.705 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.705 182939 INFO nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Creating image(s)
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.706 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.706 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.707 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.719 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.741 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.741 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.742 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No VIF found with MAC fa:16:3e:0e:97:19, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.742 182939 INFO nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Using config drive
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.777 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.778 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.779 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.789 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.841 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.842 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.880 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.882 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.882 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.949 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.951 182939 DEBUG nova.virt.disk.api [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:07:22 compute-0 nova_compute[182935]: 2026-01-22 00:07:22.951 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.006 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.007 182939 DEBUG nova.virt.disk.api [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.008 182939 DEBUG nova.objects.instance [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 19d62f20-26c4-46d6-ad9f-0ad16c60d542 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.026 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.027 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Ensure instance console log exists: /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.027 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.028 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.028 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.030 182939 DEBUG nova.policy [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.188 182939 INFO nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Creating config drive at /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk.config
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.193 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvsxw17yb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.320 182939 DEBUG oslo_concurrency.processutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvsxw17yb" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:23 compute-0 kernel: tap65f4805a-47: entered promiscuous mode
Jan 22 00:07:23 compute-0 NetworkManager[55139]: <info>  [1769040443.3860] manager: (tap65f4805a-47): new Tun device (/org/freedesktop/NetworkManager/Devices/194)
Jan 22 00:07:23 compute-0 ovn_controller[95047]: 2026-01-22T00:07:23Z|00418|binding|INFO|Claiming lport 65f4805a-47f5-4285-af8b-236f66964a00 for this chassis.
Jan 22 00:07:23 compute-0 ovn_controller[95047]: 2026-01-22T00:07:23Z|00419|binding|INFO|65f4805a-47f5-4285-af8b-236f66964a00: Claiming fa:16:3e:0e:97:19 10.100.0.5
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.388 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.395 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:97:19 10.100.0.5'], port_security=['fa:16:3e:0e:97:19 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '59441413-f484-464f-b5e2-f8d3aeb80f83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=65f4805a-47f5-4285-af8b-236f66964a00) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.397 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 65f4805a-47f5-4285-af8b-236f66964a00 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e bound to our chassis
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.398 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:07:23 compute-0 ovn_controller[95047]: 2026-01-22T00:07:23Z|00420|binding|INFO|Setting lport 65f4805a-47f5-4285-af8b-236f66964a00 ovn-installed in OVS
Jan 22 00:07:23 compute-0 ovn_controller[95047]: 2026-01-22T00:07:23Z|00421|binding|INFO|Setting lport 65f4805a-47f5-4285-af8b-236f66964a00 up in Southbound
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.405 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.408 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.411 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[743b9e6b-e6f1-4332-904f-7945b97c6124]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.412 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd94993bc-71 in ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.415 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd94993bc-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.415 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a5368c9b-a4a1-48cb-8f5b-873cd9cbfa20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.416 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cb83dbbc-357b-45e5-8958-b8644c20c332]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 systemd-udevd[228570]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:07:23 compute-0 systemd-machined[154182]: New machine qemu-55-instance-00000065.
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.427 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[bc63f4b0-59e4-4d0c-9ec4-07eabd39ccc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 NetworkManager[55139]: <info>  [1769040443.4325] device (tap65f4805a-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:07:23 compute-0 NetworkManager[55139]: <info>  [1769040443.4332] device (tap65f4805a-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:07:23 compute-0 systemd[1]: Started Virtual Machine qemu-55-instance-00000065.
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.451 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9847d6a9-3e21-4112-bba3-1ad24a83186f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.480 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[889e2999-1cc4-438b-a289-05c1cedec057]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 NetworkManager[55139]: <info>  [1769040443.4884] manager: (tapd94993bc-70): new Veth device (/org/freedesktop/NetworkManager/Devices/195)
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.487 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[72d9390e-6f55-4856-b035-4cff28bdd169]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 systemd-udevd[228575]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.520 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[6eae0c6c-eafc-438a-b64f-fa814173e3f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.524 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[76bc2947-2a8f-401b-a188-af817c423bbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 NetworkManager[55139]: <info>  [1769040443.5494] device (tapd94993bc-70): carrier: link connected
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.556 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c5900761-c851-4564-aa39-37771841e070]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.572 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4e4e5e-bdb5-41b3-8e8e-1ea010feb203]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494628, 'reachable_time': 15554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228606, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.589 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1248db40-4468-42a6-9e95-caaa4b1da08a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:eecd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494628, 'tstamp': 494628}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228607, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.607 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4f6c08c4-51f8-48d9-8145-8aac7bc544b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494628, 'reachable_time': 15554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228608, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.639 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[371bda19-b5a7-4397-bf6c-3e8643bd9210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.695 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b4d3fa-8d7c-4d74-a03e-96e4e8f3b8e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.697 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.697 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.697 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd94993bc-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:23 compute-0 NetworkManager[55139]: <info>  [1769040443.7216] manager: (tapd94993bc-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/196)
Jan 22 00:07:23 compute-0 kernel: tapd94993bc-70: entered promiscuous mode
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.724 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd94993bc-70, col_values=(('external_ids', {'iface-id': 'd921ee25-8f8a-4375-9839-6c54ab328e88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.725 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:23 compute-0 ovn_controller[95047]: 2026-01-22T00:07:23Z|00422|binding|INFO|Releasing lport d921ee25-8f8a-4375-9839-6c54ab328e88 from this chassis (sb_readonly=0)
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.728 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.737 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.737 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[719d85b3-a1cb-4eb0-ac7d-298c3a77acbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.739 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:07:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:23.740 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'env', 'PROCESS_TAG=haproxy-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d94993bc-77ac-42d2-88cb-3b0110dff29e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:07:23 compute-0 nova_compute[182935]: 2026-01-22 00:07:23.807 182939 DEBUG nova.network.neutron [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Successfully created port: ae476462-b965-4dea-8a2a-9275391da91f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:07:24 compute-0 podman[228640]: 2026-01-22 00:07:24.17712219 +0000 UTC m=+0.079458118 container create d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 00:07:24 compute-0 systemd[1]: Started libpod-conmon-d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58.scope.
Jan 22 00:07:24 compute-0 podman[228640]: 2026-01-22 00:07:24.13515341 +0000 UTC m=+0.037489418 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:07:24 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.242 182939 DEBUG nova.compute.manager [req-8ad83003-c879-4ce4-8c65-d59197c05423 req-01030243-7854-4372-8a03-a5998caa4a61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Received event network-vif-plugged-65f4805a-47f5-4285-af8b-236f66964a00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.242 182939 DEBUG oslo_concurrency.lockutils [req-8ad83003-c879-4ce4-8c65-d59197c05423 req-01030243-7854-4372-8a03-a5998caa4a61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.242 182939 DEBUG oslo_concurrency.lockutils [req-8ad83003-c879-4ce4-8c65-d59197c05423 req-01030243-7854-4372-8a03-a5998caa4a61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.243 182939 DEBUG oslo_concurrency.lockutils [req-8ad83003-c879-4ce4-8c65-d59197c05423 req-01030243-7854-4372-8a03-a5998caa4a61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.243 182939 DEBUG nova.compute.manager [req-8ad83003-c879-4ce4-8c65-d59197c05423 req-01030243-7854-4372-8a03-a5998caa4a61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Processing event network-vif-plugged-65f4805a-47f5-4285-af8b-236f66964a00 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:07:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22fef007081a179aa04469f4d1a7bae941c7eeaea48e91b3f64d2b16643876e5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:07:24 compute-0 podman[228640]: 2026-01-22 00:07:24.265673106 +0000 UTC m=+0.168009064 container init d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 00:07:24 compute-0 podman[228640]: 2026-01-22 00:07:24.27188161 +0000 UTC m=+0.174217538 container start d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:07:24 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228656]: [NOTICE]   (228660) : New worker (228662) forked
Jan 22 00:07:24 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228656]: [NOTICE]   (228660) : Loading success.
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.420 182939 DEBUG nova.compute.manager [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.421 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040444.4200804, 59441413-f484-464f-b5e2-f8d3aeb80f83 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.422 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] VM Started (Lifecycle Event)
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.425 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.429 182939 INFO nova.virt.libvirt.driver [-] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Instance spawned successfully.
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.429 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.444 182939 DEBUG nova.network.neutron [req-39a3cf02-705c-497a-be92-91d230443337 req-b2d0fa31-b8d2-4bcd-b12b-329b5ee797bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Updated VIF entry in instance network info cache for port 65f4805a-47f5-4285-af8b-236f66964a00. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.445 182939 DEBUG nova.network.neutron [req-39a3cf02-705c-497a-be92-91d230443337 req-b2d0fa31-b8d2-4bcd-b12b-329b5ee797bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Updating instance_info_cache with network_info: [{"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.454 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.461 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.467 182939 DEBUG oslo_concurrency.lockutils [req-39a3cf02-705c-497a-be92-91d230443337 req-b2d0fa31-b8d2-4bcd-b12b-329b5ee797bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.471 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.471 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.472 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.472 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.473 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.473 182939 DEBUG nova.virt.libvirt.driver [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.505 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.506 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040444.4214146, 59441413-f484-464f-b5e2-f8d3aeb80f83 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.506 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] VM Paused (Lifecycle Event)
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.537 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.541 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040444.4243734, 59441413-f484-464f-b5e2-f8d3aeb80f83 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.541 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] VM Resumed (Lifecycle Event)
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.562 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.565 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.594 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.608 182939 INFO nova.compute.manager [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Took 7.03 seconds to spawn the instance on the hypervisor.
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.608 182939 DEBUG nova.compute.manager [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.730 182939 INFO nova.compute.manager [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Took 7.66 seconds to build instance.
Jan 22 00:07:24 compute-0 nova_compute[182935]: 2026-01-22 00:07:24.747 182939 DEBUG oslo_concurrency.lockutils [None req-65464387-1730-4958-835a-c2bd7a8d4fec 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:25 compute-0 nova_compute[182935]: 2026-01-22 00:07:25.122 182939 DEBUG nova.network.neutron [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Successfully updated port: ae476462-b965-4dea-8a2a-9275391da91f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:07:25 compute-0 nova_compute[182935]: 2026-01-22 00:07:25.153 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-19d62f20-26c4-46d6-ad9f-0ad16c60d542" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:25 compute-0 nova_compute[182935]: 2026-01-22 00:07:25.153 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-19d62f20-26c4-46d6-ad9f-0ad16c60d542" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:25 compute-0 nova_compute[182935]: 2026-01-22 00:07:25.153 182939 DEBUG nova.network.neutron [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:07:25 compute-0 nova_compute[182935]: 2026-01-22 00:07:25.260 182939 DEBUG nova.compute.manager [req-1381cc4d-2621-4be0-854f-86c6a6b7c851 req-28cfbc60-5ebc-4e9e-b406-7587ebf1dd84 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Received event network-changed-ae476462-b965-4dea-8a2a-9275391da91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:25 compute-0 nova_compute[182935]: 2026-01-22 00:07:25.260 182939 DEBUG nova.compute.manager [req-1381cc4d-2621-4be0-854f-86c6a6b7c851 req-28cfbc60-5ebc-4e9e-b406-7587ebf1dd84 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Refreshing instance network info cache due to event network-changed-ae476462-b965-4dea-8a2a-9275391da91f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:07:25 compute-0 nova_compute[182935]: 2026-01-22 00:07:25.261 182939 DEBUG oslo_concurrency.lockutils [req-1381cc4d-2621-4be0-854f-86c6a6b7c851 req-28cfbc60-5ebc-4e9e-b406-7587ebf1dd84 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-19d62f20-26c4-46d6-ad9f-0ad16c60d542" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:25 compute-0 nova_compute[182935]: 2026-01-22 00:07:25.407 182939 DEBUG nova.network.neutron [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:07:25 compute-0 nova_compute[182935]: 2026-01-22 00:07:25.493 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040430.4904068, 70d927e7-875a-426f-a8ee-8e784c4fc8eb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:25 compute-0 nova_compute[182935]: 2026-01-22 00:07:25.494 182939 INFO nova.compute.manager [-] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] VM Stopped (Lifecycle Event)
Jan 22 00:07:25 compute-0 nova_compute[182935]: 2026-01-22 00:07:25.525 182939 DEBUG nova.compute.manager [None req-afba409b-9a06-4b78-b46e-6ba29350f0c1 - - - - - -] [instance: 70d927e7-875a-426f-a8ee-8e784c4fc8eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.343 182939 DEBUG nova.compute.manager [req-36afed03-b2d8-4d9c-858e-647a41b71e61 req-fe7f42db-65ed-4b0c-8234-112ee3b339a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Received event network-vif-plugged-65f4805a-47f5-4285-af8b-236f66964a00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.343 182939 DEBUG oslo_concurrency.lockutils [req-36afed03-b2d8-4d9c-858e-647a41b71e61 req-fe7f42db-65ed-4b0c-8234-112ee3b339a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.343 182939 DEBUG oslo_concurrency.lockutils [req-36afed03-b2d8-4d9c-858e-647a41b71e61 req-fe7f42db-65ed-4b0c-8234-112ee3b339a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.344 182939 DEBUG oslo_concurrency.lockutils [req-36afed03-b2d8-4d9c-858e-647a41b71e61 req-fe7f42db-65ed-4b0c-8234-112ee3b339a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.344 182939 DEBUG nova.compute.manager [req-36afed03-b2d8-4d9c-858e-647a41b71e61 req-fe7f42db-65ed-4b0c-8234-112ee3b339a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] No waiting events found dispatching network-vif-plugged-65f4805a-47f5-4285-af8b-236f66964a00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.344 182939 WARNING nova.compute.manager [req-36afed03-b2d8-4d9c-858e-647a41b71e61 req-fe7f42db-65ed-4b0c-8234-112ee3b339a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Received unexpected event network-vif-plugged-65f4805a-47f5-4285-af8b-236f66964a00 for instance with vm_state active and task_state shelving.
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.569 182939 DEBUG oslo_concurrency.lockutils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "59441413-f484-464f-b5e2-f8d3aeb80f83" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.569 182939 DEBUG oslo_concurrency.lockutils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.570 182939 INFO nova.compute.manager [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Shelving
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.621 182939 DEBUG nova.network.neutron [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Updating instance_info_cache with network_info: [{"id": "ae476462-b965-4dea-8a2a-9275391da91f", "address": "fa:16:3e:e2:b9:87", "network": {"id": "88aa5d18-f337-47b2-8592-39b5aa8263f7", "bridge": "br-int", "label": "tempest-network-smoke--210221422", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae476462-b9", "ovs_interfaceid": "ae476462-b965-4dea-8a2a-9275391da91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.656 182939 DEBUG nova.virt.libvirt.driver [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.665 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-19d62f20-26c4-46d6-ad9f-0ad16c60d542" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.666 182939 DEBUG nova.compute.manager [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Instance network_info: |[{"id": "ae476462-b965-4dea-8a2a-9275391da91f", "address": "fa:16:3e:e2:b9:87", "network": {"id": "88aa5d18-f337-47b2-8592-39b5aa8263f7", "bridge": "br-int", "label": "tempest-network-smoke--210221422", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae476462-b9", "ovs_interfaceid": "ae476462-b965-4dea-8a2a-9275391da91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.667 182939 DEBUG oslo_concurrency.lockutils [req-1381cc4d-2621-4be0-854f-86c6a6b7c851 req-28cfbc60-5ebc-4e9e-b406-7587ebf1dd84 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-19d62f20-26c4-46d6-ad9f-0ad16c60d542" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.667 182939 DEBUG nova.network.neutron [req-1381cc4d-2621-4be0-854f-86c6a6b7c851 req-28cfbc60-5ebc-4e9e-b406-7587ebf1dd84 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Refreshing network info cache for port ae476462-b965-4dea-8a2a-9275391da91f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.670 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Start _get_guest_xml network_info=[{"id": "ae476462-b965-4dea-8a2a-9275391da91f", "address": "fa:16:3e:e2:b9:87", "network": {"id": "88aa5d18-f337-47b2-8592-39b5aa8263f7", "bridge": "br-int", "label": "tempest-network-smoke--210221422", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae476462-b9", "ovs_interfaceid": "ae476462-b965-4dea-8a2a-9275391da91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.675 182939 WARNING nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.682 182939 DEBUG nova.virt.libvirt.host [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.682 182939 DEBUG nova.virt.libvirt.host [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.688 182939 DEBUG nova.virt.libvirt.host [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.689 182939 DEBUG nova.virt.libvirt.host [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.690 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.690 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.690 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.691 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.691 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.691 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.691 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.692 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.692 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.692 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.692 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.692 182939 DEBUG nova.virt.hardware [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.696 182939 DEBUG nova.virt.libvirt.vif [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:07:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-350797465',display_name='tempest-TestNetworkBasicOps-server-350797465',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-350797465',id=102,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhpGikw3/h40VsG/Jm7LsXsSzYtwE7fCM0r3J6CqAyw+bIH5ldw9RHiK36T6EKitBYk60DQfyUlm6WtmOYvP23GMVLtMXdNYalbfeix+4qL1C0Gq789f9cMRNBkYgWtvQ==',key_name='tempest-TestNetworkBasicOps-538921274',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-j43mk00m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:07:22Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=19d62f20-26c4-46d6-ad9f-0ad16c60d542,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae476462-b965-4dea-8a2a-9275391da91f", "address": "fa:16:3e:e2:b9:87", "network": {"id": "88aa5d18-f337-47b2-8592-39b5aa8263f7", "bridge": "br-int", "label": "tempest-network-smoke--210221422", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae476462-b9", "ovs_interfaceid": "ae476462-b965-4dea-8a2a-9275391da91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.696 182939 DEBUG nova.network.os_vif_util [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "ae476462-b965-4dea-8a2a-9275391da91f", "address": "fa:16:3e:e2:b9:87", "network": {"id": "88aa5d18-f337-47b2-8592-39b5aa8263f7", "bridge": "br-int", "label": "tempest-network-smoke--210221422", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae476462-b9", "ovs_interfaceid": "ae476462-b965-4dea-8a2a-9275391da91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.697 182939 DEBUG nova.network.os_vif_util [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:b9:87,bridge_name='br-int',has_traffic_filtering=True,id=ae476462-b965-4dea-8a2a-9275391da91f,network=Network(88aa5d18-f337-47b2-8592-39b5aa8263f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae476462-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.697 182939 DEBUG nova.objects.instance [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 19d62f20-26c4-46d6-ad9f-0ad16c60d542 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.708 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.724 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:07:26 compute-0 nova_compute[182935]:   <uuid>19d62f20-26c4-46d6-ad9f-0ad16c60d542</uuid>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   <name>instance-00000066</name>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkBasicOps-server-350797465</nova:name>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:07:26</nova:creationTime>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:07:26 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:07:26 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:07:26 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:07:26 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:07:26 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:07:26 compute-0 nova_compute[182935]:         <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:07:26 compute-0 nova_compute[182935]:         <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:07:26 compute-0 nova_compute[182935]:         <nova:port uuid="ae476462-b965-4dea-8a2a-9275391da91f">
Jan 22 00:07:26 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <system>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <entry name="serial">19d62f20-26c4-46d6-ad9f-0ad16c60d542</entry>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <entry name="uuid">19d62f20-26c4-46d6-ad9f-0ad16c60d542</entry>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     </system>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   <os>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   </os>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   <features>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   </features>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk.config"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:e2:b9:87"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <target dev="tapae476462-b9"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/console.log" append="off"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <video>
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     </video>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:07:26 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:07:26 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:07:26 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:07:26 compute-0 nova_compute[182935]: </domain>
Jan 22 00:07:26 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.726 182939 DEBUG nova.compute.manager [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Preparing to wait for external event network-vif-plugged-ae476462-b965-4dea-8a2a-9275391da91f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.726 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.726 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.726 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.727 182939 DEBUG nova.virt.libvirt.vif [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:07:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-350797465',display_name='tempest-TestNetworkBasicOps-server-350797465',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-350797465',id=102,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhpGikw3/h40VsG/Jm7LsXsSzYtwE7fCM0r3J6CqAyw+bIH5ldw9RHiK36T6EKitBYk60DQfyUlm6WtmOYvP23GMVLtMXdNYalbfeix+4qL1C0Gq789f9cMRNBkYgWtvQ==',key_name='tempest-TestNetworkBasicOps-538921274',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-j43mk00m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:07:22Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=19d62f20-26c4-46d6-ad9f-0ad16c60d542,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae476462-b965-4dea-8a2a-9275391da91f", "address": "fa:16:3e:e2:b9:87", "network": {"id": "88aa5d18-f337-47b2-8592-39b5aa8263f7", "bridge": "br-int", "label": "tempest-network-smoke--210221422", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae476462-b9", "ovs_interfaceid": "ae476462-b965-4dea-8a2a-9275391da91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.728 182939 DEBUG nova.network.os_vif_util [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "ae476462-b965-4dea-8a2a-9275391da91f", "address": "fa:16:3e:e2:b9:87", "network": {"id": "88aa5d18-f337-47b2-8592-39b5aa8263f7", "bridge": "br-int", "label": "tempest-network-smoke--210221422", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae476462-b9", "ovs_interfaceid": "ae476462-b965-4dea-8a2a-9275391da91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.728 182939 DEBUG nova.network.os_vif_util [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:b9:87,bridge_name='br-int',has_traffic_filtering=True,id=ae476462-b965-4dea-8a2a-9275391da91f,network=Network(88aa5d18-f337-47b2-8592-39b5aa8263f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae476462-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.729 182939 DEBUG os_vif [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:b9:87,bridge_name='br-int',has_traffic_filtering=True,id=ae476462-b965-4dea-8a2a-9275391da91f,network=Network(88aa5d18-f337-47b2-8592-39b5aa8263f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae476462-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.729 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.730 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.730 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.732 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.733 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae476462-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.733 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapae476462-b9, col_values=(('external_ids', {'iface-id': 'ae476462-b965-4dea-8a2a-9275391da91f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:b9:87', 'vm-uuid': '19d62f20-26c4-46d6-ad9f-0ad16c60d542'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.735 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.737 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:07:26 compute-0 NetworkManager[55139]: <info>  [1769040446.7400] manager: (tapae476462-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.742 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.743 182939 INFO os_vif [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:b9:87,bridge_name='br-int',has_traffic_filtering=True,id=ae476462-b965-4dea-8a2a-9275391da91f,network=Network(88aa5d18-f337-47b2-8592-39b5aa8263f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae476462-b9')
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.824 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.824 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.825 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:e2:b9:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:07:26 compute-0 nova_compute[182935]: 2026-01-22 00:07:26.825 182939 INFO nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Using config drive
Jan 22 00:07:27 compute-0 nova_compute[182935]: 2026-01-22 00:07:27.167 182939 INFO nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Creating config drive at /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk.config
Jan 22 00:07:27 compute-0 nova_compute[182935]: 2026-01-22 00:07:27.171 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmah7xklf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:27 compute-0 nova_compute[182935]: 2026-01-22 00:07:27.326 182939 DEBUG oslo_concurrency.processutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmah7xklf" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:27 compute-0 kernel: tapae476462-b9: entered promiscuous mode
Jan 22 00:07:27 compute-0 NetworkManager[55139]: <info>  [1769040447.4183] manager: (tapae476462-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Jan 22 00:07:27 compute-0 nova_compute[182935]: 2026-01-22 00:07:27.427 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:27 compute-0 ovn_controller[95047]: 2026-01-22T00:07:27Z|00423|binding|INFO|Claiming lport ae476462-b965-4dea-8a2a-9275391da91f for this chassis.
Jan 22 00:07:27 compute-0 ovn_controller[95047]: 2026-01-22T00:07:27Z|00424|binding|INFO|ae476462-b965-4dea-8a2a-9275391da91f: Claiming fa:16:3e:e2:b9:87 10.100.0.20
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.438 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:b9:87 10.100.0.20'], port_security=['fa:16:3e:e2:b9:87 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': '19d62f20-26c4-46d6-ad9f-0ad16c60d542', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88aa5d18-f337-47b2-8592-39b5aa8263f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e8b6cc30-3c91-408f-9475-22e091641435', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7163db1f-5923-487d-80b0-b5662c1fa9e2, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=ae476462-b965-4dea-8a2a-9275391da91f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.441 104408 INFO neutron.agent.ovn.metadata.agent [-] Port ae476462-b965-4dea-8a2a-9275391da91f in datapath 88aa5d18-f337-47b2-8592-39b5aa8263f7 bound to our chassis
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.445 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88aa5d18-f337-47b2-8592-39b5aa8263f7
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.462 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1ece56-2f11-4ce3-9b67-8f11ad286fbd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.464 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88aa5d18-f1 in ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.468 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88aa5d18-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.469 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2002e7-35fd-4eae-a809-d2be119f9aac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.470 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[aa0057c1-d0a7-4115-ad0d-26afd806df85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 systemd-udevd[228700]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:07:27 compute-0 nova_compute[182935]: 2026-01-22 00:07:27.483 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.487 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[2868ba39-d8ce-40be-936e-7c27032fc821]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_controller[95047]: 2026-01-22T00:07:27Z|00425|binding|INFO|Setting lport ae476462-b965-4dea-8a2a-9275391da91f ovn-installed in OVS
Jan 22 00:07:27 compute-0 ovn_controller[95047]: 2026-01-22T00:07:27Z|00426|binding|INFO|Setting lport ae476462-b965-4dea-8a2a-9275391da91f up in Southbound
Jan 22 00:07:27 compute-0 nova_compute[182935]: 2026-01-22 00:07:27.489 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:27 compute-0 systemd-machined[154182]: New machine qemu-56-instance-00000066.
Jan 22 00:07:27 compute-0 NetworkManager[55139]: <info>  [1769040447.5052] device (tapae476462-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:07:27 compute-0 NetworkManager[55139]: <info>  [1769040447.5062] device (tapae476462-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.507 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3c4efe48-2197-4108-a642-17d3bc7c41c9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 systemd[1]: Started Virtual Machine qemu-56-instance-00000066.
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.556 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8bb253-c46f-4347-b756-3a8115713c59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 NetworkManager[55139]: <info>  [1769040447.5641] manager: (tap88aa5d18-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/199)
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.565 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f8dd48-1e06-447d-94c6-bb701082206e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.599 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[298b3a08-76c4-4a77-9887-8aa7ab8c3860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.604 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8f3dcc37-2ef6-490e-a5a4-716339b3937b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 NetworkManager[55139]: <info>  [1769040447.6327] device (tap88aa5d18-f0): carrier: link connected
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.639 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b9c85e34-a3ff-4cd6-8919-66428f6170eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.658 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[09c4b92c-3cd3-45fc-bd66-a5f7ea964ebb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88aa5d18-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:fd:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495037, 'reachable_time': 36763, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228731, 'error': None, 'target': 'ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.678 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[40561f0a-0288-4b34-92b1-69c8f8bee800]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef3:fd1d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 495037, 'tstamp': 495037}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228732, 'error': None, 'target': 'ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.706 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9e2fbeb6-45b6-4362-b5c3-09e61f518fd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88aa5d18-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f3:fd:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495037, 'reachable_time': 36763, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228733, 'error': None, 'target': 'ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.741 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b7f4c8c3-39ae-4abb-ac1d-12da3c1607d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.796 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ac860f80-9c30-4f2e-8c61-d51dbfc6e35d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.798 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88aa5d18-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.798 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.799 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88aa5d18-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:27 compute-0 NetworkManager[55139]: <info>  [1769040447.8017] manager: (tap88aa5d18-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Jan 22 00:07:27 compute-0 nova_compute[182935]: 2026-01-22 00:07:27.801 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:27 compute-0 kernel: tap88aa5d18-f0: entered promiscuous mode
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.805 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88aa5d18-f0, col_values=(('external_ids', {'iface-id': '0de086b1-9b1a-4031-a6d1-e19e36eba182'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:27 compute-0 ovn_controller[95047]: 2026-01-22T00:07:27Z|00427|binding|INFO|Releasing lport 0de086b1-9b1a-4031-a6d1-e19e36eba182 from this chassis (sb_readonly=0)
Jan 22 00:07:27 compute-0 nova_compute[182935]: 2026-01-22 00:07:27.817 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:27 compute-0 nova_compute[182935]: 2026-01-22 00:07:27.820 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.821 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88aa5d18-f337-47b2-8592-39b5aa8263f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88aa5d18-f337-47b2-8592-39b5aa8263f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.822 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b5598280-0e5c-4f9d-aec3-57ce4343afec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.822 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-88aa5d18-f337-47b2-8592-39b5aa8263f7
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/88aa5d18-f337-47b2-8592-39b5aa8263f7.pid.haproxy
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 88aa5d18-f337-47b2-8592-39b5aa8263f7
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:07:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:27.824 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7', 'env', 'PROCESS_TAG=haproxy-88aa5d18-f337-47b2-8592-39b5aa8263f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88aa5d18-f337-47b2-8592-39b5aa8263f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:07:28 compute-0 podman[228762]: 2026-01-22 00:07:28.246425772 +0000 UTC m=+0.092011318 container create 5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:07:28 compute-0 systemd[1]: Started libpod-conmon-5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f.scope.
Jan 22 00:07:28 compute-0 podman[228762]: 2026-01-22 00:07:28.214118525 +0000 UTC m=+0.059704071 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:07:28 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:07:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a195f2a734cdd028291e01046053d49e8a13c5e9eaba7efd9cbc637faec7196/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:07:28 compute-0 podman[228762]: 2026-01-22 00:07:28.352909463 +0000 UTC m=+0.198495079 container init 5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 00:07:28 compute-0 podman[228762]: 2026-01-22 00:07:28.361311707 +0000 UTC m=+0.206897223 container start 5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 00:07:28 compute-0 neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7[228777]: [NOTICE]   (228790) : New worker (228795) forked
Jan 22 00:07:28 compute-0 neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7[228777]: [NOTICE]   (228790) : Loading success.
Jan 22 00:07:28 compute-0 podman[228780]: 2026-01-22 00:07:28.41245566 +0000 UTC m=+0.066132560 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:07:28 compute-0 nova_compute[182935]: 2026-01-22 00:07:28.554 182939 DEBUG nova.network.neutron [req-1381cc4d-2621-4be0-854f-86c6a6b7c851 req-28cfbc60-5ebc-4e9e-b406-7587ebf1dd84 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Updated VIF entry in instance network info cache for port ae476462-b965-4dea-8a2a-9275391da91f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:07:28 compute-0 nova_compute[182935]: 2026-01-22 00:07:28.557 182939 DEBUG nova.network.neutron [req-1381cc4d-2621-4be0-854f-86c6a6b7c851 req-28cfbc60-5ebc-4e9e-b406-7587ebf1dd84 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Updating instance_info_cache with network_info: [{"id": "ae476462-b965-4dea-8a2a-9275391da91f", "address": "fa:16:3e:e2:b9:87", "network": {"id": "88aa5d18-f337-47b2-8592-39b5aa8263f7", "bridge": "br-int", "label": "tempest-network-smoke--210221422", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae476462-b9", "ovs_interfaceid": "ae476462-b965-4dea-8a2a-9275391da91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:28 compute-0 nova_compute[182935]: 2026-01-22 00:07:28.583 182939 DEBUG oslo_concurrency.lockutils [req-1381cc4d-2621-4be0-854f-86c6a6b7c851 req-28cfbc60-5ebc-4e9e-b406-7587ebf1dd84 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-19d62f20-26c4-46d6-ad9f-0ad16c60d542" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:28 compute-0 nova_compute[182935]: 2026-01-22 00:07:28.726 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040448.7252257, 19d62f20-26c4-46d6-ad9f-0ad16c60d542 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:28 compute-0 nova_compute[182935]: 2026-01-22 00:07:28.726 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] VM Started (Lifecycle Event)
Jan 22 00:07:28 compute-0 nova_compute[182935]: 2026-01-22 00:07:28.750 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:28 compute-0 nova_compute[182935]: 2026-01-22 00:07:28.755 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040448.7254593, 19d62f20-26c4-46d6-ad9f-0ad16c60d542 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:28 compute-0 nova_compute[182935]: 2026-01-22 00:07:28.755 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] VM Paused (Lifecycle Event)
Jan 22 00:07:28 compute-0 nova_compute[182935]: 2026-01-22 00:07:28.794 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:28 compute-0 nova_compute[182935]: 2026-01-22 00:07:28.798 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:07:28 compute-0 nova_compute[182935]: 2026-01-22 00:07:28.820 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.409 182939 DEBUG nova.compute.manager [req-15339d00-af49-41c6-9f16-5b0b36778d77 req-6e600c5a-1544-4b2e-b479-f79a77ba9cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Received event network-vif-plugged-ae476462-b965-4dea-8a2a-9275391da91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.409 182939 DEBUG oslo_concurrency.lockutils [req-15339d00-af49-41c6-9f16-5b0b36778d77 req-6e600c5a-1544-4b2e-b479-f79a77ba9cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.409 182939 DEBUG oslo_concurrency.lockutils [req-15339d00-af49-41c6-9f16-5b0b36778d77 req-6e600c5a-1544-4b2e-b479-f79a77ba9cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.410 182939 DEBUG oslo_concurrency.lockutils [req-15339d00-af49-41c6-9f16-5b0b36778d77 req-6e600c5a-1544-4b2e-b479-f79a77ba9cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.410 182939 DEBUG nova.compute.manager [req-15339d00-af49-41c6-9f16-5b0b36778d77 req-6e600c5a-1544-4b2e-b479-f79a77ba9cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Processing event network-vif-plugged-ae476462-b965-4dea-8a2a-9275391da91f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.410 182939 DEBUG nova.compute.manager [req-15339d00-af49-41c6-9f16-5b0b36778d77 req-6e600c5a-1544-4b2e-b479-f79a77ba9cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Received event network-vif-plugged-ae476462-b965-4dea-8a2a-9275391da91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.410 182939 DEBUG oslo_concurrency.lockutils [req-15339d00-af49-41c6-9f16-5b0b36778d77 req-6e600c5a-1544-4b2e-b479-f79a77ba9cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.411 182939 DEBUG oslo_concurrency.lockutils [req-15339d00-af49-41c6-9f16-5b0b36778d77 req-6e600c5a-1544-4b2e-b479-f79a77ba9cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.411 182939 DEBUG oslo_concurrency.lockutils [req-15339d00-af49-41c6-9f16-5b0b36778d77 req-6e600c5a-1544-4b2e-b479-f79a77ba9cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.411 182939 DEBUG nova.compute.manager [req-15339d00-af49-41c6-9f16-5b0b36778d77 req-6e600c5a-1544-4b2e-b479-f79a77ba9cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] No waiting events found dispatching network-vif-plugged-ae476462-b965-4dea-8a2a-9275391da91f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.411 182939 WARNING nova.compute.manager [req-15339d00-af49-41c6-9f16-5b0b36778d77 req-6e600c5a-1544-4b2e-b479-f79a77ba9cf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Received unexpected event network-vif-plugged-ae476462-b965-4dea-8a2a-9275391da91f for instance with vm_state building and task_state spawning.
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.412 182939 DEBUG nova.compute.manager [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.421 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040449.4166086, 19d62f20-26c4-46d6-ad9f-0ad16c60d542 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.421 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] VM Resumed (Lifecycle Event)
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.422 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.430 182939 INFO nova.virt.libvirt.driver [-] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Instance spawned successfully.
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.431 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.660 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.667 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.667 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.668 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.668 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.669 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.669 182939 DEBUG nova.virt.libvirt.driver [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.673 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.725 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.773 182939 INFO nova.compute.manager [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Took 7.07 seconds to spawn the instance on the hypervisor.
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.774 182939 DEBUG nova.compute.manager [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.882 182939 INFO nova.compute.manager [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Took 7.77 seconds to build instance.
Jan 22 00:07:29 compute-0 nova_compute[182935]: 2026-01-22 00:07:29.914 182939 DEBUG oslo_concurrency.lockutils [None req-6c50447f-304e-4551-9c72-ac82d8085a94 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:31 compute-0 nova_compute[182935]: 2026-01-22 00:07:31.712 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:31 compute-0 nova_compute[182935]: 2026-01-22 00:07:31.735 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:33 compute-0 podman[228822]: 2026-01-22 00:07:33.687146712 +0000 UTC m=+0.054312207 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 00:07:33 compute-0 nova_compute[182935]: 2026-01-22 00:07:33.808 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:33 compute-0 nova_compute[182935]: 2026-01-22 00:07:33.809 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:07:33 compute-0 nova_compute[182935]: 2026-01-22 00:07:33.809 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:07:33 compute-0 nova_compute[182935]: 2026-01-22 00:07:33.831 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:33 compute-0 nova_compute[182935]: 2026-01-22 00:07:33.832 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:33 compute-0 nova_compute[182935]: 2026-01-22 00:07:33.832 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:07:33 compute-0 nova_compute[182935]: 2026-01-22 00:07:33.833 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 59441413-f484-464f-b5e2-f8d3aeb80f83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:35 compute-0 nova_compute[182935]: 2026-01-22 00:07:35.892 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Updating instance_info_cache with network_info: [{"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:35 compute-0 nova_compute[182935]: 2026-01-22 00:07:35.923 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:35 compute-0 nova_compute[182935]: 2026-01-22 00:07:35.924 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:07:35 compute-0 nova_compute[182935]: 2026-01-22 00:07:35.925 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:35 compute-0 nova_compute[182935]: 2026-01-22 00:07:35.926 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:35 compute-0 nova_compute[182935]: 2026-01-22 00:07:35.926 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:07:35 compute-0 nova_compute[182935]: 2026-01-22 00:07:35.927 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:35 compute-0 nova_compute[182935]: 2026-01-22 00:07:35.971 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:35 compute-0 nova_compute[182935]: 2026-01-22 00:07:35.972 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:35 compute-0 nova_compute[182935]: 2026-01-22 00:07:35.972 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:35 compute-0 nova_compute[182935]: 2026-01-22 00:07:35.972 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.049 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.148 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.153 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.277 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk --force-share --output=json" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.291 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.362 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.364 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.442 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.657 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:36 compute-0 NetworkManager[55139]: <info>  [1769040456.6581] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/201)
Jan 22 00:07:36 compute-0 NetworkManager[55139]: <info>  [1769040456.6597] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.712 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.713 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5478MB free_disk=73.1264533996582GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.713 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.714 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.730 182939 DEBUG nova.virt.libvirt.driver [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.736 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.797 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:36 compute-0 ovn_controller[95047]: 2026-01-22T00:07:36Z|00428|binding|INFO|Releasing lport d921ee25-8f8a-4375-9839-6c54ab328e88 from this chassis (sb_readonly=0)
Jan 22 00:07:36 compute-0 ovn_controller[95047]: 2026-01-22T00:07:36Z|00429|binding|INFO|Releasing lport 0de086b1-9b1a-4031-a6d1-e19e36eba182 from this chassis (sb_readonly=0)
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.820 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 59441413-f484-464f-b5e2-f8d3aeb80f83 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.821 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 19d62f20-26c4-46d6-ad9f-0ad16c60d542 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.821 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.821 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.831 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.887 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.904 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.934 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:07:36 compute-0 nova_compute[182935]: 2026-01-22 00:07:36.934 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:37 compute-0 ovn_controller[95047]: 2026-01-22T00:07:37Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0e:97:19 10.100.0.5
Jan 22 00:07:37 compute-0 ovn_controller[95047]: 2026-01-22T00:07:37Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0e:97:19 10.100.0.5
Jan 22 00:07:37 compute-0 podman[228868]: 2026-01-22 00:07:37.719153123 +0000 UTC m=+0.083586443 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7)
Jan 22 00:07:37 compute-0 podman[228869]: 2026-01-22 00:07:37.744714313 +0000 UTC m=+0.094480125 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 00:07:37 compute-0 nova_compute[182935]: 2026-01-22 00:07:37.803 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:39 compute-0 nova_compute[182935]: 2026-01-22 00:07:39.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:40 compute-0 nova_compute[182935]: 2026-01-22 00:07:40.127 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:40 compute-0 kernel: tap65f4805a-47 (unregistering): left promiscuous mode
Jan 22 00:07:40 compute-0 NetworkManager[55139]: <info>  [1769040460.1813] device (tap65f4805a-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:07:40 compute-0 nova_compute[182935]: 2026-01-22 00:07:40.193 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:40 compute-0 ovn_controller[95047]: 2026-01-22T00:07:40Z|00430|binding|INFO|Releasing lport 65f4805a-47f5-4285-af8b-236f66964a00 from this chassis (sb_readonly=0)
Jan 22 00:07:40 compute-0 ovn_controller[95047]: 2026-01-22T00:07:40Z|00431|binding|INFO|Setting lport 65f4805a-47f5-4285-af8b-236f66964a00 down in Southbound
Jan 22 00:07:40 compute-0 ovn_controller[95047]: 2026-01-22T00:07:40Z|00432|binding|INFO|Removing iface tap65f4805a-47 ovn-installed in OVS
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.203 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:97:19 10.100.0.5'], port_security=['fa:16:3e:0e:97:19 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '59441413-f484-464f-b5e2-f8d3aeb80f83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=65f4805a-47f5-4285-af8b-236f66964a00) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:07:40 compute-0 nova_compute[182935]: 2026-01-22 00:07:40.209 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.211 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 65f4805a-47f5-4285-af8b-236f66964a00 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e unbound from our chassis
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.215 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d94993bc-77ac-42d2-88cb-3b0110dff29e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.218 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f90740fe-e853-4475-9d03-b31e59d15a1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.221 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace which is not needed anymore
Jan 22 00:07:40 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000065.scope: Deactivated successfully.
Jan 22 00:07:40 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000065.scope: Consumed 14.015s CPU time.
Jan 22 00:07:40 compute-0 systemd-machined[154182]: Machine qemu-55-instance-00000065 terminated.
Jan 22 00:07:40 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228656]: [NOTICE]   (228660) : haproxy version is 2.8.14-c23fe91
Jan 22 00:07:40 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228656]: [NOTICE]   (228660) : path to executable is /usr/sbin/haproxy
Jan 22 00:07:40 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228656]: [WARNING]  (228660) : Exiting Master process...
Jan 22 00:07:40 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228656]: [WARNING]  (228660) : Exiting Master process...
Jan 22 00:07:40 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228656]: [ALERT]    (228660) : Current worker (228662) exited with code 143 (Terminated)
Jan 22 00:07:40 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[228656]: [WARNING]  (228660) : All workers exited. Exiting... (0)
Jan 22 00:07:40 compute-0 systemd[1]: libpod-d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58.scope: Deactivated successfully.
Jan 22 00:07:40 compute-0 podman[228928]: 2026-01-22 00:07:40.378397025 +0000 UTC m=+0.047238913 container died d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:07:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58-userdata-shm.mount: Deactivated successfully.
Jan 22 00:07:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-22fef007081a179aa04469f4d1a7bae941c7eeaea48e91b3f64d2b16643876e5-merged.mount: Deactivated successfully.
Jan 22 00:07:40 compute-0 nova_compute[182935]: 2026-01-22 00:07:40.418 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:40 compute-0 nova_compute[182935]: 2026-01-22 00:07:40.424 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:40 compute-0 podman[228928]: 2026-01-22 00:07:40.42837391 +0000 UTC m=+0.097215808 container cleanup d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:07:40 compute-0 systemd[1]: libpod-conmon-d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58.scope: Deactivated successfully.
Jan 22 00:07:40 compute-0 podman[228970]: 2026-01-22 00:07:40.503844905 +0000 UTC m=+0.048476732 container remove d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.511 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c33c38f4-3a12-46f1-9851-b55bdb4c672b]: (4, ('Thu Jan 22 12:07:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58)\nd6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58\nThu Jan 22 12:07:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (d6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58)\nd6de45feb4137e2522a0749323647e12741c44bc59f8a2d42b347badcb07ae58\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.513 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8d3a01b8-2a56-49c8-93fc-df52a9ab2f96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.515 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:40 compute-0 nova_compute[182935]: 2026-01-22 00:07:40.517 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:40 compute-0 kernel: tapd94993bc-70: left promiscuous mode
Jan 22 00:07:40 compute-0 nova_compute[182935]: 2026-01-22 00:07:40.536 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.540 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[859dd411-f274-41b9-947d-b5ba9732964a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.554 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[356ae1bb-63fa-4df1-b9c7-f287d04ced24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.556 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f72de4f3-5f8b-4553-81a7-57229ffc9ec2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.573 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[11158025-beb0-42dc-882e-cf43ceb09a3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494621, 'reachable_time': 39648, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228994, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:40 compute-0 systemd[1]: run-netns-ovnmeta\x2dd94993bc\x2d77ac\x2d42d2\x2d88cb\x2d3b0110dff29e.mount: Deactivated successfully.
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.578 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:07:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:40.579 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[47eb95d5-abbf-40bf-8155-8096996cdfbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:40 compute-0 nova_compute[182935]: 2026-01-22 00:07:40.749 182939 INFO nova.virt.libvirt.driver [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Instance shutdown successfully after 14 seconds.
Jan 22 00:07:40 compute-0 nova_compute[182935]: 2026-01-22 00:07:40.756 182939 INFO nova.virt.libvirt.driver [-] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Instance destroyed successfully.
Jan 22 00:07:40 compute-0 nova_compute[182935]: 2026-01-22 00:07:40.756 182939 DEBUG nova.objects.instance [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'numa_topology' on Instance uuid 59441413-f484-464f-b5e2-f8d3aeb80f83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:40 compute-0 nova_compute[182935]: 2026-01-22 00:07:40.821 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:41 compute-0 nova_compute[182935]: 2026-01-22 00:07:41.014 182939 INFO nova.virt.libvirt.driver [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Beginning cold snapshot process
Jan 22 00:07:41 compute-0 nova_compute[182935]: 2026-01-22 00:07:41.231 182939 DEBUG nova.privsep.utils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 00:07:41 compute-0 nova_compute[182935]: 2026-01-22 00:07:41.231 182939 DEBUG oslo_concurrency.processutils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk /var/lib/nova/instances/snapshots/tmphlnaxddd/da43c868d42743b5b648273eb151b3f3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:07:41 compute-0 nova_compute[182935]: 2026-01-22 00:07:41.506 182939 DEBUG oslo_concurrency.processutils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83/disk /var/lib/nova/instances/snapshots/tmphlnaxddd/da43c868d42743b5b648273eb151b3f3" returned: 0 in 0.274s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:07:41 compute-0 nova_compute[182935]: 2026-01-22 00:07:41.507 182939 INFO nova.virt.libvirt.driver [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Snapshot extracted, beginning image upload
Jan 22 00:07:41 compute-0 nova_compute[182935]: 2026-01-22 00:07:41.742 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:41 compute-0 nova_compute[182935]: 2026-01-22 00:07:41.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:41 compute-0 nova_compute[182935]: 2026-01-22 00:07:41.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:41 compute-0 nova_compute[182935]: 2026-01-22 00:07:41.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:07:41 compute-0 nova_compute[182935]: 2026-01-22 00:07:41.798 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:42 compute-0 nova_compute[182935]: 2026-01-22 00:07:42.413 182939 DEBUG nova.compute.manager [req-d236b46a-bd57-446f-afd8-39e489bc6ca6 req-11e6b758-29be-4f7e-930b-abf56d871d3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Received event network-vif-unplugged-65f4805a-47f5-4285-af8b-236f66964a00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:42 compute-0 nova_compute[182935]: 2026-01-22 00:07:42.413 182939 DEBUG oslo_concurrency.lockutils [req-d236b46a-bd57-446f-afd8-39e489bc6ca6 req-11e6b758-29be-4f7e-930b-abf56d871d3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:42 compute-0 nova_compute[182935]: 2026-01-22 00:07:42.414 182939 DEBUG oslo_concurrency.lockutils [req-d236b46a-bd57-446f-afd8-39e489bc6ca6 req-11e6b758-29be-4f7e-930b-abf56d871d3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:42 compute-0 nova_compute[182935]: 2026-01-22 00:07:42.414 182939 DEBUG oslo_concurrency.lockutils [req-d236b46a-bd57-446f-afd8-39e489bc6ca6 req-11e6b758-29be-4f7e-930b-abf56d871d3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:42 compute-0 nova_compute[182935]: 2026-01-22 00:07:42.414 182939 DEBUG nova.compute.manager [req-d236b46a-bd57-446f-afd8-39e489bc6ca6 req-11e6b758-29be-4f7e-930b-abf56d871d3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] No waiting events found dispatching network-vif-unplugged-65f4805a-47f5-4285-af8b-236f66964a00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:42 compute-0 nova_compute[182935]: 2026-01-22 00:07:42.415 182939 WARNING nova.compute.manager [req-d236b46a-bd57-446f-afd8-39e489bc6ca6 req-11e6b758-29be-4f7e-930b-abf56d871d3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Received unexpected event network-vif-unplugged-65f4805a-47f5-4285-af8b-236f66964a00 for instance with vm_state active and task_state shelving_image_uploading.
Jan 22 00:07:44 compute-0 ovn_controller[95047]: 2026-01-22T00:07:44Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e2:b9:87 10.100.0.20
Jan 22 00:07:44 compute-0 ovn_controller[95047]: 2026-01-22T00:07:44Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e2:b9:87 10.100.0.20
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.142 182939 INFO nova.virt.libvirt.driver [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Snapshot image upload complete
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.142 182939 DEBUG nova.compute.manager [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.240 182939 INFO nova.compute.manager [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Shelve offloading
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.255 182939 INFO nova.virt.libvirt.driver [-] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Instance destroyed successfully.
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.256 182939 DEBUG nova.compute.manager [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.258 182939 DEBUG oslo_concurrency.lockutils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.258 182939 DEBUG oslo_concurrency.lockutils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquired lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.258 182939 DEBUG nova.network.neutron [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:07:44 compute-0 ovn_controller[95047]: 2026-01-22T00:07:44Z|00433|binding|INFO|Releasing lport 0de086b1-9b1a-4031-a6d1-e19e36eba182 from this chassis (sb_readonly=0)
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.343 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.529 182939 DEBUG nova.compute.manager [req-6171b3ad-4603-4b4e-b950-85593d256960 req-fb7d0148-934f-4302-8b95-3562b56f80f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Received event network-vif-plugged-65f4805a-47f5-4285-af8b-236f66964a00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.529 182939 DEBUG oslo_concurrency.lockutils [req-6171b3ad-4603-4b4e-b950-85593d256960 req-fb7d0148-934f-4302-8b95-3562b56f80f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.530 182939 DEBUG oslo_concurrency.lockutils [req-6171b3ad-4603-4b4e-b950-85593d256960 req-fb7d0148-934f-4302-8b95-3562b56f80f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.530 182939 DEBUG oslo_concurrency.lockutils [req-6171b3ad-4603-4b4e-b950-85593d256960 req-fb7d0148-934f-4302-8b95-3562b56f80f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.530 182939 DEBUG nova.compute.manager [req-6171b3ad-4603-4b4e-b950-85593d256960 req-fb7d0148-934f-4302-8b95-3562b56f80f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] No waiting events found dispatching network-vif-plugged-65f4805a-47f5-4285-af8b-236f66964a00 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:44 compute-0 nova_compute[182935]: 2026-01-22 00:07:44.530 182939 WARNING nova.compute.manager [req-6171b3ad-4603-4b4e-b950-85593d256960 req-fb7d0148-934f-4302-8b95-3562b56f80f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Received unexpected event network-vif-plugged-65f4805a-47f5-4285-af8b-236f66964a00 for instance with vm_state shelved and task_state shelving_offloading.
Jan 22 00:07:46 compute-0 nova_compute[182935]: 2026-01-22 00:07:46.746 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:46 compute-0 nova_compute[182935]: 2026-01-22 00:07:46.801 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:47 compute-0 nova_compute[182935]: 2026-01-22 00:07:47.057 182939 DEBUG nova.network.neutron [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Updating instance_info_cache with network_info: [{"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:47 compute-0 nova_compute[182935]: 2026-01-22 00:07:47.091 182939 DEBUG oslo_concurrency.lockutils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Releasing lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:47 compute-0 nova_compute[182935]: 2026-01-22 00:07:47.804 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.896 182939 INFO nova.virt.libvirt.driver [-] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Instance destroyed successfully.
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.897 182939 DEBUG nova.objects.instance [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'resources' on Instance uuid 59441413-f484-464f-b5e2-f8d3aeb80f83 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.917 182939 DEBUG nova.virt.libvirt.vif [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:07:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-181402284',display_name='tempest-DeleteServersTestJSON-server-181402284',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-181402284',id=101,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-qws46uci',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member',shelved_at='2026-01-22T00:07:44.142738',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='c83a6088-1f6c-4423-b46a-d4266204a372'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:07:41Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=59441413-f484-464f-b5e2-f8d3aeb80f83,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.917 182939 DEBUG nova.network.os_vif_util [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap65f4805a-47", "ovs_interfaceid": "65f4805a-47f5-4285-af8b-236f66964a00", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.918 182939 DEBUG nova.network.os_vif_util [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:97:19,bridge_name='br-int',has_traffic_filtering=True,id=65f4805a-47f5-4285-af8b-236f66964a00,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f4805a-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.918 182939 DEBUG os_vif [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:97:19,bridge_name='br-int',has_traffic_filtering=True,id=65f4805a-47f5-4285-af8b-236f66964a00,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f4805a-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.920 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.921 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65f4805a-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.922 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.924 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.927 182939 INFO os_vif [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:97:19,bridge_name='br-int',has_traffic_filtering=True,id=65f4805a-47f5-4285-af8b-236f66964a00,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap65f4805a-47')
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.927 182939 INFO nova.virt.libvirt.driver [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Deleting instance files /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83_del
Jan 22 00:07:50 compute-0 nova_compute[182935]: 2026-01-22 00:07:50.933 182939 INFO nova.virt.libvirt.driver [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Deletion of /var/lib/nova/instances/59441413-f484-464f-b5e2-f8d3aeb80f83_del complete
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.045 182939 DEBUG nova.compute.manager [req-937886b0-3b29-40a4-aa1a-9beff9e69d02 req-5a70a88f-01d2-4c25-bd0d-dc793cbc0267 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Received event network-changed-65f4805a-47f5-4285-af8b-236f66964a00 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.045 182939 DEBUG nova.compute.manager [req-937886b0-3b29-40a4-aa1a-9beff9e69d02 req-5a70a88f-01d2-4c25-bd0d-dc793cbc0267 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Refreshing instance network info cache due to event network-changed-65f4805a-47f5-4285-af8b-236f66964a00. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.045 182939 DEBUG oslo_concurrency.lockutils [req-937886b0-3b29-40a4-aa1a-9beff9e69d02 req-5a70a88f-01d2-4c25-bd0d-dc793cbc0267 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.046 182939 DEBUG oslo_concurrency.lockutils [req-937886b0-3b29-40a4-aa1a-9beff9e69d02 req-5a70a88f-01d2-4c25-bd0d-dc793cbc0267 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.046 182939 DEBUG nova.network.neutron [req-937886b0-3b29-40a4-aa1a-9beff9e69d02 req-5a70a88f-01d2-4c25-bd0d-dc793cbc0267 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Refreshing network info cache for port 65f4805a-47f5-4285-af8b-236f66964a00 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.079 182939 INFO nova.scheduler.client.report [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Deleted allocations for instance 59441413-f484-464f-b5e2-f8d3aeb80f83
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.162 182939 DEBUG oslo_concurrency.lockutils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.163 182939 DEBUG oslo_concurrency.lockutils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.218 182939 DEBUG nova.compute.provider_tree [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.232 182939 DEBUG nova.scheduler.client.report [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.259 182939 DEBUG oslo_concurrency.lockutils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.342 182939 DEBUG oslo_concurrency.lockutils [None req-878ed38f-0671-4e8b-a3f8-621ac45d4de5 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "59441413-f484-464f-b5e2-f8d3aeb80f83" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 24.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.420 182939 DEBUG oslo_concurrency.lockutils [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.421 182939 DEBUG oslo_concurrency.lockutils [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.421 182939 DEBUG oslo_concurrency.lockutils [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.421 182939 DEBUG oslo_concurrency.lockutils [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.422 182939 DEBUG oslo_concurrency.lockutils [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.432 182939 INFO nova.compute.manager [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Terminating instance
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.443 182939 DEBUG nova.compute.manager [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:07:51 compute-0 kernel: tapae476462-b9 (unregistering): left promiscuous mode
Jan 22 00:07:51 compute-0 NetworkManager[55139]: <info>  [1769040471.4682] device (tapae476462-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.473 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:51 compute-0 ovn_controller[95047]: 2026-01-22T00:07:51Z|00434|binding|INFO|Releasing lport ae476462-b965-4dea-8a2a-9275391da91f from this chassis (sb_readonly=0)
Jan 22 00:07:51 compute-0 ovn_controller[95047]: 2026-01-22T00:07:51Z|00435|binding|INFO|Setting lport ae476462-b965-4dea-8a2a-9275391da91f down in Southbound
Jan 22 00:07:51 compute-0 ovn_controller[95047]: 2026-01-22T00:07:51Z|00436|binding|INFO|Removing iface tapae476462-b9 ovn-installed in OVS
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.476 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.487 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:b9:87 10.100.0.20'], port_security=['fa:16:3e:e2:b9:87 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': '19d62f20-26c4-46d6-ad9f-0ad16c60d542', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88aa5d18-f337-47b2-8592-39b5aa8263f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e8b6cc30-3c91-408f-9475-22e091641435', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7163db1f-5923-487d-80b0-b5662c1fa9e2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=ae476462-b965-4dea-8a2a-9275391da91f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.488 104408 INFO neutron.agent.ovn.metadata.agent [-] Port ae476462-b965-4dea-8a2a-9275391da91f in datapath 88aa5d18-f337-47b2-8592-39b5aa8263f7 unbound from our chassis
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.489 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88aa5d18-f337-47b2-8592-39b5aa8263f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.490 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[02633840-45c1-4377-b3d2-bbc581f5742e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.490 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7 namespace which is not needed anymore
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.492 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:51 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000066.scope: Deactivated successfully.
Jan 22 00:07:51 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000066.scope: Consumed 15.089s CPU time.
Jan 22 00:07:51 compute-0 systemd-machined[154182]: Machine qemu-56-instance-00000066 terminated.
Jan 22 00:07:51 compute-0 neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7[228777]: [NOTICE]   (228790) : haproxy version is 2.8.14-c23fe91
Jan 22 00:07:51 compute-0 neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7[228777]: [NOTICE]   (228790) : path to executable is /usr/sbin/haproxy
Jan 22 00:07:51 compute-0 neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7[228777]: [WARNING]  (228790) : Exiting Master process...
Jan 22 00:07:51 compute-0 neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7[228777]: [ALERT]    (228790) : Current worker (228795) exited with code 143 (Terminated)
Jan 22 00:07:51 compute-0 neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7[228777]: [WARNING]  (228790) : All workers exited. Exiting... (0)
Jan 22 00:07:51 compute-0 systemd[1]: libpod-5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f.scope: Deactivated successfully.
Jan 22 00:07:51 compute-0 podman[229046]: 2026-01-22 00:07:51.623673384 +0000 UTC m=+0.046181879 container died 5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:07:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f-userdata-shm.mount: Deactivated successfully.
Jan 22 00:07:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-2a195f2a734cdd028291e01046053d49e8a13c5e9eaba7efd9cbc637faec7196-merged.mount: Deactivated successfully.
Jan 22 00:07:51 compute-0 podman[229046]: 2026-01-22 00:07:51.655654922 +0000 UTC m=+0.078163407 container cleanup 5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:07:51 compute-0 systemd[1]: libpod-conmon-5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f.scope: Deactivated successfully.
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.712 182939 INFO nova.virt.libvirt.driver [-] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Instance destroyed successfully.
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.713 182939 DEBUG nova.objects.instance [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 19d62f20-26c4-46d6-ad9f-0ad16c60d542 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:07:51 compute-0 podman[229077]: 2026-01-22 00:07:51.725756282 +0000 UTC m=+0.048542482 container remove 5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.732 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[84830fed-cc56-4a77-aa16-804be68cab87]: (4, ('Thu Jan 22 12:07:51 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7 (5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f)\n5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f\nThu Jan 22 12:07:51 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7 (5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f)\n5c68006989d7fc192574332fa2ba56173647fee1e9ddef6bb96fc1afee314f3f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.734 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[16c14fe0-7e03-4330-8d1d-1c45b6eb8739]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.735 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88aa5d18-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.736 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:51 compute-0 kernel: tap88aa5d18-f0: left promiscuous mode
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.740 182939 DEBUG nova.virt.libvirt.vif [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:07:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-350797465',display_name='tempest-TestNetworkBasicOps-server-350797465',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-350797465',id=102,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhpGikw3/h40VsG/Jm7LsXsSzYtwE7fCM0r3J6CqAyw+bIH5ldw9RHiK36T6EKitBYk60DQfyUlm6WtmOYvP23GMVLtMXdNYalbfeix+4qL1C0Gq789f9cMRNBkYgWtvQ==',key_name='tempest-TestNetworkBasicOps-538921274',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:07:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-j43mk00m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:07:29Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=19d62f20-26c4-46d6-ad9f-0ad16c60d542,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae476462-b965-4dea-8a2a-9275391da91f", "address": "fa:16:3e:e2:b9:87", "network": {"id": "88aa5d18-f337-47b2-8592-39b5aa8263f7", "bridge": "br-int", "label": "tempest-network-smoke--210221422", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae476462-b9", "ovs_interfaceid": "ae476462-b965-4dea-8a2a-9275391da91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.741 182939 DEBUG nova.network.os_vif_util [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "ae476462-b965-4dea-8a2a-9275391da91f", "address": "fa:16:3e:e2:b9:87", "network": {"id": "88aa5d18-f337-47b2-8592-39b5aa8263f7", "bridge": "br-int", "label": "tempest-network-smoke--210221422", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae476462-b9", "ovs_interfaceid": "ae476462-b965-4dea-8a2a-9275391da91f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.742 182939 DEBUG nova.network.os_vif_util [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:b9:87,bridge_name='br-int',has_traffic_filtering=True,id=ae476462-b965-4dea-8a2a-9275391da91f,network=Network(88aa5d18-f337-47b2-8592-39b5aa8263f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae476462-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.742 182939 DEBUG os_vif [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:b9:87,bridge_name='br-int',has_traffic_filtering=True,id=ae476462-b965-4dea-8a2a-9275391da91f,network=Network(88aa5d18-f337-47b2-8592-39b5aa8263f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae476462-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.743 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.744 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae476462-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.745 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.746 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.751 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.755 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4f0b61e9-ccb8-49f3-812a-e7f3e000f3e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.754 182939 INFO os_vif [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:b9:87,bridge_name='br-int',has_traffic_filtering=True,id=ae476462-b965-4dea-8a2a-9275391da91f,network=Network(88aa5d18-f337-47b2-8592-39b5aa8263f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae476462-b9')
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.755 182939 INFO nova.virt.libvirt.driver [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Deleting instance files /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542_del
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.756 182939 INFO nova.virt.libvirt.driver [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Deletion of /var/lib/nova/instances/19d62f20-26c4-46d6-ad9f-0ad16c60d542_del complete
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.766 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf8c69e-d238-42b8-b973-90d2bf3b43e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.768 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[749cf97b-941b-4fd2-844b-b7fe3f6f8899]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.786 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e38de581-3051-4391-b570-4f3e0a1e0c30]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495028, 'reachable_time': 15626, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229121, 'error': None, 'target': 'ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:51 compute-0 systemd[1]: run-netns-ovnmeta\x2d88aa5d18\x2df337\x2d47b2\x2d8592\x2d39b5aa8263f7.mount: Deactivated successfully.
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.789 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88aa5d18-f337-47b2-8592-39b5aa8263f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:07:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:07:51.789 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[a121da1d-3450-4716-9ac4-54fc9224d149]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.803 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.824 182939 DEBUG nova.compute.manager [req-e61cff52-e2bf-4ddf-883d-a1d396caf6d3 req-c3e4f474-a993-4dce-bbdc-1e014a99415d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Received event network-vif-unplugged-ae476462-b965-4dea-8a2a-9275391da91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.825 182939 DEBUG oslo_concurrency.lockutils [req-e61cff52-e2bf-4ddf-883d-a1d396caf6d3 req-c3e4f474-a993-4dce-bbdc-1e014a99415d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.825 182939 DEBUG oslo_concurrency.lockutils [req-e61cff52-e2bf-4ddf-883d-a1d396caf6d3 req-c3e4f474-a993-4dce-bbdc-1e014a99415d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.825 182939 DEBUG oslo_concurrency.lockutils [req-e61cff52-e2bf-4ddf-883d-a1d396caf6d3 req-c3e4f474-a993-4dce-bbdc-1e014a99415d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.825 182939 DEBUG nova.compute.manager [req-e61cff52-e2bf-4ddf-883d-a1d396caf6d3 req-c3e4f474-a993-4dce-bbdc-1e014a99415d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] No waiting events found dispatching network-vif-unplugged-ae476462-b965-4dea-8a2a-9275391da91f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.826 182939 DEBUG nova.compute.manager [req-e61cff52-e2bf-4ddf-883d-a1d396caf6d3 req-c3e4f474-a993-4dce-bbdc-1e014a99415d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Received event network-vif-unplugged-ae476462-b965-4dea-8a2a-9275391da91f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.842 182939 INFO nova.compute.manager [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.843 182939 DEBUG oslo.service.loopingcall [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.843 182939 DEBUG nova.compute.manager [-] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:07:51 compute-0 nova_compute[182935]: 2026-01-22 00:07:51.844 182939 DEBUG nova.network.neutron [-] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:07:51 compute-0 podman[229107]: 2026-01-22 00:07:51.846197906 +0000 UTC m=+0.078913525 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:07:51 compute-0 podman[229108]: 2026-01-22 00:07:51.887051571 +0000 UTC m=+0.115089572 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.464 182939 DEBUG nova.network.neutron [req-937886b0-3b29-40a4-aa1a-9beff9e69d02 req-5a70a88f-01d2-4c25-bd0d-dc793cbc0267 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Updated VIF entry in instance network info cache for port 65f4805a-47f5-4285-af8b-236f66964a00. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.464 182939 DEBUG nova.network.neutron [req-937886b0-3b29-40a4-aa1a-9beff9e69d02 req-5a70a88f-01d2-4c25-bd0d-dc793cbc0267 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Updating instance_info_cache with network_info: [{"id": "65f4805a-47f5-4285-af8b-236f66964a00", "address": "fa:16:3e:0e:97:19", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": null, "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap65f4805a-47", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.490 182939 DEBUG oslo_concurrency.lockutils [req-937886b0-3b29-40a4-aa1a-9beff9e69d02 req-5a70a88f-01d2-4c25-bd0d-dc793cbc0267 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-59441413-f484-464f-b5e2-f8d3aeb80f83" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.736 182939 DEBUG nova.network.neutron [-] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.754 182939 INFO nova.compute.manager [-] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Took 0.91 seconds to deallocate network for instance.
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.856 182939 DEBUG oslo_concurrency.lockutils [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.856 182939 DEBUG oslo_concurrency.lockutils [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.859 182939 DEBUG nova.compute.manager [req-cd6d35dd-2640-4c08-966f-25fc5a17205e req-c8178294-ecf8-4b2b-a3b2-490c337764e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Received event network-vif-deleted-ae476462-b965-4dea-8a2a-9275391da91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.930 182939 DEBUG nova.compute.provider_tree [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.948 182939 DEBUG nova.scheduler.client.report [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.970 182939 DEBUG oslo_concurrency.lockutils [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:52 compute-0 nova_compute[182935]: 2026-01-22 00:07:52.998 182939 INFO nova.scheduler.client.report [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 19d62f20-26c4-46d6-ad9f-0ad16c60d542
Jan 22 00:07:53 compute-0 nova_compute[182935]: 2026-01-22 00:07:53.081 182939 DEBUG oslo_concurrency.lockutils [None req-1a6e2bfe-e670-464c-8024-8cb0134ce22f 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:53 compute-0 nova_compute[182935]: 2026-01-22 00:07:53.923 182939 DEBUG nova.compute.manager [req-c59160bb-26ca-4738-a5a5-7df10f147844 req-e204e1ba-81a5-4c4f-b27d-d4fb01b057c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Received event network-vif-plugged-ae476462-b965-4dea-8a2a-9275391da91f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:07:53 compute-0 nova_compute[182935]: 2026-01-22 00:07:53.923 182939 DEBUG oslo_concurrency.lockutils [req-c59160bb-26ca-4738-a5a5-7df10f147844 req-e204e1ba-81a5-4c4f-b27d-d4fb01b057c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:53 compute-0 nova_compute[182935]: 2026-01-22 00:07:53.924 182939 DEBUG oslo_concurrency.lockutils [req-c59160bb-26ca-4738-a5a5-7df10f147844 req-e204e1ba-81a5-4c4f-b27d-d4fb01b057c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:07:53 compute-0 nova_compute[182935]: 2026-01-22 00:07:53.924 182939 DEBUG oslo_concurrency.lockutils [req-c59160bb-26ca-4738-a5a5-7df10f147844 req-e204e1ba-81a5-4c4f-b27d-d4fb01b057c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "19d62f20-26c4-46d6-ad9f-0ad16c60d542-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:07:53 compute-0 nova_compute[182935]: 2026-01-22 00:07:53.924 182939 DEBUG nova.compute.manager [req-c59160bb-26ca-4738-a5a5-7df10f147844 req-e204e1ba-81a5-4c4f-b27d-d4fb01b057c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] No waiting events found dispatching network-vif-plugged-ae476462-b965-4dea-8a2a-9275391da91f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:07:53 compute-0 nova_compute[182935]: 2026-01-22 00:07:53.924 182939 WARNING nova.compute.manager [req-c59160bb-26ca-4738-a5a5-7df10f147844 req-e204e1ba-81a5-4c4f-b27d-d4fb01b057c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Received unexpected event network-vif-plugged-ae476462-b965-4dea-8a2a-9275391da91f for instance with vm_state deleted and task_state None.
Jan 22 00:07:55 compute-0 nova_compute[182935]: 2026-01-22 00:07:55.125 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:55 compute-0 nova_compute[182935]: 2026-01-22 00:07:55.313 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:55 compute-0 nova_compute[182935]: 2026-01-22 00:07:55.468 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040460.4667838, 59441413-f484-464f-b5e2-f8d3aeb80f83 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:07:55 compute-0 nova_compute[182935]: 2026-01-22 00:07:55.469 182939 INFO nova.compute.manager [-] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] VM Stopped (Lifecycle Event)
Jan 22 00:07:55 compute-0 nova_compute[182935]: 2026-01-22 00:07:55.492 182939 DEBUG nova.compute.manager [None req-57632745-2acc-4f85-af78-6a053f3e701f - - - - - -] [instance: 59441413-f484-464f-b5e2-f8d3aeb80f83] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:07:55 compute-0 nova_compute[182935]: 2026-01-22 00:07:55.493 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:55 compute-0 nova_compute[182935]: 2026-01-22 00:07:55.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:55 compute-0 nova_compute[182935]: 2026-01-22 00:07:55.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:07:55 compute-0 nova_compute[182935]: 2026-01-22 00:07:55.812 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:07:56 compute-0 nova_compute[182935]: 2026-01-22 00:07:56.745 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:56 compute-0 nova_compute[182935]: 2026-01-22 00:07:56.806 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:07:58 compute-0 nova_compute[182935]: 2026-01-22 00:07:58.084 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:07:58 compute-0 podman[229161]: 2026-01-22 00:07:58.68458116 +0000 UTC m=+0.055447642 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:07:59 compute-0 sshd-session[229185]: Invalid user svn from 188.166.69.60 port 41372
Jan 22 00:07:59 compute-0 sshd-session[229185]: Connection closed by invalid user svn 188.166.69.60 port 41372 [preauth]
Jan 22 00:07:59 compute-0 nova_compute[182935]: 2026-01-22 00:07:59.992 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:07:59 compute-0 nova_compute[182935]: 2026-01-22 00:07:59.992 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.010 182939 DEBUG nova.compute.manager [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.155 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.156 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.162 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.163 182939 INFO nova.compute.claims [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.311 182939 DEBUG nova.compute.provider_tree [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.327 182939 DEBUG nova.scheduler.client.report [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.351 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.352 182939 DEBUG nova.compute.manager [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.415 182939 DEBUG nova.compute.manager [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.416 182939 DEBUG nova.network.neutron [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.460 182939 INFO nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.494 182939 DEBUG nova.compute.manager [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.611 182939 DEBUG nova.compute.manager [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.613 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.613 182939 INFO nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Creating image(s)
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.614 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "/var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.614 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.615 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.628 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.688 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.690 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.690 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.704 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.760 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.761 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.798 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.799 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.800 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.857 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.858 182939 DEBUG nova.virt.disk.api [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Checking if we can resize image /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.858 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.919 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.920 182939 DEBUG nova.virt.disk.api [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Cannot resize image /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.921 182939 DEBUG nova.objects.instance [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'migration_context' on Instance uuid 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.937 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.938 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Ensure instance console log exists: /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.939 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.939 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:00 compute-0 nova_compute[182935]: 2026-01-22 00:08:00.939 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:01 compute-0 nova_compute[182935]: 2026-01-22 00:08:01.059 182939 DEBUG nova.policy [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:08:01 compute-0 nova_compute[182935]: 2026-01-22 00:08:01.747 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:01 compute-0 nova_compute[182935]: 2026-01-22 00:08:01.809 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:02 compute-0 nova_compute[182935]: 2026-01-22 00:08:02.025 182939 DEBUG nova.network.neutron [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Successfully created port: 1d405d7f-e331-40a8-bc0c-242ed82d7807 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:08:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:03.203 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:03.204 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:03.204 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:04 compute-0 nova_compute[182935]: 2026-01-22 00:08:04.537 182939 DEBUG nova.network.neutron [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Successfully updated port: 1d405d7f-e331-40a8-bc0c-242ed82d7807 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:08:04 compute-0 nova_compute[182935]: 2026-01-22 00:08:04.559 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "refresh_cache-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:08:04 compute-0 nova_compute[182935]: 2026-01-22 00:08:04.559 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquired lock "refresh_cache-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:08:04 compute-0 nova_compute[182935]: 2026-01-22 00:08:04.559 182939 DEBUG nova.network.neutron [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:08:04 compute-0 nova_compute[182935]: 2026-01-22 00:08:04.679 182939 DEBUG nova.compute.manager [req-6eb3a50a-78a1-45ad-8a97-acf09c7b28f3 req-019ceec6-8bc2-4d4f-8272-8eacccbf780a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Received event network-changed-1d405d7f-e331-40a8-bc0c-242ed82d7807 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:04 compute-0 nova_compute[182935]: 2026-01-22 00:08:04.680 182939 DEBUG nova.compute.manager [req-6eb3a50a-78a1-45ad-8a97-acf09c7b28f3 req-019ceec6-8bc2-4d4f-8272-8eacccbf780a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Refreshing instance network info cache due to event network-changed-1d405d7f-e331-40a8-bc0c-242ed82d7807. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:08:04 compute-0 nova_compute[182935]: 2026-01-22 00:08:04.680 182939 DEBUG oslo_concurrency.lockutils [req-6eb3a50a-78a1-45ad-8a97-acf09c7b28f3 req-019ceec6-8bc2-4d4f-8272-8eacccbf780a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:08:04 compute-0 podman[229202]: 2026-01-22 00:08:04.701204741 +0000 UTC m=+0.071435002 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 22 00:08:04 compute-0 nova_compute[182935]: 2026-01-22 00:08:04.894 182939 DEBUG nova.network.neutron [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.441 182939 DEBUG nova.network.neutron [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Updating instance_info_cache with network_info: [{"id": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "address": "fa:16:3e:f6:53:88", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d405d7f-e3", "ovs_interfaceid": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.468 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Releasing lock "refresh_cache-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.468 182939 DEBUG nova.compute.manager [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Instance network_info: |[{"id": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "address": "fa:16:3e:f6:53:88", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d405d7f-e3", "ovs_interfaceid": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.469 182939 DEBUG oslo_concurrency.lockutils [req-6eb3a50a-78a1-45ad-8a97-acf09c7b28f3 req-019ceec6-8bc2-4d4f-8272-8eacccbf780a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.469 182939 DEBUG nova.network.neutron [req-6eb3a50a-78a1-45ad-8a97-acf09c7b28f3 req-019ceec6-8bc2-4d4f-8272-8eacccbf780a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Refreshing network info cache for port 1d405d7f-e331-40a8-bc0c-242ed82d7807 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.472 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Start _get_guest_xml network_info=[{"id": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "address": "fa:16:3e:f6:53:88", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d405d7f-e3", "ovs_interfaceid": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.478 182939 WARNING nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.482 182939 DEBUG nova.virt.libvirt.host [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.483 182939 DEBUG nova.virt.libvirt.host [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.488 182939 DEBUG nova.virt.libvirt.host [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.489 182939 DEBUG nova.virt.libvirt.host [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.492 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.493 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.494 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.494 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.494 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.494 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.495 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.495 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.495 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.495 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.495 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.496 182939 DEBUG nova.virt.hardware [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.500 182939 DEBUG nova.virt.libvirt.vif [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:07:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1267757132',display_name='tempest-DeleteServersTestJSON-server-1267757132',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1267757132',id=105,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-0fci23xo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON
-2033458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:00Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "address": "fa:16:3e:f6:53:88", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d405d7f-e3", "ovs_interfaceid": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.500 182939 DEBUG nova.network.os_vif_util [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "address": "fa:16:3e:f6:53:88", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d405d7f-e3", "ovs_interfaceid": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.501 182939 DEBUG nova.network.os_vif_util [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:88,bridge_name='br-int',has_traffic_filtering=True,id=1d405d7f-e331-40a8-bc0c-242ed82d7807,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d405d7f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.502 182939 DEBUG nova.objects.instance [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.526 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:08:06 compute-0 nova_compute[182935]:   <uuid>9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef</uuid>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   <name>instance-00000069</name>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <nova:name>tempest-DeleteServersTestJSON-server-1267757132</nova:name>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:08:06</nova:creationTime>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:08:06 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:08:06 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:08:06 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:08:06 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:08:06 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:08:06 compute-0 nova_compute[182935]:         <nova:user uuid="74ad1bf274924c52af96aa4c6d431410">tempest-DeleteServersTestJSON-2033458913-project-member</nova:user>
Jan 22 00:08:06 compute-0 nova_compute[182935]:         <nova:project uuid="3822e32efd5647aebf2d79a3dd038bd4">tempest-DeleteServersTestJSON-2033458913</nova:project>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:08:06 compute-0 nova_compute[182935]:         <nova:port uuid="1d405d7f-e331-40a8-bc0c-242ed82d7807">
Jan 22 00:08:06 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <system>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <entry name="serial">9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef</entry>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <entry name="uuid">9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef</entry>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     </system>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   <os>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   </os>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   <features>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   </features>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.config"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:f6:53:88"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <target dev="tap1d405d7f-e3"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/console.log" append="off"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <video>
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     </video>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:08:06 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:08:06 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:08:06 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:08:06 compute-0 nova_compute[182935]: </domain>
Jan 22 00:08:06 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.527 182939 DEBUG nova.compute.manager [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Preparing to wait for external event network-vif-plugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.528 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.528 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.528 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.529 182939 DEBUG nova.virt.libvirt.vif [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:07:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1267757132',display_name='tempest-DeleteServersTestJSON-server-1267757132',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1267757132',id=105,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-0fci23xo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServe
rsTestJSON-2033458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:00Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "address": "fa:16:3e:f6:53:88", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d405d7f-e3", "ovs_interfaceid": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.529 182939 DEBUG nova.network.os_vif_util [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "address": "fa:16:3e:f6:53:88", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d405d7f-e3", "ovs_interfaceid": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.530 182939 DEBUG nova.network.os_vif_util [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:88,bridge_name='br-int',has_traffic_filtering=True,id=1d405d7f-e331-40a8-bc0c-242ed82d7807,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d405d7f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.531 182939 DEBUG os_vif [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:88,bridge_name='br-int',has_traffic_filtering=True,id=1d405d7f-e331-40a8-bc0c-242ed82d7807,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d405d7f-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.531 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.532 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.532 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.536 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.536 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1d405d7f-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.536 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1d405d7f-e3, col_values=(('external_ids', {'iface-id': '1d405d7f-e331-40a8-bc0c-242ed82d7807', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:53:88', 'vm-uuid': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.538 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:06 compute-0 NetworkManager[55139]: <info>  [1769040486.5400] manager: (tap1d405d7f-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.541 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.544 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.545 182939 INFO os_vif [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:88,bridge_name='br-int',has_traffic_filtering=True,id=1d405d7f-e331-40a8-bc0c-242ed82d7807,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d405d7f-e3')
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.710 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040471.7096374, 19d62f20-26c4-46d6-ad9f-0ad16c60d542 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.711 182939 INFO nova.compute.manager [-] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] VM Stopped (Lifecycle Event)
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.747 182939 DEBUG nova.compute.manager [None req-04ee7c0c-e6b2-486a-8457-7b7ad4276e80 - - - - - -] [instance: 19d62f20-26c4-46d6-ad9f-0ad16c60d542] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.753 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.753 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.753 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No VIF found with MAC fa:16:3e:f6:53:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.754 182939 INFO nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Using config drive
Jan 22 00:08:06 compute-0 nova_compute[182935]: 2026-01-22 00:08:06.811 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:07 compute-0 nova_compute[182935]: 2026-01-22 00:08:07.257 182939 INFO nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Creating config drive at /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.config
Jan 22 00:08:07 compute-0 nova_compute[182935]: 2026-01-22 00:08:07.265 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0ssfjsar execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:07 compute-0 nova_compute[182935]: 2026-01-22 00:08:07.402 182939 DEBUG oslo_concurrency.processutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0ssfjsar" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:07 compute-0 kernel: tap1d405d7f-e3: entered promiscuous mode
Jan 22 00:08:07 compute-0 NetworkManager[55139]: <info>  [1769040487.4754] manager: (tap1d405d7f-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Jan 22 00:08:07 compute-0 ovn_controller[95047]: 2026-01-22T00:08:07Z|00437|binding|INFO|Claiming lport 1d405d7f-e331-40a8-bc0c-242ed82d7807 for this chassis.
Jan 22 00:08:07 compute-0 ovn_controller[95047]: 2026-01-22T00:08:07Z|00438|binding|INFO|1d405d7f-e331-40a8-bc0c-242ed82d7807: Claiming fa:16:3e:f6:53:88 10.100.0.14
Jan 22 00:08:07 compute-0 nova_compute[182935]: 2026-01-22 00:08:07.477 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:07 compute-0 nova_compute[182935]: 2026-01-22 00:08:07.480 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.489 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:53:88 10.100.0.14'], port_security=['fa:16:3e:f6:53:88 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=1d405d7f-e331-40a8-bc0c-242ed82d7807) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.492 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 1d405d7f-e331-40a8-bc0c-242ed82d7807 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e bound to our chassis
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.493 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.510 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cbd2bf35-1839-4876-b77e-466aeab4de6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.511 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd94993bc-71 in ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:08:07 compute-0 systemd-udevd[229241]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.515 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd94993bc-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.515 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[32ba83b2-57cd-4738-b527-c3c5f4e19bf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.516 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[815c549a-f881-48b9-8bba-cfdcc81e6258]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 systemd-machined[154182]: New machine qemu-57-instance-00000069.
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.529 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[c8b104cb-0bab-4ef5-a560-a89f63cf345a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 nova_compute[182935]: 2026-01-22 00:08:07.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:07 compute-0 systemd[1]: Started Virtual Machine qemu-57-instance-00000069.
Jan 22 00:08:07 compute-0 NetworkManager[55139]: <info>  [1769040487.5399] device (tap1d405d7f-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:08:07 compute-0 NetworkManager[55139]: <info>  [1769040487.5409] device (tap1d405d7f-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:08:07 compute-0 ovn_controller[95047]: 2026-01-22T00:08:07Z|00439|binding|INFO|Setting lport 1d405d7f-e331-40a8-bc0c-242ed82d7807 ovn-installed in OVS
Jan 22 00:08:07 compute-0 ovn_controller[95047]: 2026-01-22T00:08:07Z|00440|binding|INFO|Setting lport 1d405d7f-e331-40a8-bc0c-242ed82d7807 up in Southbound
Jan 22 00:08:07 compute-0 nova_compute[182935]: 2026-01-22 00:08:07.541 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.556 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ab0bc609-1cf0-4891-a62f-e0e1f84d7435]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.581 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[29f207e2-ed5e-44cd-b07f-3464995fa7ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 NetworkManager[55139]: <info>  [1769040487.5882] manager: (tapd94993bc-70): new Veth device (/org/freedesktop/NetworkManager/Devices/205)
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.587 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c4decd53-6dc6-462f-a077-d0296a467394]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 systemd-udevd[229246]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.620 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d8af6f-9a48-4ca7-bd82-7a796331aef8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.625 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d16e8bf9-1076-4b11-b953-8c4c3e061bfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 NetworkManager[55139]: <info>  [1769040487.6509] device (tapd94993bc-70): carrier: link connected
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.657 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[12b9b25a-da6c-4cd6-9f80-f864c89d66f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.673 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d257a6f4-4624-43ba-a575-ee36850e656d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499038, 'reachable_time': 33966, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229274, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.692 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[88a379a0-286a-4d3f-a5b1-2ebe2328461f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:eecd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 499038, 'tstamp': 499038}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229275, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.707 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7cd90136-d40d-4dec-82ff-e4f1e35c1988]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499038, 'reachable_time': 33966, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229276, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.739 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6fcf9e3f-c8de-4457-97f5-46564b7c6484]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.795 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[14e73b19-dd62-4546-b575-2738195bd9e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.796 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.797 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.797 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd94993bc-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:07 compute-0 NetworkManager[55139]: <info>  [1769040487.8006] manager: (tapd94993bc-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Jan 22 00:08:07 compute-0 nova_compute[182935]: 2026-01-22 00:08:07.800 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:07 compute-0 kernel: tapd94993bc-70: entered promiscuous mode
Jan 22 00:08:07 compute-0 nova_compute[182935]: 2026-01-22 00:08:07.802 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.803 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd94993bc-70, col_values=(('external_ids', {'iface-id': 'd921ee25-8f8a-4375-9839-6c54ab328e88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:07 compute-0 nova_compute[182935]: 2026-01-22 00:08:07.804 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:07 compute-0 ovn_controller[95047]: 2026-01-22T00:08:07Z|00441|binding|INFO|Releasing lport d921ee25-8f8a-4375-9839-6c54ab328e88 from this chassis (sb_readonly=0)
Jan 22 00:08:07 compute-0 nova_compute[182935]: 2026-01-22 00:08:07.814 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.815 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.816 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d403f17c-7d4f-4ee4-bcfe-7bf040423615]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.816 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:08:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:07.817 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'env', 'PROCESS_TAG=haproxy-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d94993bc-77ac-42d2-88cb-3b0110dff29e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:08:08 compute-0 podman[229308]: 2026-01-22 00:08:08.140752628 +0000 UTC m=+0.022183394 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:08:08 compute-0 podman[229308]: 2026-01-22 00:08:08.250176637 +0000 UTC m=+0.131607383 container create 03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:08:08 compute-0 nova_compute[182935]: 2026-01-22 00:08:08.251 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040488.2505803, 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:08 compute-0 nova_compute[182935]: 2026-01-22 00:08:08.251 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] VM Started (Lifecycle Event)
Jan 22 00:08:08 compute-0 systemd[1]: Started libpod-conmon-03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117.scope.
Jan 22 00:08:08 compute-0 nova_compute[182935]: 2026-01-22 00:08:08.295 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:08 compute-0 nova_compute[182935]: 2026-01-22 00:08:08.300 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040488.2543316, 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:08 compute-0 nova_compute[182935]: 2026-01-22 00:08:08.300 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] VM Paused (Lifecycle Event)
Jan 22 00:08:08 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:08:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fac0f755b561350ffd853229bafea7ec96fd51be183810374b4908e6a0c2347/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:08:08 compute-0 nova_compute[182935]: 2026-01-22 00:08:08.322 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:08 compute-0 nova_compute[182935]: 2026-01-22 00:08:08.327 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:08:08 compute-0 nova_compute[182935]: 2026-01-22 00:08:08.348 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:08:08 compute-0 podman[229308]: 2026-01-22 00:08:08.354234082 +0000 UTC m=+0.235664868 container init 03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 00:08:08 compute-0 podman[229328]: 2026-01-22 00:08:08.353789261 +0000 UTC m=+0.069170459 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Jan 22 00:08:08 compute-0 podman[229308]: 2026-01-22 00:08:08.360947208 +0000 UTC m=+0.242377954 container start 03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:08:08 compute-0 podman[229329]: 2026-01-22 00:08:08.374813647 +0000 UTC m=+0.086473769 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:08:08 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229332]: [NOTICE]   (229372) : New worker (229374) forked
Jan 22 00:08:08 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229332]: [NOTICE]   (229372) : Loading success.
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.133 182939 DEBUG nova.network.neutron [req-6eb3a50a-78a1-45ad-8a97-acf09c7b28f3 req-019ceec6-8bc2-4d4f-8272-8eacccbf780a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Updated VIF entry in instance network info cache for port 1d405d7f-e331-40a8-bc0c-242ed82d7807. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.134 182939 DEBUG nova.network.neutron [req-6eb3a50a-78a1-45ad-8a97-acf09c7b28f3 req-019ceec6-8bc2-4d4f-8272-8eacccbf780a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Updating instance_info_cache with network_info: [{"id": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "address": "fa:16:3e:f6:53:88", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d405d7f-e3", "ovs_interfaceid": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.156 182939 DEBUG oslo_concurrency.lockutils [req-6eb3a50a-78a1-45ad-8a97-acf09c7b28f3 req-019ceec6-8bc2-4d4f-8272-8eacccbf780a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.289 182939 DEBUG nova.compute.manager [req-1d593317-df05-47bb-9b2a-d14188d645a0 req-7a29c06f-cc3e-46da-a014-309b03e58445 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Received event network-vif-plugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.289 182939 DEBUG oslo_concurrency.lockutils [req-1d593317-df05-47bb-9b2a-d14188d645a0 req-7a29c06f-cc3e-46da-a014-309b03e58445 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.290 182939 DEBUG oslo_concurrency.lockutils [req-1d593317-df05-47bb-9b2a-d14188d645a0 req-7a29c06f-cc3e-46da-a014-309b03e58445 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.290 182939 DEBUG oslo_concurrency.lockutils [req-1d593317-df05-47bb-9b2a-d14188d645a0 req-7a29c06f-cc3e-46da-a014-309b03e58445 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.290 182939 DEBUG nova.compute.manager [req-1d593317-df05-47bb-9b2a-d14188d645a0 req-7a29c06f-cc3e-46da-a014-309b03e58445 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Processing event network-vif-plugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.291 182939 DEBUG nova.compute.manager [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.294 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040489.2942557, 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.294 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] VM Resumed (Lifecycle Event)
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.296 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.301 182939 INFO nova.virt.libvirt.driver [-] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Instance spawned successfully.
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.302 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.325 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.331 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.334 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.334 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.335 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.335 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.335 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.336 182939 DEBUG nova.virt.libvirt.driver [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.380 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.437 182939 INFO nova.compute.manager [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Took 8.82 seconds to spawn the instance on the hypervisor.
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.437 182939 DEBUG nova.compute.manager [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.559 182939 INFO nova.compute.manager [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Took 9.44 seconds to build instance.
Jan 22 00:08:09 compute-0 nova_compute[182935]: 2026-01-22 00:08:09.597 182939 DEBUG oslo_concurrency.lockutils [None req-3d559a04-d18c-46e4-8c16-fbf43739e63a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:10 compute-0 nova_compute[182935]: 2026-01-22 00:08:10.243 182939 DEBUG oslo_concurrency.lockutils [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:10 compute-0 nova_compute[182935]: 2026-01-22 00:08:10.244 182939 DEBUG oslo_concurrency.lockutils [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:10 compute-0 nova_compute[182935]: 2026-01-22 00:08:10.244 182939 DEBUG nova.compute.manager [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:10 compute-0 nova_compute[182935]: 2026-01-22 00:08:10.247 182939 DEBUG nova.compute.manager [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 22 00:08:10 compute-0 nova_compute[182935]: 2026-01-22 00:08:10.248 182939 DEBUG nova.objects.instance [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'flavor' on Instance uuid 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:10 compute-0 nova_compute[182935]: 2026-01-22 00:08:10.286 182939 DEBUG nova.objects.instance [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'info_cache' on Instance uuid 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:10 compute-0 nova_compute[182935]: 2026-01-22 00:08:10.347 182939 DEBUG nova.virt.libvirt.driver [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:08:11 compute-0 nova_compute[182935]: 2026-01-22 00:08:11.427 182939 DEBUG nova.compute.manager [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Received event network-vif-plugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:11 compute-0 nova_compute[182935]: 2026-01-22 00:08:11.427 182939 DEBUG oslo_concurrency.lockutils [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:11 compute-0 nova_compute[182935]: 2026-01-22 00:08:11.427 182939 DEBUG oslo_concurrency.lockutils [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:11 compute-0 nova_compute[182935]: 2026-01-22 00:08:11.428 182939 DEBUG oslo_concurrency.lockutils [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:11 compute-0 nova_compute[182935]: 2026-01-22 00:08:11.428 182939 DEBUG nova.compute.manager [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] No waiting events found dispatching network-vif-plugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:08:11 compute-0 nova_compute[182935]: 2026-01-22 00:08:11.428 182939 WARNING nova.compute.manager [req-eaeb6d0e-175e-4e9d-9567-aef76a9dbb99 req-5e19939c-5938-483d-a9bc-8d6e1ab2b232 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Received unexpected event network-vif-plugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 for instance with vm_state active and task_state powering-off.
Jan 22 00:08:11 compute-0 nova_compute[182935]: 2026-01-22 00:08:11.538 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:11.811 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:08:11 compute-0 nova_compute[182935]: 2026-01-22 00:08:11.811 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:11.813 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:08:11 compute-0 nova_compute[182935]: 2026-01-22 00:08:11.813 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:12.815 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:16 compute-0 sshd-session[229383]: Received disconnect from 45.227.254.170 port 40372:11:  [preauth]
Jan 22 00:08:16 compute-0 sshd-session[229383]: Disconnected from authenticating user root 45.227.254.170 port 40372 [preauth]
Jan 22 00:08:16 compute-0 nova_compute[182935]: 2026-01-22 00:08:16.541 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:16 compute-0 nova_compute[182935]: 2026-01-22 00:08:16.815 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:20 compute-0 nova_compute[182935]: 2026-01-22 00:08:20.395 182939 DEBUG nova.virt.libvirt.driver [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:08:21 compute-0 nova_compute[182935]: 2026-01-22 00:08:21.544 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:21 compute-0 nova_compute[182935]: 2026-01-22 00:08:21.818 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:22 compute-0 podman[229404]: 2026-01-22 00:08:22.726630107 +0000 UTC m=+0.078141357 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:08:22 compute-0 podman[229403]: 2026-01-22 00:08:22.804930047 +0000 UTC m=+0.156432387 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.315 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'name': 'tempest-DeleteServersTestJSON-server-1267757132', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000069', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3822e32efd5647aebf2d79a3dd038bd4', 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'hostId': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.316 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.353 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.read.bytes volume: 29015040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.354 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.read.bytes volume: 221502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fa84a9f-ab29-4981-8c85-30ec77dbd1b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29015040, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-vda', 'timestamp': '2026-01-22T00:08:23.316429', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7776426a-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': 'c0c5b14b673bdd75742a5fa7a0937951b84a3f0ca1167a5cbdfab633659b0365'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221502, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-sda', 'timestamp': '2026-01-22T00:08:23.316429', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77765eee-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': '3bbbe794e5f5a3a5e7dbe64aaccf5d27f27ef10ae93a1e2af571f93222251e83'}]}, 'timestamp': '2026-01-22 00:08:23.355156', '_unique_id': '6f6b6435f56345f3af18724c2534ac82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.358 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.359 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.359 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.360 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1267757132>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1267757132>]
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.360 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.360 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.360 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1267757132>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1267757132>]
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.360 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.363 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef / tap1d405d7f-e3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.364 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17b87cce-76e6-4d2a-8d23-a409856ce459', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': 'instance-00000069-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-tap1d405d7f-e3', 'timestamp': '2026-01-22T00:08:23.360914', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'tap1d405d7f-e3', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1d405d7f-e3'}, 'message_id': '7777ceb4-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.155184831, 'message_signature': '36bc0bd08a825cef9abda22b2a70387117fcfa04ffad2fb8052c6500dfe85a0e'}]}, 'timestamp': '2026-01-22 00:08:23.364596', '_unique_id': '557f3bf22ae844449b411be2d2892dcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.365 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.366 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.366 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.read.latency volume: 295950657 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.366 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.read.latency volume: 20853893 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9bdeb5c2-c84b-4fd5-8ff0-868f6b2fdc41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 295950657, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-vda', 'timestamp': '2026-01-22T00:08:23.366523', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7778294a-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': '8e8ac75af542615f8f5030ae44eabe74430effdcf772e9b19b80800d0c7ca13b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20853893, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-sda', 'timestamp': '2026-01-22T00:08:23.366523', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77783412-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': 'bd2a9ef308f5f6831a4fbf4a05f052e19ccb74d1aaad6b5f0481b3d93afc8674'}]}, 'timestamp': '2026-01-22 00:08:23.367080', '_unique_id': 'c4abd85849884cfa9a3ea8047616485d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.367 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.368 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.368 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd82102d4-ef6a-4ade-ab50-468d4cbcba42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': 'instance-00000069-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-tap1d405d7f-e3', 'timestamp': '2026-01-22T00:08:23.368723', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'tap1d405d7f-e3', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1d405d7f-e3'}, 'message_id': '77788098-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.155184831, 'message_signature': '32a6696536ddbda0c93754d849e484dae83185001f3d907dbf07e0cea7d930a9'}]}, 'timestamp': '2026-01-22 00:08:23.369036', '_unique_id': '1fdcd6aea57d4d2991a2a648fc761386'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.369 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.370 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/network.incoming.bytes volume: 622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6030d473-a183-471c-99f0-b47d22563aea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 622, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': 'instance-00000069-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-tap1d405d7f-e3', 'timestamp': '2026-01-22T00:08:23.370590', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'tap1d405d7f-e3', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1d405d7f-e3'}, 'message_id': '7778c8aa-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.155184831, 'message_signature': 'e5ca4b29827b20bbde56b3b1d317653a1fac60a396a9cd35e7e691e8de4d991e'}]}, 'timestamp': '2026-01-22 00:08:23.370902', '_unique_id': '07bbc901e4de43f5af9ec32b81325e8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.371 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.372 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.372 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f54d0dd-a8d1-4397-8149-fd221d88d52e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': 'instance-00000069-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-tap1d405d7f-e3', 'timestamp': '2026-01-22T00:08:23.372574', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'tap1d405d7f-e3', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1d405d7f-e3'}, 'message_id': '777919fe-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.155184831, 'message_signature': '7a8e80a66856775a2bb71a3347f9709cb3ad594c5fb9b0047db7211b1ce7c529'}]}, 'timestamp': '2026-01-22 00:08:23.372990', '_unique_id': '623cc1fbca4242119d3f3a37df77275c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.373 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.374 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.374 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '326b62b3-c9da-4a90-a1e7-0d5e4ce19334', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': 'instance-00000069-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-tap1d405d7f-e3', 'timestamp': '2026-01-22T00:08:23.374505', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'tap1d405d7f-e3', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1d405d7f-e3'}, 'message_id': '777960da-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.155184831, 'message_signature': '35ed5f671db786ff8e10b6c6ca91a3c5c5f24befd8a8ddf2fbb939d90c7683f1'}]}, 'timestamp': '2026-01-22 00:08:23.374775', '_unique_id': 'cb074f7b23ed4884b36b80749c54cc39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.375 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.376 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.376 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.376 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1267757132>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1267757132>]
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.376 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.397 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/memory.usage volume: 40.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f79839f4-7ecb-48eb-8df1-9abfff69ebed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.38671875, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'timestamp': '2026-01-22T00:08:23.376674', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '777cf3ee-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.191900019, 'message_signature': '0467fde123df2ecb38a1c02ad38fd1829320f4629a8134ffb4c09683c0c13d5c'}]}, 'timestamp': '2026-01-22 00:08:23.398252', '_unique_id': 'ad3afe2310d04f1ab54649fb22de0876'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.401 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.401 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc514452-f985-4a27-9cb8-3589ecc70869', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': 'instance-00000069-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-tap1d405d7f-e3', 'timestamp': '2026-01-22T00:08:23.401124', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'tap1d405d7f-e3', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1d405d7f-e3'}, 'message_id': '777d71c0-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.155184831, 'message_signature': '66193d8f01dea83be35bb506ec123ad7d7fcf5c9d362a41949ddd250e953e031'}]}, 'timestamp': '2026-01-22 00:08:23.401434', '_unique_id': 'f436983ad94b477e97cd99a264ce603d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.402 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23054cef-d232-46c7-93c9-e04be2e3f25a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': 'instance-00000069-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-tap1d405d7f-e3', 'timestamp': '2026-01-22T00:08:23.403014', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'tap1d405d7f-e3', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1d405d7f-e3'}, 'message_id': '777dbb4e-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.155184831, 'message_signature': '892b985aa872f607aeeceead56e8a9b3742d3416aaae7a3974103e58eb86b067'}]}, 'timestamp': '2026-01-22 00:08:23.403333', '_unique_id': '0729c4bbfa5744d4bacaac75dc72f021'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.404 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/cpu volume: 12160000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab4812aa-f26d-42c0-b5ef-600c3ba940d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12160000000, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'timestamp': '2026-01-22T00:08:23.405122', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '777e0d7e-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.191900019, 'message_signature': '6b3f5200d6fe446476e4a09962a7b66fdf93427432f408cd6f2a5d84b5994fde'}]}, 'timestamp': '2026-01-22 00:08:23.405413', '_unique_id': '1030758b44c84a458a850e1ed508c041'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.406 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.406 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.write.requests volume: 242 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.407 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26f5346c-5cd0-47ad-b3ab-a8e634269b98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 242, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-vda', 'timestamp': '2026-01-22T00:08:23.406916', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '777e52e8-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': '2b95d932663f3560bd7d405794ea3f03a7caabdc9f538ae26676f00f4228d154'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-sda', 'timestamp': '2026-01-22T00:08:23.406916', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '777e5c48-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': '9acd0f0a05d24d8358b968b5a760c88715750775d01dd03e55d2866a7786d820'}]}, 'timestamp': '2026-01-22 00:08:23.407414', '_unique_id': 'dc1ab9a73329431ea3df6813a25eacd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.409 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.409 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aabe6efd-d882-4bde-942a-bf5f4f6cb7ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': 'instance-00000069-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-tap1d405d7f-e3', 'timestamp': '2026-01-22T00:08:23.409328', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'tap1d405d7f-e3', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1d405d7f-e3'}, 'message_id': '777eb1b6-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.155184831, 'message_signature': 'ede5f4110c996a559cafe2feb449eec828f6ed3f6211646f3dd67c50787d49e4'}]}, 'timestamp': '2026-01-22 00:08:23.409644', '_unique_id': 'fdb0f2b1f005434491c940da6499187f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.410 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.411 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.411 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/network.incoming.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1c9a287-fb9f-4b75-b29f-5253b14ac3c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': 'instance-00000069-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-tap1d405d7f-e3', 'timestamp': '2026-01-22T00:08:23.411268', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'tap1d405d7f-e3', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1d405d7f-e3'}, 'message_id': '777efe0a-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.155184831, 'message_signature': 'c2abc20237b6ef3df5440ff3c235b63c782ca5b50bb6d77720216ccfde049879'}]}, 'timestamp': '2026-01-22 00:08:23.411608', '_unique_id': 'ceeb6e11229c42fabb6bfc0251c119d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.412 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.413 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.413 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.413 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1267757132>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-DeleteServersTestJSON-server-1267757132>]
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.413 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.413 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.read.requests volume: 1034 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.read.requests volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be8c4f1a-63b8-4dc1-8c3a-e0db0d973963', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1034, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-vda', 'timestamp': '2026-01-22T00:08:23.413835', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '777f612e-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': '461b754e1a9fe933f22c37304065a7e409fdb5c9c7353f010308c8de76025bb2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 95, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-sda', 'timestamp': '2026-01-22T00:08:23.413835', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '777f6a7a-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': 'dbcad39bd678487f98e5032f15a8606fff9dc2834c6ed7030608034c1ffb6314'}]}, 'timestamp': '2026-01-22 00:08:23.414330', '_unique_id': '088acf2a55624c7ca29c5e9fae64a7b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.415 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.425 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.usage volume: 29294592 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.425 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7da8e57d-49a1-44b5-90c0-eb0f48f66b1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29294592, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-vda', 'timestamp': '2026-01-22T00:08:23.416056', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '778129fa-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.210330525, 'message_signature': '81277d52433675115b46efc39a3d2440e8c09232da1d84b99f8851cf751e07c1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-sda', 'timestamp': '2026-01-22T00:08:23.416056', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7781381e-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.210330525, 'message_signature': '97431fceb183bc6ab4f83d1efc4644e99b9ebcc1ea2ae2d5dae8048791ee62e8'}]}, 'timestamp': '2026-01-22 00:08:23.426157', '_unique_id': '6c150fac2da04e06b8e040d3e2aa9e92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.427 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.428 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.write.bytes volume: 25681920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.428 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6d1e279-2952-40ad-ab8f-cf41eee739f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25681920, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-vda', 'timestamp': '2026-01-22T00:08:23.428104', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77818e9a-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': '17ec823c2d319443f7fb75b81556019864b8e067fea8b7504e98434360dbaeb7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-sda', 'timestamp': '2026-01-22T00:08:23.428104', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7781994e-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': '7eb90f26fdc9c2be8fd9115d63c1fcd186bff629e822db63a38db18aca2014d6'}]}, 'timestamp': '2026-01-22 00:08:23.428639', '_unique_id': 'cce99e8c23954fc9b6ada37ff386f9e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.430 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.430 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.430 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a44c6d1-3d22-4b58-84ad-c6376b64917d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-vda', 'timestamp': '2026-01-22T00:08:23.430325', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7781e5ca-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.210330525, 'message_signature': '8f2f8c40ed8da811323883cc859e59b695ee377bdd5b3617f823943e611c90fc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-sda', 'timestamp': '2026-01-22T00:08:23.430325', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7781f146-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.210330525, 'message_signature': '9258b147447d08e3019f756458121a97779898c5b557438b278afdea3242cd36'}]}, 'timestamp': '2026-01-22 00:08:23.430916', '_unique_id': 'fbe5ae5285cf401ca2fd7154c25cb4f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.431 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.432 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.432 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.write.latency volume: 6650470946 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.432 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f8fbae6-1895-497e-a1ee-1142927dbe52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6650470946, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-vda', 'timestamp': '2026-01-22T00:08:23.432637', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77824060-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': '46e395fee5a654f85df33f0356e4bfc093207ea9174574b067fa1bc3ba1a55ee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-sda', 'timestamp': '2026-01-22T00:08:23.432637', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77824b78-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.110757703, 'message_signature': '125bed325f55f3aa482eb31da90d1aeb59eb9d0dcda832517ca4202b81d16aca'}]}, 'timestamp': '2026-01-22 00:08:23.433203', '_unique_id': 'e8b24f6e89234b2ca6e839cb3e8bd187'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.434 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.434 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f0c8c68-9ec9-482b-b821-6f3b80d0aa76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-vda', 'timestamp': '2026-01-22T00:08:23.434730', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77829218-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.210330525, 'message_signature': '061d466ceb931e70424fc61eb8e9b1730da22a32bd8bd6161c4000b02079bd87'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-sda', 'timestamp': '2026-01-22T00:08:23.434730', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'instance-00000069', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77829b6e-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.210330525, 'message_signature': '59493323ced77ea5c6b50f7f93a28bdb262e85a0b8bc2543d0913df88abff74f'}]}, 'timestamp': '2026-01-22 00:08:23.435244', '_unique_id': '48c453b4537a4939b6f6087d738db87e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.435 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.436 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.436 12 DEBUG ceilometer.compute.pollsters [-] 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cf82017-701b-4685-aa79-5223139e5d94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_name': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_name': None, 'resource_id': 'instance-00000069-9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-tap1d405d7f-e3', 'timestamp': '2026-01-22T00:08:23.436793', 'resource_metadata': {'display_name': 'tempest-DeleteServersTestJSON-server-1267757132', 'name': 'tap1d405d7f-e3', 'instance_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'instance_type': 'm1.nano', 'host': 'b63bbb47297c314861954a08ca75437c67a8812ea164700e6343cb03', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f6:53:88', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1d405d7f-e3'}, 'message_id': '7782e7d6-f726-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5006.155184831, 'message_signature': '0450ac7514f642d33082ab2782b508ed421976d13942cb78c0e7b89bdc0b3026'}]}, 'timestamp': '2026-01-22 00:08:23.437305', '_unique_id': '143016f2ac774b1b8b40eded357ff3d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:08:23.437 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:08:24 compute-0 ovn_controller[95047]: 2026-01-22T00:08:24Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f6:53:88 10.100.0.14
Jan 22 00:08:24 compute-0 ovn_controller[95047]: 2026-01-22T00:08:24Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f6:53:88 10.100.0.14
Jan 22 00:08:26 compute-0 nova_compute[182935]: 2026-01-22 00:08:26.547 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:26 compute-0 nova_compute[182935]: 2026-01-22 00:08:26.819 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:29 compute-0 podman[229451]: 2026-01-22 00:08:29.738666004 +0000 UTC m=+0.109113293 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:08:31 compute-0 nova_compute[182935]: 2026-01-22 00:08:31.449 182939 DEBUG nova.virt.libvirt.driver [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:08:31 compute-0 nova_compute[182935]: 2026-01-22 00:08:31.550 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:31 compute-0 nova_compute[182935]: 2026-01-22 00:08:31.822 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:34 compute-0 kernel: tap1d405d7f-e3 (unregistering): left promiscuous mode
Jan 22 00:08:34 compute-0 NetworkManager[55139]: <info>  [1769040514.0518] device (tap1d405d7f-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:08:34 compute-0 ovn_controller[95047]: 2026-01-22T00:08:34Z|00442|binding|INFO|Releasing lport 1d405d7f-e331-40a8-bc0c-242ed82d7807 from this chassis (sb_readonly=0)
Jan 22 00:08:34 compute-0 ovn_controller[95047]: 2026-01-22T00:08:34Z|00443|binding|INFO|Setting lport 1d405d7f-e331-40a8-bc0c-242ed82d7807 down in Southbound
Jan 22 00:08:34 compute-0 ovn_controller[95047]: 2026-01-22T00:08:34Z|00444|binding|INFO|Removing iface tap1d405d7f-e3 ovn-installed in OVS
Jan 22 00:08:34 compute-0 nova_compute[182935]: 2026-01-22 00:08:34.060 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:34 compute-0 nova_compute[182935]: 2026-01-22 00:08:34.062 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.071 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:53:88 10.100.0.14'], port_security=['fa:16:3e:f6:53:88 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=1d405d7f-e331-40a8-bc0c-242ed82d7807) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.072 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 1d405d7f-e331-40a8-bc0c-242ed82d7807 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e unbound from our chassis
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.074 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d94993bc-77ac-42d2-88cb-3b0110dff29e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.076 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f5bd7db2-fa6d-4f82-b25e-686dc96961f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.076 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace which is not needed anymore
Jan 22 00:08:34 compute-0 nova_compute[182935]: 2026-01-22 00:08:34.078 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:34 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000069.scope: Deactivated successfully.
Jan 22 00:08:34 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000069.scope: Consumed 14.703s CPU time.
Jan 22 00:08:34 compute-0 systemd-machined[154182]: Machine qemu-57-instance-00000069 terminated.
Jan 22 00:08:34 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229332]: [NOTICE]   (229372) : haproxy version is 2.8.14-c23fe91
Jan 22 00:08:34 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229332]: [NOTICE]   (229372) : path to executable is /usr/sbin/haproxy
Jan 22 00:08:34 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229332]: [WARNING]  (229372) : Exiting Master process...
Jan 22 00:08:34 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229332]: [ALERT]    (229372) : Current worker (229374) exited with code 143 (Terminated)
Jan 22 00:08:34 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229332]: [WARNING]  (229372) : All workers exited. Exiting... (0)
Jan 22 00:08:34 compute-0 systemd[1]: libpod-03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117.scope: Deactivated successfully.
Jan 22 00:08:34 compute-0 podman[229500]: 2026-01-22 00:08:34.20961709 +0000 UTC m=+0.042045083 container died 03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:08:34 compute-0 nova_compute[182935]: 2026-01-22 00:08:34.466 182939 INFO nova.virt.libvirt.driver [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Instance shutdown successfully after 24 seconds.
Jan 22 00:08:34 compute-0 nova_compute[182935]: 2026-01-22 00:08:34.472 182939 INFO nova.virt.libvirt.driver [-] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Instance destroyed successfully.
Jan 22 00:08:34 compute-0 nova_compute[182935]: 2026-01-22 00:08:34.472 182939 DEBUG nova.objects.instance [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:34 compute-0 nova_compute[182935]: 2026-01-22 00:08:34.542 182939 DEBUG nova.compute.manager [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:34 compute-0 nova_compute[182935]: 2026-01-22 00:08:34.631 182939 DEBUG oslo_concurrency.lockutils [None req-75b325ba-7ede-4fa5-b998-e612fc3ee74a 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 24.388s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117-userdata-shm.mount: Deactivated successfully.
Jan 22 00:08:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-1fac0f755b561350ffd853229bafea7ec96fd51be183810374b4908e6a0c2347-merged.mount: Deactivated successfully.
Jan 22 00:08:34 compute-0 podman[229500]: 2026-01-22 00:08:34.664682927 +0000 UTC m=+0.497110920 container cleanup 03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:08:34 compute-0 podman[229547]: 2026-01-22 00:08:34.853783378 +0000 UTC m=+0.167822050 container remove 03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.860 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[975b13da-d0f2-484f-841a-977cfbd5ef17]: (4, ('Thu Jan 22 12:08:34 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117)\n03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117\nThu Jan 22 12:08:34 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117)\n03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.862 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ebbdb589-d980-4a7e-bb4b-0b1c14ee5f31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.863 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:34 compute-0 nova_compute[182935]: 2026-01-22 00:08:34.865 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:34 compute-0 kernel: tapd94993bc-70: left promiscuous mode
Jan 22 00:08:34 compute-0 systemd[1]: libpod-conmon-03393e06e9702a7f4382aa16bca8303f22723cf6be06ae0b8c84b904fddc2117.scope: Deactivated successfully.
Jan 22 00:08:34 compute-0 nova_compute[182935]: 2026-01-22 00:08:34.879 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.884 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[87b6eab1-c866-4fdd-83bc-2d82ebaba38c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.896 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[935b706e-e222-4803-8b2e-b9e43e52029b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.897 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[36611f61-65fa-446d-b599-69e39662977d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.914 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[971a7e0a-28ae-4c50-b178-aceaf92d446f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499031, 'reachable_time': 24411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229572, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:34 compute-0 systemd[1]: run-netns-ovnmeta\x2dd94993bc\x2d77ac\x2d42d2\x2d88cb\x2d3b0110dff29e.mount: Deactivated successfully.
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.917 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:08:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:34.917 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[249bb681-7560-4c40-9003-e4dfc97a35b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:34 compute-0 podman[229561]: 2026-01-22 00:08:34.947709519 +0000 UTC m=+0.050385456 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.680 182939 DEBUG nova.compute.manager [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Received event network-vif-unplugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.681 182939 DEBUG oslo_concurrency.lockutils [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.681 182939 DEBUG oslo_concurrency.lockutils [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.681 182939 DEBUG oslo_concurrency.lockutils [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.681 182939 DEBUG nova.compute.manager [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] No waiting events found dispatching network-vif-unplugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.682 182939 WARNING nova.compute.manager [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Received unexpected event network-vif-unplugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 for instance with vm_state stopped and task_state None.
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.682 182939 DEBUG nova.compute.manager [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Received event network-vif-plugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.682 182939 DEBUG oslo_concurrency.lockutils [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.682 182939 DEBUG oslo_concurrency.lockutils [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.682 182939 DEBUG oslo_concurrency.lockutils [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.683 182939 DEBUG nova.compute.manager [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] No waiting events found dispatching network-vif-plugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.683 182939 WARNING nova.compute.manager [req-5e44ed03-eba6-41f2-a16b-3668d4d21c40 req-e1fef98e-b5a4-4c09-b681-85737ff710a2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Received unexpected event network-vif-plugged-1d405d7f-e331-40a8-bc0c-242ed82d7807 for instance with vm_state stopped and task_state None.
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.876 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.877 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:08:35 compute-0 nova_compute[182935]: 2026-01-22 00:08:35.898 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.552 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.822 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.822 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.823 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.823 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.866 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.930 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.988 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:36 compute-0 nova_compute[182935]: 2026-01-22 00:08:36.989 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.083 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.247 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.249 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5697MB free_disk=73.09935760498047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.249 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.249 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.533 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.534 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.535 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.557 182939 DEBUG oslo_concurrency.lockutils [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.558 182939 DEBUG oslo_concurrency.lockutils [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.559 182939 DEBUG oslo_concurrency.lockutils [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.559 182939 DEBUG oslo_concurrency.lockutils [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.560 182939 DEBUG oslo_concurrency.lockutils [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.578 182939 INFO nova.compute.manager [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Terminating instance
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.591 182939 DEBUG nova.compute.manager [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.602 182939 INFO nova.virt.libvirt.driver [-] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Instance destroyed successfully.
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.603 182939 DEBUG nova.objects.instance [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'resources' on Instance uuid 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.607 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.620 182939 DEBUG nova.virt.libvirt.vif [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:07:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1267757132',display_name='tempest-DeleteServersTestJSON-server-1267757132',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1267757132',id=105,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:08:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-0fci23xo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:08:34Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "address": "fa:16:3e:f6:53:88", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d405d7f-e3", "ovs_interfaceid": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.621 182939 DEBUG nova.network.os_vif_util [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "address": "fa:16:3e:f6:53:88", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1d405d7f-e3", "ovs_interfaceid": "1d405d7f-e331-40a8-bc0c-242ed82d7807", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.623 182939 DEBUG nova.network.os_vif_util [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:88,bridge_name='br-int',has_traffic_filtering=True,id=1d405d7f-e331-40a8-bc0c-242ed82d7807,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d405d7f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.623 182939 DEBUG os_vif [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:88,bridge_name='br-int',has_traffic_filtering=True,id=1d405d7f-e331-40a8-bc0c-242ed82d7807,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d405d7f-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.627 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.628 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1d405d7f-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.631 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.637 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.640 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.644 182939 INFO os_vif [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:53:88,bridge_name='br-int',has_traffic_filtering=True,id=1d405d7f-e331-40a8-bc0c-242ed82d7807,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1d405d7f-e3')
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.645 182939 INFO nova.virt.libvirt.driver [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Deleting instance files /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef_del
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.646 182939 INFO nova.virt.libvirt.driver [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Deletion of /var/lib/nova/instances/9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef_del complete
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.669 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.670 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.719 182939 INFO nova.compute.manager [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Took 0.13 seconds to destroy the instance on the hypervisor.
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.720 182939 DEBUG oslo.service.loopingcall [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.720 182939 DEBUG nova.compute.manager [-] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:08:37 compute-0 nova_compute[182935]: 2026-01-22 00:08:37.720 182939 DEBUG nova.network.neutron [-] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:08:38 compute-0 podman[229592]: 2026-01-22 00:08:38.709470034 +0000 UTC m=+0.075205080 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:08:38 compute-0 podman[229593]: 2026-01-22 00:08:38.731914542 +0000 UTC m=+0.092668223 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:08:39 compute-0 nova_compute[182935]: 2026-01-22 00:08:39.076 182939 DEBUG nova.network.neutron [-] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:39 compute-0 nova_compute[182935]: 2026-01-22 00:08:39.105 182939 INFO nova.compute.manager [-] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Took 1.38 seconds to deallocate network for instance.
Jan 22 00:08:39 compute-0 nova_compute[182935]: 2026-01-22 00:08:39.123 182939 DEBUG nova.compute.manager [req-7c31689e-49ef-41c2-872e-06291b4e8695 req-1815d36d-0ac7-47d9-a5aa-312281bf0092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Received event network-vif-deleted-1d405d7f-e331-40a8-bc0c-242ed82d7807 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:39 compute-0 nova_compute[182935]: 2026-01-22 00:08:39.271 182939 DEBUG oslo_concurrency.lockutils [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:39 compute-0 nova_compute[182935]: 2026-01-22 00:08:39.272 182939 DEBUG oslo_concurrency.lockutils [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:39 compute-0 nova_compute[182935]: 2026-01-22 00:08:39.334 182939 DEBUG nova.compute.provider_tree [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:08:39 compute-0 nova_compute[182935]: 2026-01-22 00:08:39.359 182939 DEBUG nova.scheduler.client.report [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:08:39 compute-0 nova_compute[182935]: 2026-01-22 00:08:39.380 182939 DEBUG oslo_concurrency.lockutils [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:39 compute-0 nova_compute[182935]: 2026-01-22 00:08:39.408 182939 INFO nova.scheduler.client.report [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Deleted allocations for instance 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef
Jan 22 00:08:39 compute-0 nova_compute[182935]: 2026-01-22 00:08:39.918 182939 DEBUG oslo_concurrency.lockutils [None req-2c894684-e865-4ff3-98d6-f70e6a02d5d1 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.360s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:40 compute-0 nova_compute[182935]: 2026-01-22 00:08:40.666 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:41 compute-0 sshd-session[229633]: Invalid user svn from 188.166.69.60 port 60960
Jan 22 00:08:41 compute-0 sshd-session[229633]: Connection closed by invalid user svn 188.166.69.60 port 60960 [preauth]
Jan 22 00:08:41 compute-0 nova_compute[182935]: 2026-01-22 00:08:41.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:41 compute-0 nova_compute[182935]: 2026-01-22 00:08:41.869 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:42 compute-0 nova_compute[182935]: 2026-01-22 00:08:42.632 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:42 compute-0 nova_compute[182935]: 2026-01-22 00:08:42.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:44 compute-0 nova_compute[182935]: 2026-01-22 00:08:44.725 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "6198539d-ef01-4d47-b9af-745c9885d0e3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:44 compute-0 nova_compute[182935]: 2026-01-22 00:08:44.725 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:44 compute-0 nova_compute[182935]: 2026-01-22 00:08:44.742 182939 DEBUG nova.compute.manager [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:08:44 compute-0 nova_compute[182935]: 2026-01-22 00:08:44.940 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:44 compute-0 nova_compute[182935]: 2026-01-22 00:08:44.941 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:44 compute-0 nova_compute[182935]: 2026-01-22 00:08:44.947 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:08:44 compute-0 nova_compute[182935]: 2026-01-22 00:08:44.947 182939 INFO nova.compute.claims [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:08:45 compute-0 nova_compute[182935]: 2026-01-22 00:08:45.210 182939 DEBUG nova.compute.provider_tree [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:08:45 compute-0 nova_compute[182935]: 2026-01-22 00:08:45.228 182939 DEBUG nova.scheduler.client.report [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:08:45 compute-0 nova_compute[182935]: 2026-01-22 00:08:45.281 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:45 compute-0 nova_compute[182935]: 2026-01-22 00:08:45.282 182939 DEBUG nova.compute.manager [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:08:45 compute-0 nova_compute[182935]: 2026-01-22 00:08:45.608 182939 DEBUG nova.compute.manager [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:08:45 compute-0 nova_compute[182935]: 2026-01-22 00:08:45.609 182939 DEBUG nova.network.neutron [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:08:45 compute-0 nova_compute[182935]: 2026-01-22 00:08:45.633 182939 INFO nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:08:45 compute-0 nova_compute[182935]: 2026-01-22 00:08:45.670 182939 DEBUG nova.compute.manager [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:08:45 compute-0 nova_compute[182935]: 2026-01-22 00:08:45.902 182939 DEBUG nova.policy [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.186 182939 DEBUG nova.compute.manager [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.187 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.188 182939 INFO nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Creating image(s)
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.189 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "/var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.189 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.190 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.204 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.274 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.276 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.277 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.292 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.357 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.358 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.394 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.395 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.396 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.465 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.466 182939 DEBUG nova.virt.disk.api [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Checking if we can resize image /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.467 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.534 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.535 182939 DEBUG nova.virt.disk.api [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Cannot resize image /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.536 182939 DEBUG nova.objects.instance [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'migration_context' on Instance uuid 6198539d-ef01-4d47-b9af-745c9885d0e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.557 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.558 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Ensure instance console log exists: /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.558 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.558 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.559 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.667 182939 DEBUG nova.network.neutron [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Successfully created port: 4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:08:46 compute-0 nova_compute[182935]: 2026-01-22 00:08:46.871 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:47 compute-0 nova_compute[182935]: 2026-01-22 00:08:47.635 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:48 compute-0 nova_compute[182935]: 2026-01-22 00:08:48.099 182939 DEBUG nova.network.neutron [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Successfully updated port: 4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:08:48 compute-0 nova_compute[182935]: 2026-01-22 00:08:48.118 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "refresh_cache-6198539d-ef01-4d47-b9af-745c9885d0e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:08:48 compute-0 nova_compute[182935]: 2026-01-22 00:08:48.119 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquired lock "refresh_cache-6198539d-ef01-4d47-b9af-745c9885d0e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:08:48 compute-0 nova_compute[182935]: 2026-01-22 00:08:48.119 182939 DEBUG nova.network.neutron [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:08:48 compute-0 nova_compute[182935]: 2026-01-22 00:08:48.231 182939 DEBUG nova.compute.manager [req-afedae33-424a-402d-8a1e-97134fa8c60e req-aade3eb0-6a68-40cd-b30a-f5db6fe88752 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Received event network-changed-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:48 compute-0 nova_compute[182935]: 2026-01-22 00:08:48.231 182939 DEBUG nova.compute.manager [req-afedae33-424a-402d-8a1e-97134fa8c60e req-aade3eb0-6a68-40cd-b30a-f5db6fe88752 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Refreshing instance network info cache due to event network-changed-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:08:48 compute-0 nova_compute[182935]: 2026-01-22 00:08:48.232 182939 DEBUG oslo_concurrency.lockutils [req-afedae33-424a-402d-8a1e-97134fa8c60e req-aade3eb0-6a68-40cd-b30a-f5db6fe88752 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-6198539d-ef01-4d47-b9af-745c9885d0e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:08:49 compute-0 nova_compute[182935]: 2026-01-22 00:08:49.018 182939 DEBUG nova.network.neutron [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:08:49 compute-0 nova_compute[182935]: 2026-01-22 00:08:49.324 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040514.323627, 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:49 compute-0 nova_compute[182935]: 2026-01-22 00:08:49.325 182939 INFO nova.compute.manager [-] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] VM Stopped (Lifecycle Event)
Jan 22 00:08:49 compute-0 nova_compute[182935]: 2026-01-22 00:08:49.349 182939 DEBUG nova.compute.manager [None req-e8f1b5f8-f268-4d1c-9354-93c617350dbb - - - - - -] [instance: 9ccfcf6b-49ea-4486-9cc6-68e20b24e5ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.141 182939 DEBUG nova.network.neutron [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Updating instance_info_cache with network_info: [{"id": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "address": "fa:16:3e:ad:df:13", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca3ccbe-b9", "ovs_interfaceid": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.165 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Releasing lock "refresh_cache-6198539d-ef01-4d47-b9af-745c9885d0e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.165 182939 DEBUG nova.compute.manager [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Instance network_info: |[{"id": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "address": "fa:16:3e:ad:df:13", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca3ccbe-b9", "ovs_interfaceid": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.166 182939 DEBUG oslo_concurrency.lockutils [req-afedae33-424a-402d-8a1e-97134fa8c60e req-aade3eb0-6a68-40cd-b30a-f5db6fe88752 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-6198539d-ef01-4d47-b9af-745c9885d0e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.166 182939 DEBUG nova.network.neutron [req-afedae33-424a-402d-8a1e-97134fa8c60e req-aade3eb0-6a68-40cd-b30a-f5db6fe88752 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Refreshing network info cache for port 4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.169 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Start _get_guest_xml network_info=[{"id": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "address": "fa:16:3e:ad:df:13", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca3ccbe-b9", "ovs_interfaceid": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.174 182939 WARNING nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.180 182939 DEBUG nova.virt.libvirt.host [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.180 182939 DEBUG nova.virt.libvirt.host [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.188 182939 DEBUG nova.virt.libvirt.host [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.188 182939 DEBUG nova.virt.libvirt.host [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.190 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.190 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.191 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.191 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.191 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.192 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.192 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.192 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.193 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.193 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.193 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.193 182939 DEBUG nova.virt.hardware [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.198 182939 DEBUG nova.virt.libvirt.vif [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:08:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-29828546',display_name='tempest-DeleteServersTestJSON-server-29828546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-29828546',id=109,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-bh0q9q28',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:45Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=6198539d-ef01-4d47-b9af-745c9885d0e3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "address": "fa:16:3e:ad:df:13", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca3ccbe-b9", "ovs_interfaceid": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.199 182939 DEBUG nova.network.os_vif_util [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "address": "fa:16:3e:ad:df:13", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca3ccbe-b9", "ovs_interfaceid": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.199 182939 DEBUG nova.network.os_vif_util [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:df:13,bridge_name='br-int',has_traffic_filtering=True,id=4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca3ccbe-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.200 182939 DEBUG nova.objects.instance [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6198539d-ef01-4d47-b9af-745c9885d0e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.217 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:08:50 compute-0 nova_compute[182935]:   <uuid>6198539d-ef01-4d47-b9af-745c9885d0e3</uuid>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   <name>instance-0000006d</name>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <nova:name>tempest-DeleteServersTestJSON-server-29828546</nova:name>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:08:50</nova:creationTime>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:08:50 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:08:50 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:08:50 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:08:50 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:08:50 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:08:50 compute-0 nova_compute[182935]:         <nova:user uuid="74ad1bf274924c52af96aa4c6d431410">tempest-DeleteServersTestJSON-2033458913-project-member</nova:user>
Jan 22 00:08:50 compute-0 nova_compute[182935]:         <nova:project uuid="3822e32efd5647aebf2d79a3dd038bd4">tempest-DeleteServersTestJSON-2033458913</nova:project>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:08:50 compute-0 nova_compute[182935]:         <nova:port uuid="4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345">
Jan 22 00:08:50 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <system>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <entry name="serial">6198539d-ef01-4d47-b9af-745c9885d0e3</entry>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <entry name="uuid">6198539d-ef01-4d47-b9af-745c9885d0e3</entry>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     </system>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   <os>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   </os>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   <features>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   </features>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk.config"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:ad:df:13"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <target dev="tap4ca3ccbe-b9"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/console.log" append="off"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <video>
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     </video>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:08:50 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:08:50 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:08:50 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:08:50 compute-0 nova_compute[182935]: </domain>
Jan 22 00:08:50 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.218 182939 DEBUG nova.compute.manager [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Preparing to wait for external event network-vif-plugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.219 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.220 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.220 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.222 182939 DEBUG nova.virt.libvirt.vif [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:08:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-29828546',display_name='tempest-DeleteServersTestJSON-server-29828546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-29828546',id=109,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-bh0q9q28',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:08:45Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=6198539d-ef01-4d47-b9af-745c9885d0e3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "address": "fa:16:3e:ad:df:13", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca3ccbe-b9", "ovs_interfaceid": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.222 182939 DEBUG nova.network.os_vif_util [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "address": "fa:16:3e:ad:df:13", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca3ccbe-b9", "ovs_interfaceid": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.223 182939 DEBUG nova.network.os_vif_util [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:df:13,bridge_name='br-int',has_traffic_filtering=True,id=4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca3ccbe-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.224 182939 DEBUG os_vif [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:df:13,bridge_name='br-int',has_traffic_filtering=True,id=4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca3ccbe-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.225 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.226 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.226 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.231 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.232 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ca3ccbe-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.232 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4ca3ccbe-b9, col_values=(('external_ids', {'iface-id': '4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:df:13', 'vm-uuid': '6198539d-ef01-4d47-b9af-745c9885d0e3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:50 compute-0 NetworkManager[55139]: <info>  [1769040530.2351] manager: (tap4ca3ccbe-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.234 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.236 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.240 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.240 182939 INFO os_vif [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:df:13,bridge_name='br-int',has_traffic_filtering=True,id=4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca3ccbe-b9')
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.306 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.308 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.308 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No VIF found with MAC fa:16:3e:ad:df:13, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:08:50 compute-0 nova_compute[182935]: 2026-01-22 00:08:50.309 182939 INFO nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Using config drive
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.312 182939 INFO nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Creating config drive at /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk.config
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.321 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcks8eq69 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.471 182939 DEBUG oslo_concurrency.processutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcks8eq69" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:08:51 compute-0 kernel: tap4ca3ccbe-b9: entered promiscuous mode
Jan 22 00:08:51 compute-0 NetworkManager[55139]: <info>  [1769040531.5531] manager: (tap4ca3ccbe-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Jan 22 00:08:51 compute-0 ovn_controller[95047]: 2026-01-22T00:08:51Z|00445|binding|INFO|Claiming lport 4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 for this chassis.
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.557 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:51 compute-0 ovn_controller[95047]: 2026-01-22T00:08:51Z|00446|binding|INFO|4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345: Claiming fa:16:3e:ad:df:13 10.100.0.6
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.569 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:df:13 10.100.0.6'], port_security=['fa:16:3e:ad:df:13 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6198539d-ef01-4d47-b9af-745c9885d0e3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:08:51 compute-0 ovn_controller[95047]: 2026-01-22T00:08:51Z|00447|binding|INFO|Setting lport 4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 ovn-installed in OVS
Jan 22 00:08:51 compute-0 ovn_controller[95047]: 2026-01-22T00:08:51Z|00448|binding|INFO|Setting lport 4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 up in Southbound
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.572 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e bound to our chassis
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.573 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.575 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.576 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.589 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ba9a10e7-ec32-4889-9a71-9d377f2514c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.591 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd94993bc-71 in ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:08:51 compute-0 systemd-udevd[229670]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.597 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd94993bc-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.597 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7a4fed-06ee-49b3-a81d-fece7584cd9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.598 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5332cae1-89dd-4eef-bc51-7cdbc1809e15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 systemd-machined[154182]: New machine qemu-58-instance-0000006d.
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.612 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[b8157f2d-4188-4afc-852f-ea4b94b8db0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 NetworkManager[55139]: <info>  [1769040531.6143] device (tap4ca3ccbe-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:08:51 compute-0 NetworkManager[55139]: <info>  [1769040531.6159] device (tap4ca3ccbe-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.628 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d41ddd51-2bab-4e08-a449-b9c671948f77]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 systemd[1]: Started Virtual Machine qemu-58-instance-0000006d.
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.665 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7e99449a-8ad6-4d73-8564-2ea75644b131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 NetworkManager[55139]: <info>  [1769040531.6739] manager: (tapd94993bc-70): new Veth device (/org/freedesktop/NetworkManager/Devices/209)
Jan 22 00:08:51 compute-0 systemd-udevd[229674]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.673 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[36e7f9bb-0ba3-4070-b9ae-63ae6e331a00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.706 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[ea91625c-aac5-40ce-8a19-5d151825f2a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.709 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[6d4198e2-186f-4aff-bc5a-59fa46dc22c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 NetworkManager[55139]: <info>  [1769040531.7384] device (tapd94993bc-70): carrier: link connected
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.744 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[83e651dd-fca7-4521-ae10-34a8cf254e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.766 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[37999858-ca7d-4f1a-9330-44bdef213d7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503447, 'reachable_time': 20251, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229703, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.792 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[66b27e4d-5116-4f12-9af1-48249352f1db]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:eecd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 503447, 'tstamp': 503447}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229704, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.814 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e317edce-b009-4fde-bb4b-68b884af0a37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503447, 'reachable_time': 20251, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229707, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.860 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[eacc2188-39a4-4e99-b378-a7a3a637b4f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.873 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.906 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040531.9051945, 6198539d-ef01-4d47-b9af-745c9885d0e3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.907 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] VM Started (Lifecycle Event)
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.931 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa745cc-05dc-4e26-a23e-aabc60388891]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.933 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.933 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.933 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd94993bc-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:51 compute-0 NetworkManager[55139]: <info>  [1769040531.9365] manager: (tapd94993bc-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Jan 22 00:08:51 compute-0 kernel: tapd94993bc-70: entered promiscuous mode
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.938 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.940 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.940 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd94993bc-70, col_values=(('external_ids', {'iface-id': 'd921ee25-8f8a-4375-9839-6c54ab328e88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:51 compute-0 ovn_controller[95047]: 2026-01-22T00:08:51Z|00449|binding|INFO|Releasing lport d921ee25-8f8a-4375-9839-6c54ab328e88 from this chassis (sb_readonly=0)
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.943 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.944 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[64bf2bca-579d-4880-9149-7af1e777692c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.947 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.947 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040531.9053164, 6198539d-ef01-4d47-b9af-745c9885d0e3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.948 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] VM Paused (Lifecycle Event)
Jan 22 00:08:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:51.948 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'env', 'PROCESS_TAG=haproxy-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d94993bc-77ac-42d2-88cb-3b0110dff29e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.955 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.974 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:51 compute-0 nova_compute[182935]: 2026-01-22 00:08:51.978 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:08:52 compute-0 nova_compute[182935]: 2026-01-22 00:08:52.011 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:08:52 compute-0 podman[229744]: 2026-01-22 00:08:52.319788425 +0000 UTC m=+0.027039867 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:08:52 compute-0 nova_compute[182935]: 2026-01-22 00:08:52.440 182939 DEBUG nova.network.neutron [req-afedae33-424a-402d-8a1e-97134fa8c60e req-aade3eb0-6a68-40cd-b30a-f5db6fe88752 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Updated VIF entry in instance network info cache for port 4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:08:52 compute-0 nova_compute[182935]: 2026-01-22 00:08:52.440 182939 DEBUG nova.network.neutron [req-afedae33-424a-402d-8a1e-97134fa8c60e req-aade3eb0-6a68-40cd-b30a-f5db6fe88752 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Updating instance_info_cache with network_info: [{"id": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "address": "fa:16:3e:ad:df:13", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca3ccbe-b9", "ovs_interfaceid": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:08:52 compute-0 nova_compute[182935]: 2026-01-22 00:08:52.460 182939 DEBUG oslo_concurrency.lockutils [req-afedae33-424a-402d-8a1e-97134fa8c60e req-aade3eb0-6a68-40cd-b30a-f5db6fe88752 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-6198539d-ef01-4d47-b9af-745c9885d0e3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:08:52 compute-0 podman[229744]: 2026-01-22 00:08:52.48315281 +0000 UTC m=+0.190404222 container create 80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 00:08:52 compute-0 systemd[1]: Started libpod-conmon-80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409.scope.
Jan 22 00:08:52 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:08:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f9244ff3cedd98fc61cea2634b3ff1d90e6a286084418edb3b7cf1602ff091c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:08:52 compute-0 podman[229744]: 2026-01-22 00:08:52.587631865 +0000 UTC m=+0.294883307 container init 80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 00:08:52 compute-0 podman[229744]: 2026-01-22 00:08:52.594479633 +0000 UTC m=+0.301731045 container start 80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:08:52 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229760]: [NOTICE]   (229764) : New worker (229766) forked
Jan 22 00:08:52 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229760]: [NOTICE]   (229764) : Loading success.
Jan 22 00:08:53 compute-0 podman[229776]: 2026-01-22 00:08:53.713833274 +0000 UTC m=+0.073547870 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:08:53 compute-0 podman[229775]: 2026-01-22 00:08:53.73875711 +0000 UTC m=+0.105191502 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.995 182939 DEBUG nova.compute.manager [req-d7ab3161-b5f4-44c6-8a62-9c94a39f918d req-946bd783-68ff-4f14-af74-c7c2cf20409d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Received event network-vif-plugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.996 182939 DEBUG oslo_concurrency.lockutils [req-d7ab3161-b5f4-44c6-8a62-9c94a39f918d req-946bd783-68ff-4f14-af74-c7c2cf20409d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.996 182939 DEBUG oslo_concurrency.lockutils [req-d7ab3161-b5f4-44c6-8a62-9c94a39f918d req-946bd783-68ff-4f14-af74-c7c2cf20409d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.996 182939 DEBUG oslo_concurrency.lockutils [req-d7ab3161-b5f4-44c6-8a62-9c94a39f918d req-946bd783-68ff-4f14-af74-c7c2cf20409d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.996 182939 DEBUG nova.compute.manager [req-d7ab3161-b5f4-44c6-8a62-9c94a39f918d req-946bd783-68ff-4f14-af74-c7c2cf20409d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Processing event network-vif-plugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.996 182939 DEBUG nova.compute.manager [req-d7ab3161-b5f4-44c6-8a62-9c94a39f918d req-946bd783-68ff-4f14-af74-c7c2cf20409d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Received event network-vif-plugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.997 182939 DEBUG oslo_concurrency.lockutils [req-d7ab3161-b5f4-44c6-8a62-9c94a39f918d req-946bd783-68ff-4f14-af74-c7c2cf20409d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.997 182939 DEBUG oslo_concurrency.lockutils [req-d7ab3161-b5f4-44c6-8a62-9c94a39f918d req-946bd783-68ff-4f14-af74-c7c2cf20409d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.997 182939 DEBUG oslo_concurrency.lockutils [req-d7ab3161-b5f4-44c6-8a62-9c94a39f918d req-946bd783-68ff-4f14-af74-c7c2cf20409d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.997 182939 DEBUG nova.compute.manager [req-d7ab3161-b5f4-44c6-8a62-9c94a39f918d req-946bd783-68ff-4f14-af74-c7c2cf20409d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] No waiting events found dispatching network-vif-plugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.997 182939 WARNING nova.compute.manager [req-d7ab3161-b5f4-44c6-8a62-9c94a39f918d req-946bd783-68ff-4f14-af74-c7c2cf20409d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Received unexpected event network-vif-plugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 for instance with vm_state building and task_state spawning.
Jan 22 00:08:53 compute-0 nova_compute[182935]: 2026-01-22 00:08:53.999 182939 DEBUG nova.compute.manager [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.006 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040534.005677, 6198539d-ef01-4d47-b9af-745c9885d0e3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.006 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] VM Resumed (Lifecycle Event)
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.009 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.014 182939 INFO nova.virt.libvirt.driver [-] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Instance spawned successfully.
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.015 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.034 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.042 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.046 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.046 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.047 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.047 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.048 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.048 182939 DEBUG nova.virt.libvirt.driver [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.320 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.423 182939 INFO nova.compute.manager [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Took 8.24 seconds to spawn the instance on the hypervisor.
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.424 182939 DEBUG nova.compute.manager [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.604 182939 INFO nova.compute.manager [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Took 9.80 seconds to build instance.
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.749 182939 DEBUG oslo_concurrency.lockutils [None req-ff78a145-aa36-41eb-950c-82430d1f06c7 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:54 compute-0 nova_compute[182935]: 2026-01-22 00:08:54.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:08:55 compute-0 nova_compute[182935]: 2026-01-22 00:08:55.237 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:55 compute-0 nova_compute[182935]: 2026-01-22 00:08:55.520 182939 DEBUG nova.objects.instance [None req-8ee81270-09d9-47fc-9039-280f050614eb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6198539d-ef01-4d47-b9af-745c9885d0e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:55 compute-0 nova_compute[182935]: 2026-01-22 00:08:55.556 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040535.555485, 6198539d-ef01-4d47-b9af-745c9885d0e3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:08:55 compute-0 nova_compute[182935]: 2026-01-22 00:08:55.556 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] VM Paused (Lifecycle Event)
Jan 22 00:08:55 compute-0 nova_compute[182935]: 2026-01-22 00:08:55.635 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:55 compute-0 nova_compute[182935]: 2026-01-22 00:08:55.640 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:08:55 compute-0 nova_compute[182935]: 2026-01-22 00:08:55.860 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 22 00:08:56 compute-0 kernel: tap4ca3ccbe-b9 (unregistering): left promiscuous mode
Jan 22 00:08:56 compute-0 NetworkManager[55139]: <info>  [1769040536.2493] device (tap4ca3ccbe-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:08:56 compute-0 ovn_controller[95047]: 2026-01-22T00:08:56Z|00450|binding|INFO|Releasing lport 4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 from this chassis (sb_readonly=0)
Jan 22 00:08:56 compute-0 ovn_controller[95047]: 2026-01-22T00:08:56Z|00451|binding|INFO|Setting lport 4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 down in Southbound
Jan 22 00:08:56 compute-0 ovn_controller[95047]: 2026-01-22T00:08:56Z|00452|binding|INFO|Removing iface tap4ca3ccbe-b9 ovn-installed in OVS
Jan 22 00:08:56 compute-0 nova_compute[182935]: 2026-01-22 00:08:56.262 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:56 compute-0 nova_compute[182935]: 2026-01-22 00:08:56.282 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.287 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ad:df:13 10.100.0.6'], port_security=['fa:16:3e:ad:df:13 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '6198539d-ef01-4d47-b9af-745c9885d0e3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.290 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e unbound from our chassis
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.293 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d94993bc-77ac-42d2-88cb-3b0110dff29e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.295 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[066a5a68-3ed9-437d-a6f4-78c685d9bbc9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.295 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace which is not needed anymore
Jan 22 00:08:56 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Jan 22 00:08:56 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000006d.scope: Consumed 1.955s CPU time.
Jan 22 00:08:56 compute-0 systemd-machined[154182]: Machine qemu-58-instance-0000006d terminated.
Jan 22 00:08:56 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229760]: [NOTICE]   (229764) : haproxy version is 2.8.14-c23fe91
Jan 22 00:08:56 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229760]: [NOTICE]   (229764) : path to executable is /usr/sbin/haproxy
Jan 22 00:08:56 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229760]: [WARNING]  (229764) : Exiting Master process...
Jan 22 00:08:56 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229760]: [ALERT]    (229764) : Current worker (229766) exited with code 143 (Terminated)
Jan 22 00:08:56 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[229760]: [WARNING]  (229764) : All workers exited. Exiting... (0)
Jan 22 00:08:56 compute-0 systemd[1]: libpod-80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409.scope: Deactivated successfully.
Jan 22 00:08:56 compute-0 podman[229853]: 2026-01-22 00:08:56.471848289 +0000 UTC m=+0.051217815 container died 80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:08:56 compute-0 nova_compute[182935]: 2026-01-22 00:08:56.493 182939 DEBUG nova.compute.manager [None req-8ee81270-09d9-47fc-9039-280f050614eb 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:08:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409-userdata-shm.mount: Deactivated successfully.
Jan 22 00:08:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-5f9244ff3cedd98fc61cea2634b3ff1d90e6a286084418edb3b7cf1602ff091c-merged.mount: Deactivated successfully.
Jan 22 00:08:56 compute-0 podman[229853]: 2026-01-22 00:08:56.514276279 +0000 UTC m=+0.093645795 container cleanup 80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:08:56 compute-0 systemd[1]: libpod-conmon-80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409.scope: Deactivated successfully.
Jan 22 00:08:56 compute-0 podman[229896]: 2026-01-22 00:08:56.591653458 +0000 UTC m=+0.049943065 container remove 80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.597 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a672f8-aff4-44a3-aaee-55f85c76720e]: (4, ('Thu Jan 22 12:08:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409)\n80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409\nThu Jan 22 12:08:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409)\n80ab0cdab5aec1a0bbbbc844db0102ad30cb9932cfe4c6721d4cb726b4e92409\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.600 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0d834746-ed5e-41b2-8ddc-be34ee7fb8ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.601 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:56 compute-0 nova_compute[182935]: 2026-01-22 00:08:56.603 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:56 compute-0 kernel: tapd94993bc-70: left promiscuous mode
Jan 22 00:08:56 compute-0 nova_compute[182935]: 2026-01-22 00:08:56.619 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.623 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb96d46-d7fb-4ef3-884e-201bd1c8d6db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.640 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6405dad6-224f-483c-b766-80fdc465aa4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.641 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1bb9afb4-7d40-4e78-8eb7-1a3d0aa1b49f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.657 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5f10fe40-6c16-4f4e-b320-dc03ddaa207a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 503439, 'reachable_time': 21699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229916, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:56 compute-0 systemd[1]: run-netns-ovnmeta\x2dd94993bc\x2d77ac\x2d42d2\x2d88cb\x2d3b0110dff29e.mount: Deactivated successfully.
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.661 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:08:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:08:56.662 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[a92a8bcb-51da-45cd-ad47-1ed306bb4dc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:08:56 compute-0 nova_compute[182935]: 2026-01-22 00:08:56.876 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:57 compute-0 nova_compute[182935]: 2026-01-22 00:08:57.271 182939 DEBUG nova.compute.manager [req-d22490fe-cdb7-4215-9406-18416c4f2693 req-be4c7580-7dc5-478e-9ef0-47ca3d205fc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Received event network-vif-unplugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:57 compute-0 nova_compute[182935]: 2026-01-22 00:08:57.272 182939 DEBUG oslo_concurrency.lockutils [req-d22490fe-cdb7-4215-9406-18416c4f2693 req-be4c7580-7dc5-478e-9ef0-47ca3d205fc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:57 compute-0 nova_compute[182935]: 2026-01-22 00:08:57.273 182939 DEBUG oslo_concurrency.lockutils [req-d22490fe-cdb7-4215-9406-18416c4f2693 req-be4c7580-7dc5-478e-9ef0-47ca3d205fc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:57 compute-0 nova_compute[182935]: 2026-01-22 00:08:57.273 182939 DEBUG oslo_concurrency.lockutils [req-d22490fe-cdb7-4215-9406-18416c4f2693 req-be4c7580-7dc5-478e-9ef0-47ca3d205fc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:57 compute-0 nova_compute[182935]: 2026-01-22 00:08:57.274 182939 DEBUG nova.compute.manager [req-d22490fe-cdb7-4215-9406-18416c4f2693 req-be4c7580-7dc5-478e-9ef0-47ca3d205fc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] No waiting events found dispatching network-vif-unplugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:08:57 compute-0 nova_compute[182935]: 2026-01-22 00:08:57.275 182939 WARNING nova.compute.manager [req-d22490fe-cdb7-4215-9406-18416c4f2693 req-be4c7580-7dc5-478e-9ef0-47ca3d205fc6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Received unexpected event network-vif-unplugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 for instance with vm_state suspended and task_state None.
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.741 182939 DEBUG oslo_concurrency.lockutils [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "6198539d-ef01-4d47-b9af-745c9885d0e3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.742 182939 DEBUG oslo_concurrency.lockutils [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.743 182939 DEBUG oslo_concurrency.lockutils [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.743 182939 DEBUG oslo_concurrency.lockutils [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.744 182939 DEBUG oslo_concurrency.lockutils [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.761 182939 INFO nova.compute.manager [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Terminating instance
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.776 182939 DEBUG nova.compute.manager [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.787 182939 INFO nova.virt.libvirt.driver [-] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Instance destroyed successfully.
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.787 182939 DEBUG nova.objects.instance [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'resources' on Instance uuid 6198539d-ef01-4d47-b9af-745c9885d0e3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.822 182939 DEBUG nova.virt.libvirt.vif [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:08:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-29828546',display_name='tempest-DeleteServersTestJSON-server-29828546',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-29828546',id=109,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:08:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-bh0q9q28',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:08:56Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=6198539d-ef01-4d47-b9af-745c9885d0e3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "address": "fa:16:3e:ad:df:13", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca3ccbe-b9", "ovs_interfaceid": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.823 182939 DEBUG nova.network.os_vif_util [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "address": "fa:16:3e:ad:df:13", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca3ccbe-b9", "ovs_interfaceid": "4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.824 182939 DEBUG nova.network.os_vif_util [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:df:13,bridge_name='br-int',has_traffic_filtering=True,id=4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca3ccbe-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.825 182939 DEBUG os_vif [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:df:13,bridge_name='br-int',has_traffic_filtering=True,id=4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca3ccbe-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.827 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.828 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ca3ccbe-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.833 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.838 182939 INFO os_vif [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:df:13,bridge_name='br-int',has_traffic_filtering=True,id=4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca3ccbe-b9')
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.839 182939 INFO nova.virt.libvirt.driver [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Deleting instance files /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3_del
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.840 182939 INFO nova.virt.libvirt.driver [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Deletion of /var/lib/nova/instances/6198539d-ef01-4d47-b9af-745c9885d0e3_del complete
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.972 182939 INFO nova.compute.manager [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Took 0.20 seconds to destroy the instance on the hypervisor.
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.975 182939 DEBUG oslo.service.loopingcall [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.976 182939 DEBUG nova.compute.manager [-] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:08:58 compute-0 nova_compute[182935]: 2026-01-22 00:08:58.976 182939 DEBUG nova.network.neutron [-] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:08:59 compute-0 nova_compute[182935]: 2026-01-22 00:08:59.607 182939 DEBUG nova.compute.manager [req-01b02a0b-3411-46ef-a99f-8c55d9038431 req-b091401a-b831-47be-8317-1ef82d7a27a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Received event network-vif-plugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:08:59 compute-0 nova_compute[182935]: 2026-01-22 00:08:59.608 182939 DEBUG oslo_concurrency.lockutils [req-01b02a0b-3411-46ef-a99f-8c55d9038431 req-b091401a-b831-47be-8317-1ef82d7a27a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:08:59 compute-0 nova_compute[182935]: 2026-01-22 00:08:59.608 182939 DEBUG oslo_concurrency.lockutils [req-01b02a0b-3411-46ef-a99f-8c55d9038431 req-b091401a-b831-47be-8317-1ef82d7a27a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:08:59 compute-0 nova_compute[182935]: 2026-01-22 00:08:59.608 182939 DEBUG oslo_concurrency.lockutils [req-01b02a0b-3411-46ef-a99f-8c55d9038431 req-b091401a-b831-47be-8317-1ef82d7a27a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:08:59 compute-0 nova_compute[182935]: 2026-01-22 00:08:59.609 182939 DEBUG nova.compute.manager [req-01b02a0b-3411-46ef-a99f-8c55d9038431 req-b091401a-b831-47be-8317-1ef82d7a27a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] No waiting events found dispatching network-vif-plugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:08:59 compute-0 nova_compute[182935]: 2026-01-22 00:08:59.609 182939 WARNING nova.compute.manager [req-01b02a0b-3411-46ef-a99f-8c55d9038431 req-b091401a-b831-47be-8317-1ef82d7a27a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Received unexpected event network-vif-plugged-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 for instance with vm_state suspended and task_state deleting.
Jan 22 00:09:00 compute-0 podman[229917]: 2026-01-22 00:09:00.715527431 +0000 UTC m=+0.082429576 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:09:00 compute-0 nova_compute[182935]: 2026-01-22 00:09:00.959 182939 DEBUG nova.network.neutron [-] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:01 compute-0 nova_compute[182935]: 2026-01-22 00:09:01.009 182939 INFO nova.compute.manager [-] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Took 2.03 seconds to deallocate network for instance.
Jan 22 00:09:01 compute-0 nova_compute[182935]: 2026-01-22 00:09:01.152 182939 DEBUG oslo_concurrency.lockutils [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:01 compute-0 nova_compute[182935]: 2026-01-22 00:09:01.153 182939 DEBUG oslo_concurrency.lockutils [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:01 compute-0 nova_compute[182935]: 2026-01-22 00:09:01.258 182939 DEBUG nova.compute.provider_tree [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:09:01 compute-0 nova_compute[182935]: 2026-01-22 00:09:01.279 182939 DEBUG nova.scheduler.client.report [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:09:01 compute-0 nova_compute[182935]: 2026-01-22 00:09:01.654 182939 DEBUG oslo_concurrency.lockutils [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.501s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:01 compute-0 nova_compute[182935]: 2026-01-22 00:09:01.778 182939 DEBUG nova.compute.manager [req-845270ed-ac15-472a-95bc-657f9c62152f req-f601878f-0be2-4594-ad45-78eeaf110d50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Received event network-vif-deleted-4ca3ccbe-b9f7-4cab-9547-13bdfe7dc345 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:01 compute-0 nova_compute[182935]: 2026-01-22 00:09:01.849 182939 INFO nova.scheduler.client.report [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Deleted allocations for instance 6198539d-ef01-4d47-b9af-745c9885d0e3
Jan 22 00:09:01 compute-0 nova_compute[182935]: 2026-01-22 00:09:01.880 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:02 compute-0 nova_compute[182935]: 2026-01-22 00:09:02.072 182939 DEBUG oslo_concurrency.lockutils [None req-28b88776-984f-4ce0-a207-16f15539be28 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "6198539d-ef01-4d47-b9af-745c9885d0e3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:03.205 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:03.206 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:03.206 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:03 compute-0 nova_compute[182935]: 2026-01-22 00:09:03.832 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:05 compute-0 podman[229940]: 2026-01-22 00:09:05.70196945 +0000 UTC m=+0.069828685 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:09:06 compute-0 nova_compute[182935]: 2026-01-22 00:09:06.950 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.245 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.245 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.281 182939 DEBUG nova.compute.manager [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.524 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.525 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.532 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.533 182939 INFO nova.compute.claims [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.834 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.890 182939 DEBUG nova.compute.provider_tree [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.908 182939 DEBUG nova.scheduler.client.report [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.936 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:08 compute-0 nova_compute[182935]: 2026-01-22 00:09:08.937 182939 DEBUG nova.compute.manager [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.015 182939 DEBUG nova.compute.manager [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.016 182939 DEBUG nova.network.neutron [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.048 182939 INFO nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.080 182939 DEBUG nova.compute.manager [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.217 182939 DEBUG nova.compute.manager [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.219 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.220 182939 INFO nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Creating image(s)
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.221 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.221 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.223 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.244 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.312 182939 DEBUG nova.policy [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '74ad1bf274924c52af96aa4c6d431410', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.343 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.345 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.345 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.357 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.417 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:09 compute-0 nova_compute[182935]: 2026-01-22 00:09:09.419 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:10 compute-0 nova_compute[182935]: 2026-01-22 00:09:10.767 182939 DEBUG nova.network.neutron [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Successfully created port: bc1908f8-b8ef-40b4-9e46-8e8664065a89 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:09:10 compute-0 podman[229968]: 2026-01-22 00:09:10.862768751 +0000 UTC m=+0.075685050 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, vcs-type=git)
Jan 22 00:09:10 compute-0 podman[229969]: 2026-01-22 00:09:10.904440664 +0000 UTC m=+0.106874160 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 22 00:09:10 compute-0 nova_compute[182935]: 2026-01-22 00:09:10.902 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk 1073741824" returned: 0 in 1.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:10 compute-0 nova_compute[182935]: 2026-01-22 00:09:10.904 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:10 compute-0 nova_compute[182935]: 2026-01-22 00:09:10.904 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:10 compute-0 nova_compute[182935]: 2026-01-22 00:09:10.972 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:10 compute-0 nova_compute[182935]: 2026-01-22 00:09:10.973 182939 DEBUG nova.virt.disk.api [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Checking if we can resize image /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:09:10 compute-0 nova_compute[182935]: 2026-01-22 00:09:10.973 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.063 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.064 182939 DEBUG nova.virt.disk.api [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Cannot resize image /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.065 182939 DEBUG nova.objects.instance [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'migration_context' on Instance uuid 07bda903-2298-433c-aa7d-9a50380e24f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.082 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.083 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Ensure instance console log exists: /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.083 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.083 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.084 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.495 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040536.4931033, 6198539d-ef01-4d47-b9af-745c9885d0e3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.496 182939 INFO nova.compute.manager [-] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] VM Stopped (Lifecycle Event)
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.522 182939 DEBUG nova.compute.manager [None req-80f91478-c85a-4f92-a5fc-a491b8c55369 - - - - - -] [instance: 6198539d-ef01-4d47-b9af-745c9885d0e3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.731 182939 DEBUG nova.network.neutron [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Successfully updated port: bc1908f8-b8ef-40b4-9e46-8e8664065a89 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.750 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.750 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquired lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.751 182939 DEBUG nova.network.neutron [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.898 182939 DEBUG nova.compute.manager [req-544ff59b-505b-4b66-b919-eb01a8f1afce req-0fc6ee74-b77f-4288-a4c7-773a82d9f165 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-changed-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.898 182939 DEBUG nova.compute.manager [req-544ff59b-505b-4b66-b919-eb01a8f1afce req-0fc6ee74-b77f-4288-a4c7-773a82d9f165 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Refreshing instance network info cache due to event network-changed-bc1908f8-b8ef-40b4-9e46-8e8664065a89. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.898 182939 DEBUG oslo_concurrency.lockutils [req-544ff59b-505b-4b66-b919-eb01a8f1afce req-0fc6ee74-b77f-4288-a4c7-773a82d9f165 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.952 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:11 compute-0 nova_compute[182935]: 2026-01-22 00:09:11.961 182939 DEBUG nova.network.neutron [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.333 182939 DEBUG nova.network.neutron [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating instance_info_cache with network_info: [{"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.354 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Releasing lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.355 182939 DEBUG nova.compute.manager [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Instance network_info: |[{"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.355 182939 DEBUG oslo_concurrency.lockutils [req-544ff59b-505b-4b66-b919-eb01a8f1afce req-0fc6ee74-b77f-4288-a4c7-773a82d9f165 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.355 182939 DEBUG nova.network.neutron [req-544ff59b-505b-4b66-b919-eb01a8f1afce req-0fc6ee74-b77f-4288-a4c7-773a82d9f165 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Refreshing network info cache for port bc1908f8-b8ef-40b4-9e46-8e8664065a89 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.360 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Start _get_guest_xml network_info=[{"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.366 182939 WARNING nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.373 182939 DEBUG nova.virt.libvirt.host [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.374 182939 DEBUG nova.virt.libvirt.host [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.379 182939 DEBUG nova.virt.libvirt.host [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.379 182939 DEBUG nova.virt.libvirt.host [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.381 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.381 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.381 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.381 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.382 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.382 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.382 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.382 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.383 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.383 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.383 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.383 182939 DEBUG nova.virt.hardware [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.387 182939 DEBUG nova.virt.libvirt.vif [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:09:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-554893725',display_name='tempest-DeleteServersTestJSON-server-554893725',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-554893725',id=111,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-4j2wmp3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-20
33458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:09:09Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=07bda903-2298-433c-aa7d-9a50380e24f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.388 182939 DEBUG nova.network.os_vif_util [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.389 182939 DEBUG nova.network.os_vif_util [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.390 182939 DEBUG nova.objects.instance [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 07bda903-2298-433c-aa7d-9a50380e24f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.408 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:09:13 compute-0 nova_compute[182935]:   <uuid>07bda903-2298-433c-aa7d-9a50380e24f1</uuid>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   <name>instance-0000006f</name>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <nova:name>tempest-DeleteServersTestJSON-server-554893725</nova:name>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:09:13</nova:creationTime>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:09:13 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:09:13 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:09:13 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:09:13 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:09:13 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:09:13 compute-0 nova_compute[182935]:         <nova:user uuid="74ad1bf274924c52af96aa4c6d431410">tempest-DeleteServersTestJSON-2033458913-project-member</nova:user>
Jan 22 00:09:13 compute-0 nova_compute[182935]:         <nova:project uuid="3822e32efd5647aebf2d79a3dd038bd4">tempest-DeleteServersTestJSON-2033458913</nova:project>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:09:13 compute-0 nova_compute[182935]:         <nova:port uuid="bc1908f8-b8ef-40b4-9e46-8e8664065a89">
Jan 22 00:09:13 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <system>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <entry name="serial">07bda903-2298-433c-aa7d-9a50380e24f1</entry>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <entry name="uuid">07bda903-2298-433c-aa7d-9a50380e24f1</entry>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     </system>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   <os>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   </os>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   <features>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   </features>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.config"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:45:56:97"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <target dev="tapbc1908f8-b8"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/console.log" append="off"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <video>
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     </video>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:09:13 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:09:13 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:09:13 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:09:13 compute-0 nova_compute[182935]: </domain>
Jan 22 00:09:13 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.410 182939 DEBUG nova.compute.manager [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Preparing to wait for external event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.411 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.411 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.411 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.412 182939 DEBUG nova.virt.libvirt.vif [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:09:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-554893725',display_name='tempest-DeleteServersTestJSON-server-554893725',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-554893725',id=111,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-4j2wmp3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:09:09Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=07bda903-2298-433c-aa7d-9a50380e24f1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.412 182939 DEBUG nova.network.os_vif_util [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.413 182939 DEBUG nova.network.os_vif_util [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.413 182939 DEBUG os_vif [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.414 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.414 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.415 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.417 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.418 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc1908f8-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.418 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbc1908f8-b8, col_values=(('external_ids', {'iface-id': 'bc1908f8-b8ef-40b4-9e46-8e8664065a89', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:56:97', 'vm-uuid': '07bda903-2298-433c-aa7d-9a50380e24f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.419 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-0 NetworkManager[55139]: <info>  [1769040553.4206] manager: (tapbc1908f8-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/211)
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.421 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.429 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.430 182939 INFO os_vif [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8')
Jan 22 00:09:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:13.501 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.501 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:13.504 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.815 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.815 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.815 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] No VIF found with MAC fa:16:3e:45:56:97, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:09:13 compute-0 nova_compute[182935]: 2026-01-22 00:09:13.816 182939 INFO nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Using config drive
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.318 182939 INFO nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Creating config drive at /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.config
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.323 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_e45_bro execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.452 182939 DEBUG oslo_concurrency.processutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_e45_bro" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:14 compute-0 kernel: tapbc1908f8-b8: entered promiscuous mode
Jan 22 00:09:14 compute-0 NetworkManager[55139]: <info>  [1769040554.5279] manager: (tapbc1908f8-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Jan 22 00:09:14 compute-0 ovn_controller[95047]: 2026-01-22T00:09:14Z|00453|binding|INFO|Claiming lport bc1908f8-b8ef-40b4-9e46-8e8664065a89 for this chassis.
Jan 22 00:09:14 compute-0 ovn_controller[95047]: 2026-01-22T00:09:14Z|00454|binding|INFO|bc1908f8-b8ef-40b4-9e46-8e8664065a89: Claiming fa:16:3e:45:56:97 10.100.0.12
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.528 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:14 compute-0 ovn_controller[95047]: 2026-01-22T00:09:14Z|00455|binding|INFO|Setting lport bc1908f8-b8ef-40b4-9e46-8e8664065a89 ovn-installed in OVS
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.541 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.543 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:14 compute-0 systemd-udevd[230035]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:09:14 compute-0 NetworkManager[55139]: <info>  [1769040554.5741] device (tapbc1908f8-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:09:14 compute-0 NetworkManager[55139]: <info>  [1769040554.5748] device (tapbc1908f8-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:09:14 compute-0 systemd-machined[154182]: New machine qemu-59-instance-0000006f.
Jan 22 00:09:14 compute-0 systemd[1]: Started Virtual Machine qemu-59-instance-0000006f.
Jan 22 00:09:14 compute-0 ovn_controller[95047]: 2026-01-22T00:09:14Z|00456|binding|INFO|Setting lport bc1908f8-b8ef-40b4-9e46-8e8664065a89 up in Southbound
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.810 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:56:97 10.100.0.12'], port_security=['fa:16:3e:45:56:97 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '07bda903-2298-433c-aa7d-9a50380e24f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bc1908f8-b8ef-40b4-9e46-8e8664065a89) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.811 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bc1908f8-b8ef-40b4-9e46-8e8664065a89 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e bound to our chassis
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.813 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.824 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b7e58f-4f84-42e3-816b-0c08072c9c93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.825 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd94993bc-71 in ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.828 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd94993bc-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.828 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cef45459-9d88-4233-92ea-34b6be31af45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.829 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[357a7b46-25d2-4bf6-94e6-91efe37135e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.848 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[a3714577-179f-445b-a905-2c186f0f0cbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.880 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9aa729e0-b189-4f0a-8b55-2b66ac7f6ba2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.911 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[84425b8f-f128-42b6-878d-98044e61b9a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:14 compute-0 systemd-udevd[230038]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:09:14 compute-0 NetworkManager[55139]: <info>  [1769040554.9205] manager: (tapd94993bc-70): new Veth device (/org/freedesktop/NetworkManager/Devices/213)
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.918 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7c249256-c5ea-4c3b-b533-87e60ae0ba14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.925 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040554.9242263, 07bda903-2298-433c-aa7d-9a50380e24f1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.925 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] VM Started (Lifecycle Event)
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.952 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[3395762d-f409-4cea-bd45-b5206dfb4739]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.953 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.956 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[0c5ca0ab-9287-40a6-9e96-b032c3e54517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.958 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040554.924357, 07bda903-2298-433c-aa7d-9a50380e24f1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.959 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] VM Paused (Lifecycle Event)
Jan 22 00:09:14 compute-0 NetworkManager[55139]: <info>  [1769040554.9761] device (tapd94993bc-70): carrier: link connected
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.980 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:14 compute-0 nova_compute[182935]: 2026-01-22 00:09:14.983 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:09:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:14.986 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b237accb-486e-4c6c-9104-f2f6fee8b16f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.007 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[73cfeac1-1db8-4e76-9623-6fe3ffecfb94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505771, 'reachable_time': 44115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230076, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.029 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f9261099-1426-4c2a-ab4f-10faeb2cde50]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:eecd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505771, 'tstamp': 505771}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230077, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.044 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[367e486d-dd9f-40f7-86a1-2268769a6086]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd94993bc-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:ee:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505771, 'reachable_time': 44115, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230078, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.047 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.075 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a9c519-5984-4e95-9a9e-69a345cd0782]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.137 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb20c11-fdcf-416f-9211-f7a3eb1361b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.138 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.138 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.139 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd94993bc-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.178 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:15 compute-0 kernel: tapd94993bc-70: entered promiscuous mode
Jan 22 00:09:15 compute-0 NetworkManager[55139]: <info>  [1769040555.1809] manager: (tapd94993bc-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.181 182939 DEBUG nova.network.neutron [req-544ff59b-505b-4b66-b919-eb01a8f1afce req-0fc6ee74-b77f-4288-a4c7-773a82d9f165 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updated VIF entry in instance network info cache for port bc1908f8-b8ef-40b4-9e46-8e8664065a89. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.182 182939 DEBUG nova.network.neutron [req-544ff59b-505b-4b66-b919-eb01a8f1afce req-0fc6ee74-b77f-4288-a4c7-773a82d9f165 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating instance_info_cache with network_info: [{"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.182 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd94993bc-70, col_values=(('external_ids', {'iface-id': 'd921ee25-8f8a-4375-9839-6c54ab328e88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:15 compute-0 ovn_controller[95047]: 2026-01-22T00:09:15Z|00457|binding|INFO|Releasing lport d921ee25-8f8a-4375-9839-6c54ab328e88 from this chassis (sb_readonly=0)
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.186 182939 DEBUG nova.compute.manager [req-0f2a800f-2870-4c40-a968-5ee3fb708f8c req-300b285d-78c7-474d-bfa4-057dd08147c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.186 182939 DEBUG oslo_concurrency.lockutils [req-0f2a800f-2870-4c40-a968-5ee3fb708f8c req-300b285d-78c7-474d-bfa4-057dd08147c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.186 182939 DEBUG oslo_concurrency.lockutils [req-0f2a800f-2870-4c40-a968-5ee3fb708f8c req-300b285d-78c7-474d-bfa4-057dd08147c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.187 182939 DEBUG oslo_concurrency.lockutils [req-0f2a800f-2870-4c40-a968-5ee3fb708f8c req-300b285d-78c7-474d-bfa4-057dd08147c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.187 182939 DEBUG nova.compute.manager [req-0f2a800f-2870-4c40-a968-5ee3fb708f8c req-300b285d-78c7-474d-bfa4-057dd08147c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Processing event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.187 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.189 182939 DEBUG nova.compute.manager [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.195 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040555.195596, 07bda903-2298-433c-aa7d-9a50380e24f1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.196 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] VM Resumed (Lifecycle Event)
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.198 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.201 182939 DEBUG oslo_concurrency.lockutils [req-544ff59b-505b-4b66-b919-eb01a8f1afce req-0fc6ee74-b77f-4288-a4c7-773a82d9f165 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.202 182939 INFO nova.virt.libvirt.driver [-] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Instance spawned successfully.
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.202 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.206 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.207 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.208 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc5aff8-7ac9-4530-b56a-33c2a4a9e2f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.209 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/d94993bc-77ac-42d2-88cb-3b0110dff29e.pid.haproxy
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID d94993bc-77ac-42d2-88cb-3b0110dff29e
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:09:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:15.209 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'env', 'PROCESS_TAG=haproxy-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d94993bc-77ac-42d2-88cb-3b0110dff29e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.218 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.227 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.230 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.231 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.231 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.232 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.232 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.232 182939 DEBUG nova.virt.libvirt.driver [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:15 compute-0 nova_compute[182935]: 2026-01-22 00:09:15.285 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:09:15 compute-0 podman[230110]: 2026-01-22 00:09:15.566249472 +0000 UTC m=+0.030137098 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:09:16 compute-0 nova_compute[182935]: 2026-01-22 00:09:16.023 182939 INFO nova.compute.manager [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Took 6.81 seconds to spawn the instance on the hypervisor.
Jan 22 00:09:16 compute-0 nova_compute[182935]: 2026-01-22 00:09:16.024 182939 DEBUG nova.compute.manager [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:16 compute-0 nova_compute[182935]: 2026-01-22 00:09:16.208 182939 INFO nova.compute.manager [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Took 7.73 seconds to build instance.
Jan 22 00:09:16 compute-0 nova_compute[182935]: 2026-01-22 00:09:16.243 182939 DEBUG oslo_concurrency.lockutils [None req-a338bc83-44fe-4c22-88e9-efef1fdc4a69 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.998s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:16 compute-0 podman[230110]: 2026-01-22 00:09:16.44367235 +0000 UTC m=+0.907559926 container create 19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:09:16 compute-0 systemd[1]: Started libpod-conmon-19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449.scope.
Jan 22 00:09:16 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:09:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d627493690f2e8e674c09c86cc0ced295e0cfa538249ffb5544bee379410231/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:09:16 compute-0 nova_compute[182935]: 2026-01-22 00:09:16.955 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:16 compute-0 podman[230110]: 2026-01-22 00:09:16.990660873 +0000 UTC m=+1.454548499 container init 19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:09:16 compute-0 podman[230110]: 2026-01-22 00:09:16.999112519 +0000 UTC m=+1.463000085 container start 19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:09:17 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[230126]: [NOTICE]   (230130) : New worker (230132) forked
Jan 22 00:09:17 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[230126]: [NOTICE]   (230130) : Loading success.
Jan 22 00:09:17 compute-0 nova_compute[182935]: 2026-01-22 00:09:17.530 182939 DEBUG nova.compute.manager [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:17 compute-0 nova_compute[182935]: 2026-01-22 00:09:17.530 182939 DEBUG oslo_concurrency.lockutils [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:17 compute-0 nova_compute[182935]: 2026-01-22 00:09:17.531 182939 DEBUG oslo_concurrency.lockutils [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:17 compute-0 nova_compute[182935]: 2026-01-22 00:09:17.531 182939 DEBUG oslo_concurrency.lockutils [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:17 compute-0 nova_compute[182935]: 2026-01-22 00:09:17.532 182939 DEBUG nova.compute.manager [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] No waiting events found dispatching network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:17 compute-0 nova_compute[182935]: 2026-01-22 00:09:17.532 182939 WARNING nova.compute.manager [req-c516740d-1950-410c-b6a4-8bbe239d59bb req-1333342a-b872-48a3-b848-d3093498c1ee 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received unexpected event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 for instance with vm_state active and task_state None.
Jan 22 00:09:18 compute-0 nova_compute[182935]: 2026-01-22 00:09:18.421 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:21 compute-0 nova_compute[182935]: 2026-01-22 00:09:21.957 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:22 compute-0 nova_compute[182935]: 2026-01-22 00:09:22.165 182939 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:22 compute-0 nova_compute[182935]: 2026-01-22 00:09:22.165 182939 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquired lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:22 compute-0 nova_compute[182935]: 2026-01-22 00:09:22.166 182939 DEBUG nova.network.neutron [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:09:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:22.506 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:23 compute-0 nova_compute[182935]: 2026-01-22 00:09:23.423 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:24 compute-0 nova_compute[182935]: 2026-01-22 00:09:24.246 182939 DEBUG nova.network.neutron [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating instance_info_cache with network_info: [{"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:24 compute-0 nova_compute[182935]: 2026-01-22 00:09:24.274 182939 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Releasing lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:24 compute-0 nova_compute[182935]: 2026-01-22 00:09:24.411 182939 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 00:09:24 compute-0 nova_compute[182935]: 2026-01-22 00:09:24.411 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Creating file /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/07c55cc59d074e0a9301d4933f321947.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 22 00:09:24 compute-0 nova_compute[182935]: 2026-01-22 00:09:24.411 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/07c55cc59d074e0a9301d4933f321947.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:24 compute-0 podman[230143]: 2026-01-22 00:09:24.68068783 +0000 UTC m=+0.055860832 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:09:24 compute-0 podman[230142]: 2026-01-22 00:09:24.718868983 +0000 UTC m=+0.088084716 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:09:24 compute-0 nova_compute[182935]: 2026-01-22 00:09:24.922 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/07c55cc59d074e0a9301d4933f321947.tmp" returned: 1 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:24 compute-0 nova_compute[182935]: 2026-01-22 00:09:24.923 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/07c55cc59d074e0a9301d4933f321947.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 00:09:24 compute-0 nova_compute[182935]: 2026-01-22 00:09:24.923 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Creating directory /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 22 00:09:24 compute-0 nova_compute[182935]: 2026-01-22 00:09:24.923 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:25 compute-0 sshd-session[230193]: Invalid user svn from 188.166.69.60 port 34638
Jan 22 00:09:25 compute-0 nova_compute[182935]: 2026-01-22 00:09:25.142 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1" returned: 0 in 0.219s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:25 compute-0 nova_compute[182935]: 2026-01-22 00:09:25.147 182939 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:09:25 compute-0 sshd-session[230193]: Connection closed by invalid user svn 188.166.69.60 port 34638 [preauth]
Jan 22 00:09:26 compute-0 nova_compute[182935]: 2026-01-22 00:09:26.959 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:28 compute-0 nova_compute[182935]: 2026-01-22 00:09:28.425 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:30 compute-0 ovn_controller[95047]: 2026-01-22T00:09:30Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:56:97 10.100.0.12
Jan 22 00:09:30 compute-0 ovn_controller[95047]: 2026-01-22T00:09:30Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:56:97 10.100.0.12
Jan 22 00:09:31 compute-0 podman[230213]: 2026-01-22 00:09:31.705334749 +0000 UTC m=+0.076705743 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:09:31 compute-0 nova_compute[182935]: 2026-01-22 00:09:31.960 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:33 compute-0 nova_compute[182935]: 2026-01-22 00:09:33.428 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:35 compute-0 nova_compute[182935]: 2026-01-22 00:09:35.196 182939 DEBUG nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:09:36 compute-0 podman[230237]: 2026-01-22 00:09:36.720870781 +0000 UTC m=+0.090188005 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:09:36 compute-0 nova_compute[182935]: 2026-01-22 00:09:36.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:36 compute-0 nova_compute[182935]: 2026-01-22 00:09:36.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:09:36 compute-0 nova_compute[182935]: 2026-01-22 00:09:36.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:09:36 compute-0 nova_compute[182935]: 2026-01-22 00:09:36.826 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:36 compute-0 nova_compute[182935]: 2026-01-22 00:09:36.826 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:36 compute-0 nova_compute[182935]: 2026-01-22 00:09:36.826 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:09:36 compute-0 nova_compute[182935]: 2026-01-22 00:09:36.826 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 07bda903-2298-433c-aa7d-9a50380e24f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:36 compute-0 nova_compute[182935]: 2026-01-22 00:09:36.962 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:37 compute-0 kernel: tapbc1908f8-b8 (unregistering): left promiscuous mode
Jan 22 00:09:37 compute-0 NetworkManager[55139]: <info>  [1769040577.3707] device (tapbc1908f8-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:09:37 compute-0 nova_compute[182935]: 2026-01-22 00:09:37.382 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:37 compute-0 ovn_controller[95047]: 2026-01-22T00:09:37Z|00458|binding|INFO|Releasing lport bc1908f8-b8ef-40b4-9e46-8e8664065a89 from this chassis (sb_readonly=0)
Jan 22 00:09:37 compute-0 ovn_controller[95047]: 2026-01-22T00:09:37Z|00459|binding|INFO|Setting lport bc1908f8-b8ef-40b4-9e46-8e8664065a89 down in Southbound
Jan 22 00:09:37 compute-0 ovn_controller[95047]: 2026-01-22T00:09:37Z|00460|binding|INFO|Removing iface tapbc1908f8-b8 ovn-installed in OVS
Jan 22 00:09:37 compute-0 nova_compute[182935]: 2026-01-22 00:09:37.386 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.392 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:56:97 10.100.0.12'], port_security=['fa:16:3e:45:56:97 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '07bda903-2298-433c-aa7d-9a50380e24f1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3822e32efd5647aebf2d79a3dd038bd4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e16c6ba2-0878-43d1-92b3-4f4c19054c38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b241e30a-4049-4268-8d26-df6eb9b41bf5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=bc1908f8-b8ef-40b4-9e46-8e8664065a89) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.393 104408 INFO neutron.agent.ovn.metadata.agent [-] Port bc1908f8-b8ef-40b4-9e46-8e8664065a89 in datapath d94993bc-77ac-42d2-88cb-3b0110dff29e unbound from our chassis
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.395 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d94993bc-77ac-42d2-88cb-3b0110dff29e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.396 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[27938ff4-06d4-4160-ae05-35a1d02beeb7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.397 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e namespace which is not needed anymore
Jan 22 00:09:37 compute-0 nova_compute[182935]: 2026-01-22 00:09:37.401 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:37 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000006f.scope: Deactivated successfully.
Jan 22 00:09:37 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000006f.scope: Consumed 13.983s CPU time.
Jan 22 00:09:37 compute-0 systemd-machined[154182]: Machine qemu-59-instance-0000006f terminated.
Jan 22 00:09:37 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[230126]: [NOTICE]   (230130) : haproxy version is 2.8.14-c23fe91
Jan 22 00:09:37 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[230126]: [NOTICE]   (230130) : path to executable is /usr/sbin/haproxy
Jan 22 00:09:37 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[230126]: [WARNING]  (230130) : Exiting Master process...
Jan 22 00:09:37 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[230126]: [ALERT]    (230130) : Current worker (230132) exited with code 143 (Terminated)
Jan 22 00:09:37 compute-0 neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e[230126]: [WARNING]  (230130) : All workers exited. Exiting... (0)
Jan 22 00:09:37 compute-0 systemd[1]: libpod-19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449.scope: Deactivated successfully.
Jan 22 00:09:37 compute-0 podman[230281]: 2026-01-22 00:09:37.547562728 +0000 UTC m=+0.049041484 container died 19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:09:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449-userdata-shm.mount: Deactivated successfully.
Jan 22 00:09:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-9d627493690f2e8e674c09c86cc0ced295e0cfa538249ffb5544bee379410231-merged.mount: Deactivated successfully.
Jan 22 00:09:37 compute-0 podman[230281]: 2026-01-22 00:09:37.586110509 +0000 UTC m=+0.087589245 container cleanup 19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:09:37 compute-0 systemd[1]: libpod-conmon-19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449.scope: Deactivated successfully.
Jan 22 00:09:37 compute-0 podman[230314]: 2026-01-22 00:09:37.653881136 +0000 UTC m=+0.044022369 container remove 19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.660 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fee755d0-7609-4549-9912-62603021e089]: (4, ('Thu Jan 22 12:09:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449)\n19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449\nThu Jan 22 12:09:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e (19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449)\n19a9848b31035d4f65ab462827e0410147b178b0a84edea213035751d9857449\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.663 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[57d912b7-9b74-4685-b4ae-0577f6afb12f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.664 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd94993bc-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:37 compute-0 nova_compute[182935]: 2026-01-22 00:09:37.697 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:37 compute-0 kernel: tapd94993bc-70: left promiscuous mode
Jan 22 00:09:37 compute-0 nova_compute[182935]: 2026-01-22 00:09:37.719 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.722 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[587ed150-1f11-4e7d-b934-d59d6ca9c20f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.739 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[163b4ac3-0540-44c0-9568-89e7556839b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.741 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6044b6e1-7f64-4c69-bb26-f01f1c288552]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.759 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3f190db8-fb65-4104-998c-2345b6e4bbc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505764, 'reachable_time': 18253, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230347, 'error': None, 'target': 'ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:37 compute-0 systemd[1]: run-netns-ovnmeta\x2dd94993bc\x2d77ac\x2d42d2\x2d88cb\x2d3b0110dff29e.mount: Deactivated successfully.
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.765 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d94993bc-77ac-42d2-88cb-3b0110dff29e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:09:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:37.765 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[fe09ecfd-7a09-4cea-bf1a-a9b242aa18cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.212 182939 INFO nova.virt.libvirt.driver [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Instance shutdown successfully after 13 seconds.
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.217 182939 INFO nova.virt.libvirt.driver [-] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Instance destroyed successfully.
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.218 182939 DEBUG nova.virt.libvirt.vif [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:09:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-554893725',display_name='tempest-DeleteServersTestJSON-server-554893725',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-554893725',id=111,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:09:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-4j2wmp3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:09:21Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=07bda903-2298-433c-aa7d-9a50380e24f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-173623444-network", "vif_mac": "fa:16:3e:45:56:97"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.218 182939 DEBUG nova.network.os_vif_util [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-173623444-network", "vif_mac": "fa:16:3e:45:56:97"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.219 182939 DEBUG nova.network.os_vif_util [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.219 182939 DEBUG os_vif [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.221 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.222 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc1908f8-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.223 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.225 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.228 182939 INFO os_vif [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8')
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.232 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.329 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.331 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.365 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating instance_info_cache with network_info: [{"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.385 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.385 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.386 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.386 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.386 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.413 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.414 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.414 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.414 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.416 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.418 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Copying file /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1_resize/disk to 192.168.122.101:/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.418 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1_resize/disk 192.168.122.101:/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.510 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000006f, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.631 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.633 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5652MB free_disk=73.09927749633789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.633 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:38 compute-0 nova_compute[182935]: 2026-01-22 00:09:38.633 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.123 182939 DEBUG nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-unplugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.125 182939 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.126 182939 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.126 182939 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.126 182939 DEBUG nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] No waiting events found dispatching network-vif-unplugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.127 182939 WARNING nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received unexpected event network-vif-unplugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 for instance with vm_state active and task_state resize_migrating.
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.127 182939 DEBUG nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.127 182939 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.128 182939 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.128 182939 DEBUG oslo_concurrency.lockutils [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.128 182939 DEBUG nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] No waiting events found dispatching network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.128 182939 WARNING nova.compute.manager [req-45b137db-447e-40c6-a29e-7080b126c7e5 req-7e1efb92-6681-436b-869d-bb53727ab171 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received unexpected event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 for instance with vm_state active and task_state resize_migrating.
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.138 182939 INFO nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating resource usage from migration 3d4b06f6-bb79-4378-827a-10ce205dc76f
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.193 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Migration 3d4b06f6-bb79-4378-827a-10ce205dc76f is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.194 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.194 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.220 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.255 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.255 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.258 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "scp -r /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1_resize/disk 192.168.122.101:/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk" returned: 0 in 0.840s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.259 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Copying file /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.259 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1_resize/disk.config 192.168.122.101:/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.283 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.322 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.379 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.398 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.429 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.429 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.479 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "scp -C -r /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1_resize/disk.config 192.168.122.101:/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.config" returned: 0 in 0.219s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.480 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Copying file /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.480 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1_resize/disk.info 192.168.122.101:/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.722 182939 DEBUG oslo_concurrency.processutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] CMD "scp -C -r /var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1_resize/disk.info 192.168.122.101:/var/lib/nova/instances/07bda903-2298-433c-aa7d-9a50380e24f1/disk.info" returned: 0 in 0.242s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.837 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:39 compute-0 nova_compute[182935]: 2026-01-22 00:09:39.837 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:40 compute-0 nova_compute[182935]: 2026-01-22 00:09:40.121 182939 DEBUG neutronclient.v2_0.client [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port bc1908f8-b8ef-40b4-9e46-8e8664065a89 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 00:09:40 compute-0 nova_compute[182935]: 2026-01-22 00:09:40.285 182939 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:40 compute-0 nova_compute[182935]: 2026-01-22 00:09:40.286 182939 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:40 compute-0 nova_compute[182935]: 2026-01-22 00:09:40.286 182939 DEBUG oslo_concurrency.lockutils [None req-43b71e46-6cc7-4d6e-8c59-e220051761d4 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:40 compute-0 nova_compute[182935]: 2026-01-22 00:09:40.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:41 compute-0 nova_compute[182935]: 2026-01-22 00:09:41.445 182939 DEBUG nova.compute.manager [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-changed-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:41 compute-0 nova_compute[182935]: 2026-01-22 00:09:41.446 182939 DEBUG nova.compute.manager [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Refreshing instance network info cache due to event network-changed-bc1908f8-b8ef-40b4-9e46-8e8664065a89. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:09:41 compute-0 nova_compute[182935]: 2026-01-22 00:09:41.447 182939 DEBUG oslo_concurrency.lockutils [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:41 compute-0 nova_compute[182935]: 2026-01-22 00:09:41.447 182939 DEBUG oslo_concurrency.lockutils [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:41 compute-0 nova_compute[182935]: 2026-01-22 00:09:41.447 182939 DEBUG nova.network.neutron [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Refreshing network info cache for port bc1908f8-b8ef-40b4-9e46-8e8664065a89 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:09:41 compute-0 podman[230362]: 2026-01-22 00:09:41.681146636 +0000 UTC m=+0.056135739 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git)
Jan 22 00:09:41 compute-0 podman[230363]: 2026-01-22 00:09:41.694674279 +0000 UTC m=+0.063419737 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 22 00:09:41 compute-0 nova_compute[182935]: 2026-01-22 00:09:41.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:41 compute-0 nova_compute[182935]: 2026-01-22 00:09:41.964 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:43 compute-0 nova_compute[182935]: 2026-01-22 00:09:43.120 182939 DEBUG nova.network.neutron [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updated VIF entry in instance network info cache for port bc1908f8-b8ef-40b4-9e46-8e8664065a89. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:09:43 compute-0 nova_compute[182935]: 2026-01-22 00:09:43.121 182939 DEBUG nova.network.neutron [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating instance_info_cache with network_info: [{"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:43 compute-0 nova_compute[182935]: 2026-01-22 00:09:43.141 182939 DEBUG oslo_concurrency.lockutils [req-578c5dfe-8ef9-494f-80f3-a95f28f1743d req-7b26f3f8-06a4-4a19-b075-073b3fb8657a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:43 compute-0 nova_compute[182935]: 2026-01-22 00:09:43.224 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:43 compute-0 nova_compute[182935]: 2026-01-22 00:09:43.830 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Acquiring lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:43 compute-0 nova_compute[182935]: 2026-01-22 00:09:43.831 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:43 compute-0 nova_compute[182935]: 2026-01-22 00:09:43.850 182939 DEBUG nova.compute.manager [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:09:43 compute-0 nova_compute[182935]: 2026-01-22 00:09:43.968 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:43 compute-0 nova_compute[182935]: 2026-01-22 00:09:43.969 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:43 compute-0 nova_compute[182935]: 2026-01-22 00:09:43.976 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:09:43 compute-0 nova_compute[182935]: 2026-01-22 00:09:43.976 182939 INFO nova.compute.claims [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.128 182939 DEBUG nova.compute.provider_tree [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.143 182939 DEBUG nova.scheduler.client.report [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.169 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.170 182939 DEBUG nova.compute.manager [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.233 182939 DEBUG nova.compute.manager [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.234 182939 DEBUG nova.network.neutron [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.255 182939 INFO nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.301 182939 DEBUG nova.compute.manager [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.413 182939 DEBUG nova.policy [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7ee17110d4744f99aaa3a6e7f5704bec', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '99c50ccfe643400aa6cbd9e61e8ac16b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.443 182939 DEBUG nova.compute.manager [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.446 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.446 182939 INFO nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Creating image(s)
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.447 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Acquiring lock "/var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.448 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "/var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.449 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "/var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.472 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.538 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.539 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.539 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.554 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.613 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.614 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.650 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.651 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.652 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.710 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.712 182939 DEBUG nova.virt.disk.api [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Checking if we can resize image /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.712 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.770 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.771 182939 DEBUG nova.virt.disk.api [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Cannot resize image /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.771 182939 DEBUG nova.objects.instance [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lazy-loading 'migration_context' on Instance uuid fa6f75b7-e928-4d1c-8867-417b02ad70ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.794 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.794 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Ensure instance console log exists: /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.795 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.795 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:44 compute-0 nova_compute[182935]: 2026-01-22 00:09:44.795 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:45 compute-0 nova_compute[182935]: 2026-01-22 00:09:45.080 182939 DEBUG nova.network.neutron [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Successfully created port: 3c6ae08f-a647-4ee0-be98-10c12c2d1911 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.103 182939 DEBUG nova.compute.manager [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.104 182939 DEBUG oslo_concurrency.lockutils [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.104 182939 DEBUG oslo_concurrency.lockutils [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.104 182939 DEBUG oslo_concurrency.lockutils [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.104 182939 DEBUG nova.compute.manager [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] No waiting events found dispatching network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.105 182939 WARNING nova.compute.manager [req-846914c9-26f1-4c62-aa52-573ed99ea084 req-dd0e9907-c25b-471b-b437-a94b50a90f4a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received unexpected event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 for instance with vm_state resized and task_state None.
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.218 182939 DEBUG nova.network.neutron [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Successfully updated port: 3c6ae08f-a647-4ee0-be98-10c12c2d1911 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.246 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Acquiring lock "refresh_cache-fa6f75b7-e928-4d1c-8867-417b02ad70ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.246 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Acquired lock "refresh_cache-fa6f75b7-e928-4d1c-8867-417b02ad70ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.246 182939 DEBUG nova.network.neutron [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.509 182939 DEBUG nova.network.neutron [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:09:46 compute-0 nova_compute[182935]: 2026-01-22 00:09:46.966 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:48 compute-0 nova_compute[182935]: 2026-01-22 00:09:48.225 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:49 compute-0 nova_compute[182935]: 2026-01-22 00:09:49.219 182939 DEBUG nova.compute.manager [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Received event network-changed-3c6ae08f-a647-4ee0-be98-10c12c2d1911 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:49 compute-0 nova_compute[182935]: 2026-01-22 00:09:49.219 182939 DEBUG nova.compute.manager [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Refreshing instance network info cache due to event network-changed-3c6ae08f-a647-4ee0-be98-10c12c2d1911. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:09:49 compute-0 nova_compute[182935]: 2026-01-22 00:09:49.220 182939 DEBUG oslo_concurrency.lockutils [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-fa6f75b7-e928-4d1c-8867-417b02ad70ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:50 compute-0 nova_compute[182935]: 2026-01-22 00:09:50.741 182939 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:50 compute-0 nova_compute[182935]: 2026-01-22 00:09:50.742 182939 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:50 compute-0 nova_compute[182935]: 2026-01-22 00:09:50.742 182939 DEBUG nova.compute.manager [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Going to confirm migration 16 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 22 00:09:50 compute-0 nova_compute[182935]: 2026-01-22 00:09:50.786 182939 DEBUG nova.objects.instance [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'info_cache' on Instance uuid 07bda903-2298-433c-aa7d-9a50380e24f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.194 182939 DEBUG nova.network.neutron [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Updating instance_info_cache with network_info: [{"id": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "address": "fa:16:3e:bb:44:1e", "network": {"id": "74f5a9fd-74b2-4d9e-9d16-9485f59bc51c", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-90331527-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99c50ccfe643400aa6cbd9e61e8ac16b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c6ae08f-a6", "ovs_interfaceid": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.291 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Releasing lock "refresh_cache-fa6f75b7-e928-4d1c-8867-417b02ad70ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.292 182939 DEBUG nova.compute.manager [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Instance network_info: |[{"id": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "address": "fa:16:3e:bb:44:1e", "network": {"id": "74f5a9fd-74b2-4d9e-9d16-9485f59bc51c", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-90331527-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99c50ccfe643400aa6cbd9e61e8ac16b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c6ae08f-a6", "ovs_interfaceid": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.292 182939 DEBUG oslo_concurrency.lockutils [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-fa6f75b7-e928-4d1c-8867-417b02ad70ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.292 182939 DEBUG nova.network.neutron [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Refreshing network info cache for port 3c6ae08f-a647-4ee0-be98-10c12c2d1911 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.296 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Start _get_guest_xml network_info=[{"id": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "address": "fa:16:3e:bb:44:1e", "network": {"id": "74f5a9fd-74b2-4d9e-9d16-9485f59bc51c", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-90331527-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99c50ccfe643400aa6cbd9e61e8ac16b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c6ae08f-a6", "ovs_interfaceid": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.301 182939 WARNING nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.307 182939 DEBUG nova.virt.libvirt.host [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.308 182939 DEBUG nova.virt.libvirt.host [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.315 182939 DEBUG nova.virt.libvirt.host [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.315 182939 DEBUG nova.virt.libvirt.host [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.317 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.317 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.317 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.318 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.318 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.318 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.319 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.319 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.319 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.319 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.320 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.320 182939 DEBUG nova.virt.hardware [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.324 182939 DEBUG nova.virt.libvirt.vif [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:09:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1331078077',display_name='tempest-ServerAddressesTestJSON-server-1331078077',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1331078077',id=113,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='99c50ccfe643400aa6cbd9e61e8ac16b',ramdisk_id='',reservation_id='r-dhb08rn9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1063207139',owner_user_name='tempest-ServerAddress
esTestJSON-1063207139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:09:44Z,user_data=None,user_id='7ee17110d4744f99aaa3a6e7f5704bec',uuid=fa6f75b7-e928-4d1c-8867-417b02ad70ec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "address": "fa:16:3e:bb:44:1e", "network": {"id": "74f5a9fd-74b2-4d9e-9d16-9485f59bc51c", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-90331527-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99c50ccfe643400aa6cbd9e61e8ac16b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c6ae08f-a6", "ovs_interfaceid": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.325 182939 DEBUG nova.network.os_vif_util [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Converting VIF {"id": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "address": "fa:16:3e:bb:44:1e", "network": {"id": "74f5a9fd-74b2-4d9e-9d16-9485f59bc51c", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-90331527-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99c50ccfe643400aa6cbd9e61e8ac16b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c6ae08f-a6", "ovs_interfaceid": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.325 182939 DEBUG nova.network.os_vif_util [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:44:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c6ae08f-a647-4ee0-be98-10c12c2d1911,network=Network(74f5a9fd-74b2-4d9e-9d16-9485f59bc51c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c6ae08f-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.327 182939 DEBUG nova.objects.instance [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lazy-loading 'pci_devices' on Instance uuid fa6f75b7-e928-4d1c-8867-417b02ad70ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.351 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:09:51 compute-0 nova_compute[182935]:   <uuid>fa6f75b7-e928-4d1c-8867-417b02ad70ec</uuid>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   <name>instance-00000071</name>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerAddressesTestJSON-server-1331078077</nova:name>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:09:51</nova:creationTime>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:09:51 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:09:51 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:09:51 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:09:51 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:09:51 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:09:51 compute-0 nova_compute[182935]:         <nova:user uuid="7ee17110d4744f99aaa3a6e7f5704bec">tempest-ServerAddressesTestJSON-1063207139-project-member</nova:user>
Jan 22 00:09:51 compute-0 nova_compute[182935]:         <nova:project uuid="99c50ccfe643400aa6cbd9e61e8ac16b">tempest-ServerAddressesTestJSON-1063207139</nova:project>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:09:51 compute-0 nova_compute[182935]:         <nova:port uuid="3c6ae08f-a647-4ee0-be98-10c12c2d1911">
Jan 22 00:09:51 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <system>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <entry name="serial">fa6f75b7-e928-4d1c-8867-417b02ad70ec</entry>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <entry name="uuid">fa6f75b7-e928-4d1c-8867-417b02ad70ec</entry>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     </system>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   <os>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   </os>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   <features>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   </features>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk.config"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:bb:44:1e"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <target dev="tap3c6ae08f-a6"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/console.log" append="off"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <video>
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     </video>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:09:51 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:09:51 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:09:51 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:09:51 compute-0 nova_compute[182935]: </domain>
Jan 22 00:09:51 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.352 182939 DEBUG nova.compute.manager [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Preparing to wait for external event network-vif-plugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.353 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Acquiring lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.353 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.353 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.354 182939 DEBUG nova.virt.libvirt.vif [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:09:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1331078077',display_name='tempest-ServerAddressesTestJSON-server-1331078077',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1331078077',id=113,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='99c50ccfe643400aa6cbd9e61e8ac16b',ramdisk_id='',reservation_id='r-dhb08rn9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1063207139',owner_user_name='tempest-ServerAddressesTestJSON-1063207139-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:09:44Z,user_data=None,user_id='7ee17110d4744f99aaa3a6e7f5704bec',uuid=fa6f75b7-e928-4d1c-8867-417b02ad70ec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "address": "fa:16:3e:bb:44:1e", "network": {"id": "74f5a9fd-74b2-4d9e-9d16-9485f59bc51c", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-90331527-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99c50ccfe643400aa6cbd9e61e8ac16b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c6ae08f-a6", "ovs_interfaceid": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.354 182939 DEBUG nova.network.os_vif_util [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Converting VIF {"id": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "address": "fa:16:3e:bb:44:1e", "network": {"id": "74f5a9fd-74b2-4d9e-9d16-9485f59bc51c", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-90331527-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99c50ccfe643400aa6cbd9e61e8ac16b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c6ae08f-a6", "ovs_interfaceid": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.355 182939 DEBUG nova.network.os_vif_util [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:44:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c6ae08f-a647-4ee0-be98-10c12c2d1911,network=Network(74f5a9fd-74b2-4d9e-9d16-9485f59bc51c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c6ae08f-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.355 182939 DEBUG os_vif [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:44:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c6ae08f-a647-4ee0-be98-10c12c2d1911,network=Network(74f5a9fd-74b2-4d9e-9d16-9485f59bc51c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c6ae08f-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.356 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.356 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.356 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.359 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.360 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c6ae08f-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.360 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3c6ae08f-a6, col_values=(('external_ids', {'iface-id': '3c6ae08f-a647-4ee0-be98-10c12c2d1911', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:44:1e', 'vm-uuid': 'fa6f75b7-e928-4d1c-8867-417b02ad70ec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.364 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:51 compute-0 NetworkManager[55139]: <info>  [1769040591.3659] manager: (tap3c6ae08f-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.367 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.370 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.371 182939 INFO os_vif [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:44:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c6ae08f-a647-4ee0-be98-10c12c2d1911,network=Network(74f5a9fd-74b2-4d9e-9d16-9485f59bc51c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c6ae08f-a6')
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.705 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.705 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.705 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] No VIF found with MAC fa:16:3e:bb:44:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.706 182939 INFO nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Using config drive
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.787 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:51 compute-0 nova_compute[182935]: 2026-01-22 00:09:51.968 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:52 compute-0 nova_compute[182935]: 2026-01-22 00:09:52.634 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040577.6339188, 07bda903-2298-433c-aa7d-9a50380e24f1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:52 compute-0 nova_compute[182935]: 2026-01-22 00:09:52.635 182939 INFO nova.compute.manager [-] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] VM Stopped (Lifecycle Event)
Jan 22 00:09:52 compute-0 nova_compute[182935]: 2026-01-22 00:09:52.693 182939 DEBUG nova.compute.manager [None req-92483536-2d8e-473d-826b-e8dcfc0f5028 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:52 compute-0 nova_compute[182935]: 2026-01-22 00:09:52.697 182939 DEBUG nova.compute.manager [None req-92483536-2d8e-473d-826b-e8dcfc0f5028 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:09:52 compute-0 nova_compute[182935]: 2026-01-22 00:09:52.774 182939 INFO nova.compute.manager [None req-92483536-2d8e-473d-826b-e8dcfc0f5028 - - - - - -] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.143 182939 DEBUG neutronclient.v2_0.client [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port bc1908f8-b8ef-40b4-9e46-8e8664065a89 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.144 182939 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.145 182939 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquired lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.146 182939 DEBUG nova.network.neutron [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.267 182939 INFO nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Creating config drive at /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk.config
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.271 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi0uuopmh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.398 182939 DEBUG oslo_concurrency.processutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi0uuopmh" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:09:53 compute-0 kernel: tap3c6ae08f-a6: entered promiscuous mode
Jan 22 00:09:53 compute-0 NetworkManager[55139]: <info>  [1769040593.4592] manager: (tap3c6ae08f-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/216)
Jan 22 00:09:53 compute-0 ovn_controller[95047]: 2026-01-22T00:09:53Z|00461|binding|INFO|Claiming lport 3c6ae08f-a647-4ee0-be98-10c12c2d1911 for this chassis.
Jan 22 00:09:53 compute-0 ovn_controller[95047]: 2026-01-22T00:09:53Z|00462|binding|INFO|3c6ae08f-a647-4ee0-be98-10c12c2d1911: Claiming fa:16:3e:bb:44:1e 10.100.0.8
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.461 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.464 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.487 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:44:1e 10.100.0.8'], port_security=['fa:16:3e:bb:44:1e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'fa6f75b7-e928-4d1c-8867-417b02ad70ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '99c50ccfe643400aa6cbd9e61e8ac16b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7480f7c4-d794-41bb-bf8d-5e324cb0e689', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=735ce6b4-4f6c-49b8-9876-1e1ba8ae2cd6, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=3c6ae08f-a647-4ee0-be98-10c12c2d1911) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.488 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 3c6ae08f-a647-4ee0-be98-10c12c2d1911 in datapath 74f5a9fd-74b2-4d9e-9d16-9485f59bc51c bound to our chassis
Jan 22 00:09:53 compute-0 systemd-udevd[230431]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.489 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74f5a9fd-74b2-4d9e-9d16-9485f59bc51c
Jan 22 00:09:53 compute-0 NetworkManager[55139]: <info>  [1769040593.5006] device (tap3c6ae08f-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:09:53 compute-0 NetworkManager[55139]: <info>  [1769040593.5013] device (tap3c6ae08f-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.502 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c62cbd7e-060f-4ea2-93d1-12fe654a34b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.503 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap74f5a9fd-71 in ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.506 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap74f5a9fd-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.506 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e902a589-6ea0-43e7-846a-fb1448dc2163]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.507 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[78e7de5c-0f9b-4335-9cec-22bd4fad63ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 systemd-machined[154182]: New machine qemu-60-instance-00000071.
Jan 22 00:09:53 compute-0 systemd[1]: Started Virtual Machine qemu-60-instance-00000071.
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.518 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[67c3da37-f222-41d0-84aa-9a0f870eba82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.517 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:53 compute-0 ovn_controller[95047]: 2026-01-22T00:09:53Z|00463|binding|INFO|Setting lport 3c6ae08f-a647-4ee0-be98-10c12c2d1911 ovn-installed in OVS
Jan 22 00:09:53 compute-0 ovn_controller[95047]: 2026-01-22T00:09:53Z|00464|binding|INFO|Setting lport 3c6ae08f-a647-4ee0-be98-10c12c2d1911 up in Southbound
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.529 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.540 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[415d9a03-0cf3-416a-930c-24c69931fd2f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.566 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[1d6fad7c-c70c-4311-8b1d-f1697c781d9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.571 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e80a6269-c5d2-4917-b8d2-80683553fa19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 systemd-udevd[230436]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:09:53 compute-0 NetworkManager[55139]: <info>  [1769040593.5732] manager: (tap74f5a9fd-70): new Veth device (/org/freedesktop/NetworkManager/Devices/217)
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.606 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[237eb523-36ff-473a-b4b0-12292daad174]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.609 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[4c49fb3f-5d05-4a25-b59f-14c511425eef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 NetworkManager[55139]: <info>  [1769040593.6300] device (tap74f5a9fd-70): carrier: link connected
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.640 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7254f062-4777-40be-a5bc-a6737c50afef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.661 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fd3818a9-590b-4566-b3eb-7f958bad49af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74f5a9fd-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:c5:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 148], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509636, 'reachable_time': 18295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230466, 'error': None, 'target': 'ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.676 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1b8c8402-34a6-4716-b65b-ca5fc0fedc15]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed2:c5b1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 509636, 'tstamp': 509636}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230467, 'error': None, 'target': 'ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.694 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b58100cd-508c-40e7-b69b-10f787cd70b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74f5a9fd-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:c5:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 148], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509636, 'reachable_time': 18295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230468, 'error': None, 'target': 'ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.729 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[110c6c11-3da3-43d7-b9c2-6d4341f6b5d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.786 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f52fc016-1627-4356-b816-12c3ea62c344]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.788 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74f5a9fd-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.789 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.790 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74f5a9fd-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.835 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:53 compute-0 NetworkManager[55139]: <info>  [1769040593.8364] manager: (tap74f5a9fd-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Jan 22 00:09:53 compute-0 kernel: tap74f5a9fd-70: entered promiscuous mode
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.838 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74f5a9fd-70, col_values=(('external_ids', {'iface-id': '5500ba7b-0c3a-4979-b561-04e84533c42e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.839 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:53 compute-0 ovn_controller[95047]: 2026-01-22T00:09:53Z|00465|binding|INFO|Releasing lport 5500ba7b-0c3a-4979-b561-04e84533c42e from this chassis (sb_readonly=0)
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.841 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.841 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74f5a9fd-74b2-4d9e-9d16-9485f59bc51c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74f5a9fd-74b2-4d9e-9d16-9485f59bc51c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.842 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b6bed978-134e-47af-bb7e-8cbc6fe44672]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.843 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/74f5a9fd-74b2-4d9e-9d16-9485f59bc51c.pid.haproxy
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 74f5a9fd-74b2-4d9e-9d16-9485f59bc51c
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:09:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:09:53.844 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c', 'env', 'PROCESS_TAG=haproxy-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/74f5a9fd-74b2-4d9e-9d16-9485f59bc51c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.851 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.880 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040593.880017, fa6f75b7-e928-4d1c-8867-417b02ad70ec => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.881 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] VM Started (Lifecycle Event)
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.990 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.996 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040593.8805335, fa6f75b7-e928-4d1c-8867-417b02ad70ec => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:53 compute-0 nova_compute[182935]: 2026-01-22 00:09:53.996 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] VM Paused (Lifecycle Event)
Jan 22 00:09:54 compute-0 nova_compute[182935]: 2026-01-22 00:09:54.092 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:54 compute-0 nova_compute[182935]: 2026-01-22 00:09:54.096 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:09:54 compute-0 nova_compute[182935]: 2026-01-22 00:09:54.149 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:09:54 compute-0 podman[230505]: 2026-01-22 00:09:54.222583882 +0000 UTC m=+0.049907764 container create 1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:09:54 compute-0 systemd[1]: Started libpod-conmon-1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d.scope.
Jan 22 00:09:54 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:09:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c56c83722794798eacbcaab7aa3306195d7715d4193baab9a953dd1eb87bff19/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:09:54 compute-0 podman[230505]: 2026-01-22 00:09:54.195967707 +0000 UTC m=+0.023291619 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:09:54 compute-0 podman[230505]: 2026-01-22 00:09:54.304064656 +0000 UTC m=+0.131388568 container init 1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 00:09:54 compute-0 podman[230505]: 2026-01-22 00:09:54.308729683 +0000 UTC m=+0.136053565 container start 1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 00:09:54 compute-0 neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c[230520]: [NOTICE]   (230524) : New worker (230526) forked
Jan 22 00:09:54 compute-0 neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c[230520]: [NOTICE]   (230524) : Loading success.
Jan 22 00:09:55 compute-0 podman[230536]: 2026-01-22 00:09:55.686406615 +0000 UTC m=+0.054697846 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:09:55 compute-0 podman[230535]: 2026-01-22 00:09:55.714608317 +0000 UTC m=+0.086701096 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.355 182939 DEBUG nova.compute.manager [req-e5cd2603-98a1-4823-9a95-ba90e2eeeaa9 req-9444989e-a147-4630-a6ff-83bb821ab77d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Received event network-vif-plugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.356 182939 DEBUG oslo_concurrency.lockutils [req-e5cd2603-98a1-4823-9a95-ba90e2eeeaa9 req-9444989e-a147-4630-a6ff-83bb821ab77d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.356 182939 DEBUG oslo_concurrency.lockutils [req-e5cd2603-98a1-4823-9a95-ba90e2eeeaa9 req-9444989e-a147-4630-a6ff-83bb821ab77d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.356 182939 DEBUG oslo_concurrency.lockutils [req-e5cd2603-98a1-4823-9a95-ba90e2eeeaa9 req-9444989e-a147-4630-a6ff-83bb821ab77d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.357 182939 DEBUG nova.compute.manager [req-e5cd2603-98a1-4823-9a95-ba90e2eeeaa9 req-9444989e-a147-4630-a6ff-83bb821ab77d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Processing event network-vif-plugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.358 182939 DEBUG nova.compute.manager [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.362 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040596.3619764, fa6f75b7-e928-4d1c-8867-417b02ad70ec => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.362 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] VM Resumed (Lifecycle Event)
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.407 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.408 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.412 182939 INFO nova.virt.libvirt.driver [-] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Instance spawned successfully.
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.413 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.426 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.431 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.467 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.468 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.468 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.469 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.469 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.469 182939 DEBUG nova.virt.libvirt.driver [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.531 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.695 182939 INFO nova.compute.manager [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Took 12.25 seconds to spawn the instance on the hypervisor.
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.696 182939 DEBUG nova.compute.manager [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.875 182939 INFO nova.compute.manager [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Took 12.94 seconds to build instance.
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.971 182939 DEBUG oslo_concurrency.lockutils [None req-3b96d547-5fd4-4517-aa08-be5dcf6e48b5 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:56 compute-0 nova_compute[182935]: 2026-01-22 00:09:56.972 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:57 compute-0 nova_compute[182935]: 2026-01-22 00:09:57.513 182939 DEBUG nova.network.neutron [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Updated VIF entry in instance network info cache for port 3c6ae08f-a647-4ee0-be98-10c12c2d1911. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:09:57 compute-0 nova_compute[182935]: 2026-01-22 00:09:57.513 182939 DEBUG nova.network.neutron [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Updating instance_info_cache with network_info: [{"id": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "address": "fa:16:3e:bb:44:1e", "network": {"id": "74f5a9fd-74b2-4d9e-9d16-9485f59bc51c", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-90331527-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99c50ccfe643400aa6cbd9e61e8ac16b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c6ae08f-a6", "ovs_interfaceid": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:57 compute-0 nova_compute[182935]: 2026-01-22 00:09:57.533 182939 DEBUG oslo_concurrency.lockutils [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-fa6f75b7-e928-4d1c-8867-417b02ad70ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:57 compute-0 nova_compute[182935]: 2026-01-22 00:09:57.534 182939 DEBUG nova.compute.manager [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:57 compute-0 nova_compute[182935]: 2026-01-22 00:09:57.534 182939 DEBUG oslo_concurrency.lockutils [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:57 compute-0 nova_compute[182935]: 2026-01-22 00:09:57.535 182939 DEBUG oslo_concurrency.lockutils [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:57 compute-0 nova_compute[182935]: 2026-01-22 00:09:57.535 182939 DEBUG oslo_concurrency.lockutils [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:57 compute-0 nova_compute[182935]: 2026-01-22 00:09:57.535 182939 DEBUG nova.compute.manager [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] No waiting events found dispatching network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:57 compute-0 nova_compute[182935]: 2026-01-22 00:09:57.535 182939 WARNING nova.compute.manager [req-7555d67c-4874-42d9-8410-6c5c5d368ae3 req-0a734f6d-743f-45e2-990c-7f76cad0f514 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Received unexpected event network-vif-plugged-bc1908f8-b8ef-40b4-9e46-8e8664065a89 for instance with vm_state resized and task_state None.
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.230 182939 DEBUG nova.network.neutron [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] [instance: 07bda903-2298-433c-aa7d-9a50380e24f1] Updating instance_info_cache with network_info: [{"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.271 182939 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Releasing lock "refresh_cache-07bda903-2298-433c-aa7d-9a50380e24f1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.272 182939 DEBUG nova.objects.instance [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lazy-loading 'migration_context' on Instance uuid 07bda903-2298-433c-aa7d-9a50380e24f1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.321 182939 DEBUG nova.virt.libvirt.vif [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:09:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-554893725',display_name='tempest-DeleteServersTestJSON-server-554893725',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-554893725',id=111,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:09:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3822e32efd5647aebf2d79a3dd038bd4',ramdisk_id='',reservation_id='r-4j2wmp3v',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-2033458913',owner_user_name='tempest-DeleteServersTestJSON-2033458913-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:09:50Z,user_data=None,user_id='74ad1bf274924c52af96aa4c6d431410',uuid=07bda903-2298-433c-aa7d-9a50380e24f1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.322 182939 DEBUG nova.network.os_vif_util [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converting VIF {"id": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "address": "fa:16:3e:45:56:97", "network": {"id": "d94993bc-77ac-42d2-88cb-3b0110dff29e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-173623444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3822e32efd5647aebf2d79a3dd038bd4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc1908f8-b8", "ovs_interfaceid": "bc1908f8-b8ef-40b4-9e46-8e8664065a89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.323 182939 DEBUG nova.network.os_vif_util [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.323 182939 DEBUG os_vif [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.327 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.328 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc1908f8-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.329 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.333 182939 INFO os_vif [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:56:97,bridge_name='br-int',has_traffic_filtering=True,id=bc1908f8-b8ef-40b4-9e46-8e8664065a89,network=Network(d94993bc-77ac-42d2-88cb-3b0110dff29e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc1908f8-b8')
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.333 182939 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.333 182939 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.456 182939 DEBUG nova.compute.manager [req-238a01a5-e32e-4aa6-8576-b12944d83785 req-d03d2880-9859-4d61-8d37-84757e28d2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Received event network-vif-plugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.457 182939 DEBUG oslo_concurrency.lockutils [req-238a01a5-e32e-4aa6-8576-b12944d83785 req-d03d2880-9859-4d61-8d37-84757e28d2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.457 182939 DEBUG oslo_concurrency.lockutils [req-238a01a5-e32e-4aa6-8576-b12944d83785 req-d03d2880-9859-4d61-8d37-84757e28d2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.457 182939 DEBUG oslo_concurrency.lockutils [req-238a01a5-e32e-4aa6-8576-b12944d83785 req-d03d2880-9859-4d61-8d37-84757e28d2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.457 182939 DEBUG nova.compute.manager [req-238a01a5-e32e-4aa6-8576-b12944d83785 req-d03d2880-9859-4d61-8d37-84757e28d2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] No waiting events found dispatching network-vif-plugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.457 182939 WARNING nova.compute.manager [req-238a01a5-e32e-4aa6-8576-b12944d83785 req-d03d2880-9859-4d61-8d37-84757e28d2fc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Received unexpected event network-vif-plugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 for instance with vm_state active and task_state None.
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.494 182939 DEBUG nova.compute.provider_tree [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.534 182939 DEBUG nova.scheduler.client.report [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:09:59 compute-0 nova_compute[182935]: 2026-01-22 00:09:59.663 182939 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:00 compute-0 nova_compute[182935]: 2026-01-22 00:10:00.005 182939 INFO nova.scheduler.client.report [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Deleted allocation for migration 3d4b06f6-bb79-4378-827a-10ce205dc76f
Jan 22 00:10:00 compute-0 nova_compute[182935]: 2026-01-22 00:10:00.163 182939 DEBUG oslo_concurrency.lockutils [None req-b3f8ea34-bb90-486b-a526-44ffe333950e 74ad1bf274924c52af96aa4c6d431410 3822e32efd5647aebf2d79a3dd038bd4 - - default default] Lock "07bda903-2298-433c-aa7d-9a50380e24f1" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 9.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.261 182939 DEBUG oslo_concurrency.lockutils [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Acquiring lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.261 182939 DEBUG oslo_concurrency.lockutils [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.262 182939 DEBUG oslo_concurrency.lockutils [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Acquiring lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.262 182939 DEBUG oslo_concurrency.lockutils [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.262 182939 DEBUG oslo_concurrency.lockutils [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.275 182939 INFO nova.compute.manager [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Terminating instance
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.292 182939 DEBUG nova.compute.manager [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:10:01 compute-0 kernel: tap3c6ae08f-a6 (unregistering): left promiscuous mode
Jan 22 00:10:01 compute-0 NetworkManager[55139]: <info>  [1769040601.3152] device (tap3c6ae08f-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:10:01 compute-0 ovn_controller[95047]: 2026-01-22T00:10:01Z|00466|binding|INFO|Releasing lport 3c6ae08f-a647-4ee0-be98-10c12c2d1911 from this chassis (sb_readonly=0)
Jan 22 00:10:01 compute-0 ovn_controller[95047]: 2026-01-22T00:10:01Z|00467|binding|INFO|Setting lport 3c6ae08f-a647-4ee0-be98-10c12c2d1911 down in Southbound
Jan 22 00:10:01 compute-0 ovn_controller[95047]: 2026-01-22T00:10:01Z|00468|binding|INFO|Removing iface tap3c6ae08f-a6 ovn-installed in OVS
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.326 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.329 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.335 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:44:1e 10.100.0.8'], port_security=['fa:16:3e:bb:44:1e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'fa6f75b7-e928-4d1c-8867-417b02ad70ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '99c50ccfe643400aa6cbd9e61e8ac16b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7480f7c4-d794-41bb-bf8d-5e324cb0e689', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=735ce6b4-4f6c-49b8-9876-1e1ba8ae2cd6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=3c6ae08f-a647-4ee0-be98-10c12c2d1911) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.338 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 3c6ae08f-a647-4ee0-be98-10c12c2d1911 in datapath 74f5a9fd-74b2-4d9e-9d16-9485f59bc51c unbound from our chassis
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.339 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74f5a9fd-74b2-4d9e-9d16-9485f59bc51c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.341 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[234b0133-98c7-4b56-ab20-59f37c8f42d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.342 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c namespace which is not needed anymore
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.343 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000071.scope: Deactivated successfully.
Jan 22 00:10:01 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000071.scope: Consumed 5.347s CPU time.
Jan 22 00:10:01 compute-0 systemd-machined[154182]: Machine qemu-60-instance-00000071 terminated.
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.409 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-0 neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c[230520]: [NOTICE]   (230524) : haproxy version is 2.8.14-c23fe91
Jan 22 00:10:01 compute-0 neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c[230520]: [NOTICE]   (230524) : path to executable is /usr/sbin/haproxy
Jan 22 00:10:01 compute-0 neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c[230520]: [WARNING]  (230524) : Exiting Master process...
Jan 22 00:10:01 compute-0 neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c[230520]: [ALERT]    (230524) : Current worker (230526) exited with code 143 (Terminated)
Jan 22 00:10:01 compute-0 neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c[230520]: [WARNING]  (230524) : All workers exited. Exiting... (0)
Jan 22 00:10:01 compute-0 systemd[1]: libpod-1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d.scope: Deactivated successfully.
Jan 22 00:10:01 compute-0 podman[230610]: 2026-01-22 00:10:01.491712811 +0000 UTC m=+0.057189133 container died 1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.514 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.518 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.554 182939 INFO nova.virt.libvirt.driver [-] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Instance destroyed successfully.
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.555 182939 DEBUG nova.objects.instance [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lazy-loading 'resources' on Instance uuid fa6f75b7-e928-4d1c-8867-417b02ad70ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:10:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-c56c83722794798eacbcaab7aa3306195d7715d4193baab9a953dd1eb87bff19-merged.mount: Deactivated successfully.
Jan 22 00:10:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d-userdata-shm.mount: Deactivated successfully.
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.586 182939 DEBUG nova.virt.libvirt.vif [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:09:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1331078077',display_name='tempest-ServerAddressesTestJSON-server-1331078077',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1331078077',id=113,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:09:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='99c50ccfe643400aa6cbd9e61e8ac16b',ramdisk_id='',reservation_id='r-dhb08rn9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesTestJSON-1063207139',owner_user_name='tempest-ServerAddressesTestJSON-1063207139-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:09:56Z,user_data=None,user_id='7ee17110d4744f99aaa3a6e7f5704bec',uuid=fa6f75b7-e928-4d1c-8867-417b02ad70ec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "address": "fa:16:3e:bb:44:1e", "network": {"id": "74f5a9fd-74b2-4d9e-9d16-9485f59bc51c", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-90331527-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99c50ccfe643400aa6cbd9e61e8ac16b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c6ae08f-a6", "ovs_interfaceid": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.587 182939 DEBUG nova.network.os_vif_util [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Converting VIF {"id": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "address": "fa:16:3e:bb:44:1e", "network": {"id": "74f5a9fd-74b2-4d9e-9d16-9485f59bc51c", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-90331527-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99c50ccfe643400aa6cbd9e61e8ac16b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c6ae08f-a6", "ovs_interfaceid": "3c6ae08f-a647-4ee0-be98-10c12c2d1911", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.587 182939 DEBUG nova.network.os_vif_util [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:44:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c6ae08f-a647-4ee0-be98-10c12c2d1911,network=Network(74f5a9fd-74b2-4d9e-9d16-9485f59bc51c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c6ae08f-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.588 182939 DEBUG os_vif [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:44:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c6ae08f-a647-4ee0-be98-10c12c2d1911,network=Network(74f5a9fd-74b2-4d9e-9d16-9485f59bc51c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c6ae08f-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.589 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.589 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c6ae08f-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.590 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.593 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.595 182939 INFO os_vif [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:44:1e,bridge_name='br-int',has_traffic_filtering=True,id=3c6ae08f-a647-4ee0-be98-10c12c2d1911,network=Network(74f5a9fd-74b2-4d9e-9d16-9485f59bc51c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c6ae08f-a6')
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.595 182939 INFO nova.virt.libvirt.driver [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Deleting instance files /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec_del
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.596 182939 INFO nova.virt.libvirt.driver [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Deletion of /var/lib/nova/instances/fa6f75b7-e928-4d1c-8867-417b02ad70ec_del complete
Jan 22 00:10:01 compute-0 podman[230610]: 2026-01-22 00:10:01.61712898 +0000 UTC m=+0.182605302 container cleanup 1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 00:10:01 compute-0 systemd[1]: libpod-conmon-1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d.scope: Deactivated successfully.
Jan 22 00:10:01 compute-0 podman[230657]: 2026-01-22 00:10:01.705898672 +0000 UTC m=+0.066602260 container remove 1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.710 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a0011645-aca1-40ee-8981-46a05f53dbf2]: (4, ('Thu Jan 22 12:10:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c (1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d)\n1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d\nThu Jan 22 12:10:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c (1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d)\n1138a54828e5832f86376123526c0d8ff95bffa119dd847ea5c2f4949c4e055d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.712 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7d8138b7-8e97-49b6-9fd2-749c99555b1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.713 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74f5a9fd-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.715 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-0 kernel: tap74f5a9fd-70: left promiscuous mode
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.726 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.728 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1e691c3e-f14e-4b86-998c-7963468e3335]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.739 182939 INFO nova.compute.manager [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.741 182939 DEBUG oslo.service.loopingcall [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.740 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c37f82ec-8948-4ea4-9683-53b763a32889]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.741 182939 DEBUG nova.compute.manager [-] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.741 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[945156d5-95df-4201-9597-aa0ae522735f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.741 182939 DEBUG nova.network.neutron [-] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.758 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d201fd46-de19-4616-930c-d3024e705be4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509629, 'reachable_time': 22796, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230674, 'error': None, 'target': 'ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.761 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-74f5a9fd-74b2-4d9e-9d16-9485f59bc51c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:10:01 compute-0 systemd[1]: run-netns-ovnmeta\x2d74f5a9fd\x2d74b2\x2d4d9e\x2d9d16\x2d9485f59bc51c.mount: Deactivated successfully.
Jan 22 00:10:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:01.762 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[27258851-110b-4ea8-ad81-feb75c82b1dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:10:01 compute-0 podman[230671]: 2026-01-22 00:10:01.807339767 +0000 UTC m=+0.058251578 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:10:01 compute-0 nova_compute[182935]: 2026-01-22 00:10:01.973 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:03.206 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:03.207 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:03.207 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:03 compute-0 nova_compute[182935]: 2026-01-22 00:10:03.494 182939 DEBUG nova.compute.manager [req-a6aba01d-88e3-4ebc-8974-90394ed25807 req-81e16934-d92c-40f2-9f82-e0514faa55ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Received event network-vif-unplugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:03 compute-0 nova_compute[182935]: 2026-01-22 00:10:03.495 182939 DEBUG oslo_concurrency.lockutils [req-a6aba01d-88e3-4ebc-8974-90394ed25807 req-81e16934-d92c-40f2-9f82-e0514faa55ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:03 compute-0 nova_compute[182935]: 2026-01-22 00:10:03.496 182939 DEBUG oslo_concurrency.lockutils [req-a6aba01d-88e3-4ebc-8974-90394ed25807 req-81e16934-d92c-40f2-9f82-e0514faa55ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:03 compute-0 nova_compute[182935]: 2026-01-22 00:10:03.496 182939 DEBUG oslo_concurrency.lockutils [req-a6aba01d-88e3-4ebc-8974-90394ed25807 req-81e16934-d92c-40f2-9f82-e0514faa55ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:03 compute-0 nova_compute[182935]: 2026-01-22 00:10:03.497 182939 DEBUG nova.compute.manager [req-a6aba01d-88e3-4ebc-8974-90394ed25807 req-81e16934-d92c-40f2-9f82-e0514faa55ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] No waiting events found dispatching network-vif-unplugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:03 compute-0 nova_compute[182935]: 2026-01-22 00:10:03.497 182939 DEBUG nova.compute.manager [req-a6aba01d-88e3-4ebc-8974-90394ed25807 req-81e16934-d92c-40f2-9f82-e0514faa55ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Received event network-vif-unplugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:10:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:04.046 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:10:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:04.047 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:10:04 compute-0 nova_compute[182935]: 2026-01-22 00:10:04.048 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:04 compute-0 nova_compute[182935]: 2026-01-22 00:10:04.253 182939 DEBUG nova.network.neutron [-] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:10:04 compute-0 nova_compute[182935]: 2026-01-22 00:10:04.827 182939 DEBUG nova.compute.manager [req-3f9ec6d2-2729-4719-b32c-4b685c9e7d99 req-9f077866-6c11-4e5f-86a6-361be3a39b5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Received event network-vif-deleted-3c6ae08f-a647-4ee0-be98-10c12c2d1911 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:04 compute-0 nova_compute[182935]: 2026-01-22 00:10:04.828 182939 INFO nova.compute.manager [req-3f9ec6d2-2729-4719-b32c-4b685c9e7d99 req-9f077866-6c11-4e5f-86a6-361be3a39b5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Neutron deleted interface 3c6ae08f-a647-4ee0-be98-10c12c2d1911; detaching it from the instance and deleting it from the info cache
Jan 22 00:10:04 compute-0 nova_compute[182935]: 2026-01-22 00:10:04.828 182939 DEBUG nova.network.neutron [req-3f9ec6d2-2729-4719-b32c-4b685c9e7d99 req-9f077866-6c11-4e5f-86a6-361be3a39b5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.221 182939 INFO nova.compute.manager [-] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Took 3.48 seconds to deallocate network for instance.
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.511 182939 DEBUG nova.compute.manager [req-3f9ec6d2-2729-4719-b32c-4b685c9e7d99 req-9f077866-6c11-4e5f-86a6-361be3a39b5f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Detach interface failed, port_id=3c6ae08f-a647-4ee0-be98-10c12c2d1911, reason: Instance fa6f75b7-e928-4d1c-8867-417b02ad70ec could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.780 182939 DEBUG nova.compute.manager [req-acf36361-6be8-40fa-90d4-3d9e1ea10456 req-7bd8e4e2-616f-4b60-9fa3-70d5ab77dfc2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Received event network-vif-plugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.781 182939 DEBUG oslo_concurrency.lockutils [req-acf36361-6be8-40fa-90d4-3d9e1ea10456 req-7bd8e4e2-616f-4b60-9fa3-70d5ab77dfc2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.781 182939 DEBUG oslo_concurrency.lockutils [req-acf36361-6be8-40fa-90d4-3d9e1ea10456 req-7bd8e4e2-616f-4b60-9fa3-70d5ab77dfc2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.781 182939 DEBUG oslo_concurrency.lockutils [req-acf36361-6be8-40fa-90d4-3d9e1ea10456 req-7bd8e4e2-616f-4b60-9fa3-70d5ab77dfc2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.782 182939 DEBUG nova.compute.manager [req-acf36361-6be8-40fa-90d4-3d9e1ea10456 req-7bd8e4e2-616f-4b60-9fa3-70d5ab77dfc2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] No waiting events found dispatching network-vif-plugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.782 182939 WARNING nova.compute.manager [req-acf36361-6be8-40fa-90d4-3d9e1ea10456 req-7bd8e4e2-616f-4b60-9fa3-70d5ab77dfc2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Received unexpected event network-vif-plugged-3c6ae08f-a647-4ee0-be98-10c12c2d1911 for instance with vm_state deleted and task_state None.
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.851 182939 DEBUG oslo_concurrency.lockutils [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.851 182939 DEBUG oslo_concurrency.lockutils [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.946 182939 DEBUG nova.compute.provider_tree [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:10:05 compute-0 nova_compute[182935]: 2026-01-22 00:10:05.982 182939 DEBUG nova.scheduler.client.report [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:10:06 compute-0 nova_compute[182935]: 2026-01-22 00:10:06.026 182939 DEBUG oslo_concurrency.lockutils [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:06 compute-0 nova_compute[182935]: 2026-01-22 00:10:06.100 182939 INFO nova.scheduler.client.report [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Deleted allocations for instance fa6f75b7-e928-4d1c-8867-417b02ad70ec
Jan 22 00:10:06 compute-0 nova_compute[182935]: 2026-01-22 00:10:06.241 182939 DEBUG oslo_concurrency.lockutils [None req-67e91e7c-2f09-4985-934f-5e6e9506efe8 7ee17110d4744f99aaa3a6e7f5704bec 99c50ccfe643400aa6cbd9e61e8ac16b - - default default] Lock "fa6f75b7-e928-4d1c-8867-417b02ad70ec" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:06 compute-0 nova_compute[182935]: 2026-01-22 00:10:06.592 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:07 compute-0 nova_compute[182935]: 2026-01-22 00:10:07.002 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:07 compute-0 podman[230698]: 2026-01-22 00:10:07.674988074 +0000 UTC m=+0.048139614 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:10:08 compute-0 sshd-session[230717]: Invalid user svn from 188.166.69.60 port 38188
Jan 22 00:10:08 compute-0 sshd-session[230717]: Connection closed by invalid user svn 188.166.69.60 port 38188 [preauth]
Jan 22 00:10:11 compute-0 nova_compute[182935]: 2026-01-22 00:10:11.596 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:12 compute-0 nova_compute[182935]: 2026-01-22 00:10:12.004 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:10:12.050 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:10:12 compute-0 podman[230719]: 2026-01-22 00:10:12.684959598 +0000 UTC m=+0.056956758 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, architecture=x86_64)
Jan 22 00:10:12 compute-0 podman[230720]: 2026-01-22 00:10:12.715799211 +0000 UTC m=+0.084211358 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 00:10:16 compute-0 nova_compute[182935]: 2026-01-22 00:10:16.553 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040601.550981, fa6f75b7-e928-4d1c-8867-417b02ad70ec => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:10:16 compute-0 nova_compute[182935]: 2026-01-22 00:10:16.554 182939 INFO nova.compute.manager [-] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] VM Stopped (Lifecycle Event)
Jan 22 00:10:16 compute-0 nova_compute[182935]: 2026-01-22 00:10:16.599 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:16 compute-0 nova_compute[182935]: 2026-01-22 00:10:16.622 182939 DEBUG nova.compute.manager [None req-df55d6fd-b7a9-4e47-aa83-cdcb69eb87df - - - - - -] [instance: fa6f75b7-e928-4d1c-8867-417b02ad70ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:10:17 compute-0 nova_compute[182935]: 2026-01-22 00:10:17.039 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:18 compute-0 nova_compute[182935]: 2026-01-22 00:10:18.257 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:21 compute-0 nova_compute[182935]: 2026-01-22 00:10:21.603 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:22 compute-0 nova_compute[182935]: 2026-01-22 00:10:22.041 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:10:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:10:26 compute-0 nova_compute[182935]: 2026-01-22 00:10:26.659 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:26 compute-0 podman[230764]: 2026-01-22 00:10:26.751793467 +0000 UTC m=+0.058114334 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:10:26 compute-0 podman[230763]: 2026-01-22 00:10:26.784837912 +0000 UTC m=+0.097174568 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 22 00:10:27 compute-0 nova_compute[182935]: 2026-01-22 00:10:27.043 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:31 compute-0 nova_compute[182935]: 2026-01-22 00:10:31.662 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:32 compute-0 nova_compute[182935]: 2026-01-22 00:10:32.070 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:32 compute-0 podman[230811]: 2026-01-22 00:10:32.692818472 +0000 UTC m=+0.065667788 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:10:36 compute-0 nova_compute[182935]: 2026-01-22 00:10:36.664 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:36 compute-0 nova_compute[182935]: 2026-01-22 00:10:36.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:36 compute-0 nova_compute[182935]: 2026-01-22 00:10:36.843 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:36 compute-0 nova_compute[182935]: 2026-01-22 00:10:36.843 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:36 compute-0 nova_compute[182935]: 2026-01-22 00:10:36.843 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:36 compute-0 nova_compute[182935]: 2026-01-22 00:10:36.844 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:10:37 compute-0 nova_compute[182935]: 2026-01-22 00:10:37.019 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:10:37 compute-0 nova_compute[182935]: 2026-01-22 00:10:37.020 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5702MB free_disk=73.12791061401367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:10:37 compute-0 nova_compute[182935]: 2026-01-22 00:10:37.020 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:10:37 compute-0 nova_compute[182935]: 2026-01-22 00:10:37.021 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:10:37 compute-0 nova_compute[182935]: 2026-01-22 00:10:37.073 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:37 compute-0 nova_compute[182935]: 2026-01-22 00:10:37.242 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:10:37 compute-0 nova_compute[182935]: 2026-01-22 00:10:37.242 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:10:37 compute-0 nova_compute[182935]: 2026-01-22 00:10:37.337 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:10:37 compute-0 nova_compute[182935]: 2026-01-22 00:10:37.355 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:10:37 compute-0 nova_compute[182935]: 2026-01-22 00:10:37.377 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:10:37 compute-0 nova_compute[182935]: 2026-01-22 00:10:37.377 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:10:38 compute-0 nova_compute[182935]: 2026-01-22 00:10:38.377 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:38 compute-0 nova_compute[182935]: 2026-01-22 00:10:38.378 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:10:38 compute-0 nova_compute[182935]: 2026-01-22 00:10:38.378 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:10:38 compute-0 nova_compute[182935]: 2026-01-22 00:10:38.461 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:10:38 compute-0 podman[230837]: 2026-01-22 00:10:38.678315122 +0000 UTC m=+0.050121208 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 00:10:38 compute-0 nova_compute[182935]: 2026-01-22 00:10:38.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:38 compute-0 nova_compute[182935]: 2026-01-22 00:10:38.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:10:40 compute-0 nova_compute[182935]: 2026-01-22 00:10:40.790 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:40 compute-0 nova_compute[182935]: 2026-01-22 00:10:40.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:40 compute-0 nova_compute[182935]: 2026-01-22 00:10:40.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:41 compute-0 nova_compute[182935]: 2026-01-22 00:10:41.667 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:41 compute-0 nova_compute[182935]: 2026-01-22 00:10:41.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:42 compute-0 nova_compute[182935]: 2026-01-22 00:10:42.075 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:43 compute-0 podman[230857]: 2026-01-22 00:10:43.684540301 +0000 UTC m=+0.056562599 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 
'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64)
Jan 22 00:10:43 compute-0 podman[230858]: 2026-01-22 00:10:43.697475299 +0000 UTC m=+0.065210488 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:10:44 compute-0 nova_compute[182935]: 2026-01-22 00:10:44.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:46 compute-0 nova_compute[182935]: 2026-01-22 00:10:46.671 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:47 compute-0 nova_compute[182935]: 2026-01-22 00:10:47.080 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:51 compute-0 nova_compute[182935]: 2026-01-22 00:10:51.674 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:51 compute-0 sshd-session[230898]: Invalid user docker from 188.166.69.60 port 59774
Jan 22 00:10:52 compute-0 sshd-session[230898]: Connection closed by invalid user docker 188.166.69.60 port 59774 [preauth]
Jan 22 00:10:52 compute-0 nova_compute[182935]: 2026-01-22 00:10:52.123 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:56 compute-0 nova_compute[182935]: 2026-01-22 00:10:56.677 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:56 compute-0 nova_compute[182935]: 2026-01-22 00:10:56.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:10:57 compute-0 nova_compute[182935]: 2026-01-22 00:10:57.125 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:10:57 compute-0 podman[230901]: 2026-01-22 00:10:57.70924682 +0000 UTC m=+0.067921642 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:10:57 compute-0 podman[230900]: 2026-01-22 00:10:57.716003015 +0000 UTC m=+0.086938491 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:11:01 compute-0 nova_compute[182935]: 2026-01-22 00:11:01.680 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:02 compute-0 nova_compute[182935]: 2026-01-22 00:11:02.171 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:11:03.208 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:11:03.208 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:11:03.208 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:03 compute-0 podman[230949]: 2026-01-22 00:11:03.700718997 +0000 UTC m=+0.071818402 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:11:06 compute-0 ovn_controller[95047]: 2026-01-22T00:11:06Z|00469|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 00:11:06 compute-0 nova_compute[182935]: 2026-01-22 00:11:06.715 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:07 compute-0 nova_compute[182935]: 2026-01-22 00:11:07.173 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:11:08.410 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:11:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:11:08.411 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:11:08 compute-0 nova_compute[182935]: 2026-01-22 00:11:08.411 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:09 compute-0 podman[230972]: 2026-01-22 00:11:09.673544766 +0000 UTC m=+0.051393040 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:11:11 compute-0 nova_compute[182935]: 2026-01-22 00:11:11.718 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:12 compute-0 nova_compute[182935]: 2026-01-22 00:11:12.175 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:14 compute-0 podman[230992]: 2026-01-22 00:11:14.69260721 +0000 UTC m=+0.064079802 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:11:14 compute-0 podman[230993]: 2026-01-22 00:11:14.714734371 +0000 UTC m=+0.086543541 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:11:16 compute-0 nova_compute[182935]: 2026-01-22 00:11:16.720 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:17 compute-0 nova_compute[182935]: 2026-01-22 00:11:17.177 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:11:17.412 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:11:21 compute-0 nova_compute[182935]: 2026-01-22 00:11:21.722 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:22 compute-0 nova_compute[182935]: 2026-01-22 00:11:22.179 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:26 compute-0 nova_compute[182935]: 2026-01-22 00:11:26.724 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:27 compute-0 nova_compute[182935]: 2026-01-22 00:11:27.181 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:28 compute-0 podman[231032]: 2026-01-22 00:11:28.685259247 +0000 UTC m=+0.056671561 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:11:28 compute-0 podman[231031]: 2026-01-22 00:11:28.711915813 +0000 UTC m=+0.088251421 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:11:31 compute-0 nova_compute[182935]: 2026-01-22 00:11:31.726 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:32 compute-0 nova_compute[182935]: 2026-01-22 00:11:32.183 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:34 compute-0 podman[231082]: 2026-01-22 00:11:34.670930392 +0000 UTC m=+0.049364524 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:11:36 compute-0 nova_compute[182935]: 2026-01-22 00:11:36.729 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:37 compute-0 sshd-session[231106]: Invalid user docker from 188.166.69.60 port 50384
Jan 22 00:11:37 compute-0 sshd-session[231106]: Connection closed by invalid user docker 188.166.69.60 port 50384 [preauth]
Jan 22 00:11:37 compute-0 nova_compute[182935]: 2026-01-22 00:11:37.186 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:38 compute-0 nova_compute[182935]: 2026-01-22 00:11:38.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:38 compute-0 nova_compute[182935]: 2026-01-22 00:11:38.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:11:38 compute-0 nova_compute[182935]: 2026-01-22 00:11:38.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:11:39 compute-0 nova_compute[182935]: 2026-01-22 00:11:39.101 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:11:39 compute-0 nova_compute[182935]: 2026-01-22 00:11:39.102 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:39 compute-0 nova_compute[182935]: 2026-01-22 00:11:39.171 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:39 compute-0 nova_compute[182935]: 2026-01-22 00:11:39.172 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:39 compute-0 nova_compute[182935]: 2026-01-22 00:11:39.172 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:39 compute-0 nova_compute[182935]: 2026-01-22 00:11:39.172 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:11:39 compute-0 nova_compute[182935]: 2026-01-22 00:11:39.328 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:11:39 compute-0 nova_compute[182935]: 2026-01-22 00:11:39.329 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5703MB free_disk=73.12789154052734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:11:39 compute-0 nova_compute[182935]: 2026-01-22 00:11:39.329 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:39 compute-0 nova_compute[182935]: 2026-01-22 00:11:39.329 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:40 compute-0 nova_compute[182935]: 2026-01-22 00:11:40.053 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:11:40 compute-0 nova_compute[182935]: 2026-01-22 00:11:40.053 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:11:40 compute-0 nova_compute[182935]: 2026-01-22 00:11:40.083 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:11:40 compute-0 nova_compute[182935]: 2026-01-22 00:11:40.113 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:11:40 compute-0 nova_compute[182935]: 2026-01-22 00:11:40.115 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:11:40 compute-0 nova_compute[182935]: 2026-01-22 00:11:40.115 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:40 compute-0 podman[231108]: 2026-01-22 00:11:40.687151344 +0000 UTC m=+0.059478009 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 00:11:40 compute-0 nova_compute[182935]: 2026-01-22 00:11:40.807 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:40 compute-0 nova_compute[182935]: 2026-01-22 00:11:40.807 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:40 compute-0 nova_compute[182935]: 2026-01-22 00:11:40.807 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:40 compute-0 nova_compute[182935]: 2026-01-22 00:11:40.807 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:11:41 compute-0 nova_compute[182935]: 2026-01-22 00:11:41.732 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:42 compute-0 nova_compute[182935]: 2026-01-22 00:11:42.189 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:42 compute-0 nova_compute[182935]: 2026-01-22 00:11:42.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:43 compute-0 nova_compute[182935]: 2026-01-22 00:11:43.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:45 compute-0 podman[231127]: 2026-01-22 00:11:45.700847389 +0000 UTC m=+0.069922294 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:11:45 compute-0 podman[231128]: 2026-01-22 00:11:45.703649684 +0000 UTC m=+0.068745306 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 00:11:46 compute-0 nova_compute[182935]: 2026-01-22 00:11:46.736 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:46 compute-0 nova_compute[182935]: 2026-01-22 00:11:46.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:47 compute-0 nova_compute[182935]: 2026-01-22 00:11:47.236 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:47 compute-0 nova_compute[182935]: 2026-01-22 00:11:47.631 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:47 compute-0 nova_compute[182935]: 2026-01-22 00:11:47.631 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:47 compute-0 nova_compute[182935]: 2026-01-22 00:11:47.847 182939 DEBUG nova.compute.manager [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:11:49 compute-0 nova_compute[182935]: 2026-01-22 00:11:49.749 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:49 compute-0 nova_compute[182935]: 2026-01-22 00:11:49.749 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:49 compute-0 nova_compute[182935]: 2026-01-22 00:11:49.755 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:11:49 compute-0 nova_compute[182935]: 2026-01-22 00:11:49.755 182939 INFO nova.compute.claims [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:11:50 compute-0 nova_compute[182935]: 2026-01-22 00:11:50.753 182939 DEBUG nova.compute.provider_tree [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:11:51 compute-0 nova_compute[182935]: 2026-01-22 00:11:51.020 182939 DEBUG nova.scheduler.client.report [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:11:51 compute-0 nova_compute[182935]: 2026-01-22 00:11:51.503 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:51 compute-0 nova_compute[182935]: 2026-01-22 00:11:51.504 182939 DEBUG nova.compute.manager [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:11:51 compute-0 nova_compute[182935]: 2026-01-22 00:11:51.739 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:51 compute-0 nova_compute[182935]: 2026-01-22 00:11:51.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:52 compute-0 nova_compute[182935]: 2026-01-22 00:11:52.236 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:52 compute-0 nova_compute[182935]: 2026-01-22 00:11:52.805 182939 DEBUG nova.compute.manager [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:11:52 compute-0 nova_compute[182935]: 2026-01-22 00:11:52.805 182939 DEBUG nova.network.neutron [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:11:52 compute-0 nova_compute[182935]: 2026-01-22 00:11:52.993 182939 DEBUG nova.policy [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:11:53 compute-0 nova_compute[182935]: 2026-01-22 00:11:53.287 182939 INFO nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:11:53 compute-0 nova_compute[182935]: 2026-01-22 00:11:53.700 182939 DEBUG nova.compute.manager [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.566 182939 DEBUG nova.compute.manager [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.568 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.568 182939 INFO nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Creating image(s)
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.569 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.570 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.571 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.587 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.651 182939 DEBUG nova.network.neutron [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Successfully created port: 58acff7a-ba54-40d0-8427-a8ff28f15fd2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.677 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.678 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.679 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.690 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.752 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.753 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.787 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.789 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.790 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.852 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.853 182939 DEBUG nova.virt.disk.api [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.853 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.919 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.920 182939 DEBUG nova.virt.disk.api [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:11:54 compute-0 nova_compute[182935]: 2026-01-22 00:11:54.920 182939 DEBUG nova.objects.instance [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 3f8ad853-99b9-4999-bafc-9d63e7f9726f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:11:55 compute-0 nova_compute[182935]: 2026-01-22 00:11:55.091 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:11:55 compute-0 nova_compute[182935]: 2026-01-22 00:11:55.091 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Ensure instance console log exists: /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:11:55 compute-0 nova_compute[182935]: 2026-01-22 00:11:55.092 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:11:55 compute-0 nova_compute[182935]: 2026-01-22 00:11:55.092 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:11:55 compute-0 nova_compute[182935]: 2026-01-22 00:11:55.092 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:11:56 compute-0 nova_compute[182935]: 2026-01-22 00:11:56.741 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:57 compute-0 nova_compute[182935]: 2026-01-22 00:11:57.238 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:11:57 compute-0 nova_compute[182935]: 2026-01-22 00:11:57.600 182939 DEBUG nova.network.neutron [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Successfully updated port: 58acff7a-ba54-40d0-8427-a8ff28f15fd2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:11:57 compute-0 nova_compute[182935]: 2026-01-22 00:11:57.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:11:57 compute-0 nova_compute[182935]: 2026-01-22 00:11:57.885 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:11:57 compute-0 nova_compute[182935]: 2026-01-22 00:11:57.886 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:11:57 compute-0 nova_compute[182935]: 2026-01-22 00:11:57.886 182939 DEBUG nova.network.neutron [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:11:57 compute-0 nova_compute[182935]: 2026-01-22 00:11:57.922 182939 DEBUG nova.compute.manager [req-ee119963-1e33-4c8c-8168-b8c3fd710b47 req-cc7dfbcf-8e79-4624-b86a-d8bf56fafc02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Received event network-changed-58acff7a-ba54-40d0-8427-a8ff28f15fd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:11:57 compute-0 nova_compute[182935]: 2026-01-22 00:11:57.922 182939 DEBUG nova.compute.manager [req-ee119963-1e33-4c8c-8168-b8c3fd710b47 req-cc7dfbcf-8e79-4624-b86a-d8bf56fafc02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Refreshing instance network info cache due to event network-changed-58acff7a-ba54-40d0-8427-a8ff28f15fd2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:11:57 compute-0 nova_compute[182935]: 2026-01-22 00:11:57.923 182939 DEBUG oslo_concurrency.lockutils [req-ee119963-1e33-4c8c-8168-b8c3fd710b47 req-cc7dfbcf-8e79-4624-b86a-d8bf56fafc02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:11:58 compute-0 nova_compute[182935]: 2026-01-22 00:11:58.550 182939 DEBUG nova.network.neutron [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:11:59 compute-0 podman[231185]: 2026-01-22 00:11:59.718985387 +0000 UTC m=+0.082512858 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:11:59 compute-0 podman[231184]: 2026-01-22 00:11:59.731369956 +0000 UTC m=+0.104188124 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:12:01 compute-0 nova_compute[182935]: 2026-01-22 00:12:01.744 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.275 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.754 182939 DEBUG nova.network.neutron [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updating instance_info_cache with network_info: [{"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.944 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.944 182939 DEBUG nova.compute.manager [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Instance network_info: |[{"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.945 182939 DEBUG oslo_concurrency.lockutils [req-ee119963-1e33-4c8c-8168-b8c3fd710b47 req-cc7dfbcf-8e79-4624-b86a-d8bf56fafc02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.945 182939 DEBUG nova.network.neutron [req-ee119963-1e33-4c8c-8168-b8c3fd710b47 req-cc7dfbcf-8e79-4624-b86a-d8bf56fafc02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Refreshing network info cache for port 58acff7a-ba54-40d0-8427-a8ff28f15fd2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.955 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Start _get_guest_xml network_info=[{"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.959 182939 WARNING nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.965 182939 DEBUG nova.virt.libvirt.host [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.965 182939 DEBUG nova.virt.libvirt.host [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.968 182939 DEBUG nova.virt.libvirt.host [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.969 182939 DEBUG nova.virt.libvirt.host [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.970 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.970 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.971 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.971 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.971 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.971 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.971 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.972 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.972 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.972 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.972 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.972 182939 DEBUG nova.virt.hardware [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.976 182939 DEBUG nova.virt.libvirt.vif [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:11:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-648879642',display_name='tempest-TestNetworkBasicOps-server-648879642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-648879642',id=117,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfmPsjlhriOW7dRWovArAk9oG0vlTV4UF00kpkAcU8b+GiH2HuvDQwwI391+p3h3ZkbsOGbPDPBtDG3YybgfwiG7zKi8EOlbFfoqTDO+fE0iPWic901YZUPREuM/hSbHA==',key_name='tempest-TestNetworkBasicOps-1936956188',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-jzui0m77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:11:53Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=3f8ad853-99b9-4999-bafc-9d63e7f9726f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.976 182939 DEBUG nova.network.os_vif_util [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.977 182939 DEBUG nova.network.os_vif_util [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:80:7a,bridge_name='br-int',has_traffic_filtering=True,id=58acff7a-ba54-40d0-8427-a8ff28f15fd2,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58acff7a-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:12:02 compute-0 nova_compute[182935]: 2026-01-22 00:12:02.978 182939 DEBUG nova.objects.instance [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f8ad853-99b9-4999-bafc-9d63e7f9726f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.158 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:12:03 compute-0 nova_compute[182935]:   <uuid>3f8ad853-99b9-4999-bafc-9d63e7f9726f</uuid>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   <name>instance-00000075</name>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkBasicOps-server-648879642</nova:name>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:12:02</nova:creationTime>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:12:03 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:12:03 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:12:03 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:12:03 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:12:03 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:12:03 compute-0 nova_compute[182935]:         <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:12:03 compute-0 nova_compute[182935]:         <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:12:03 compute-0 nova_compute[182935]:         <nova:port uuid="58acff7a-ba54-40d0-8427-a8ff28f15fd2">
Jan 22 00:12:03 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <system>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <entry name="serial">3f8ad853-99b9-4999-bafc-9d63e7f9726f</entry>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <entry name="uuid">3f8ad853-99b9-4999-bafc-9d63e7f9726f</entry>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     </system>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   <os>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   </os>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   <features>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   </features>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.config"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:fb:80:7a"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <target dev="tap58acff7a-ba"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/console.log" append="off"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <video>
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     </video>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:12:03 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:12:03 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:12:03 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:12:03 compute-0 nova_compute[182935]: </domain>
Jan 22 00:12:03 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.160 182939 DEBUG nova.compute.manager [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Preparing to wait for external event network-vif-plugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.161 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.161 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.161 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.162 182939 DEBUG nova.virt.libvirt.vif [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:11:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-648879642',display_name='tempest-TestNetworkBasicOps-server-648879642',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-648879642',id=117,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfmPsjlhriOW7dRWovArAk9oG0vlTV4UF00kpkAcU8b+GiH2HuvDQwwI391+p3h3ZkbsOGbPDPBtDG3YybgfwiG7zKi8EOlbFfoqTDO+fE0iPWic901YZUPREuM/hSbHA==',key_name='tempest-TestNetworkBasicOps-1936956188',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-jzui0m77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:11:53Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=3f8ad853-99b9-4999-bafc-9d63e7f9726f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.162 182939 DEBUG nova.network.os_vif_util [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.163 182939 DEBUG nova.network.os_vif_util [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:80:7a,bridge_name='br-int',has_traffic_filtering=True,id=58acff7a-ba54-40d0-8427-a8ff28f15fd2,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58acff7a-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.164 182939 DEBUG os_vif [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:80:7a,bridge_name='br-int',has_traffic_filtering=True,id=58acff7a-ba54-40d0-8427-a8ff28f15fd2,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58acff7a-ba') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.164 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.165 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.165 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.169 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.169 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap58acff7a-ba, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.170 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap58acff7a-ba, col_values=(('external_ids', {'iface-id': '58acff7a-ba54-40d0-8427-a8ff28f15fd2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fb:80:7a', 'vm-uuid': '3f8ad853-99b9-4999-bafc-9d63e7f9726f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.172 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:03 compute-0 NetworkManager[55139]: <info>  [1769040723.1729] manager: (tap58acff7a-ba): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.173 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.179 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.181 182939 INFO os_vif [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:80:7a,bridge_name='br-int',has_traffic_filtering=True,id=58acff7a-ba54-40d0-8427-a8ff28f15fd2,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58acff7a-ba')
Jan 22 00:12:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:03.208 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:03.209 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:03.209 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.296 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.296 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.296 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:fb:80:7a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:12:03 compute-0 nova_compute[182935]: 2026-01-22 00:12:03.297 182939 INFO nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Using config drive
Jan 22 00:12:04 compute-0 nova_compute[182935]: 2026-01-22 00:12:04.066 182939 INFO nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Creating config drive at /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.config
Jan 22 00:12:04 compute-0 nova_compute[182935]: 2026-01-22 00:12:04.072 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ql3u3vu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:12:04 compute-0 nova_compute[182935]: 2026-01-22 00:12:04.202 182939 DEBUG oslo_concurrency.processutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7ql3u3vu" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:12:04 compute-0 kernel: tap58acff7a-ba: entered promiscuous mode
Jan 22 00:12:04 compute-0 NetworkManager[55139]: <info>  [1769040724.2672] manager: (tap58acff7a-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/220)
Jan 22 00:12:04 compute-0 ovn_controller[95047]: 2026-01-22T00:12:04Z|00470|binding|INFO|Claiming lport 58acff7a-ba54-40d0-8427-a8ff28f15fd2 for this chassis.
Jan 22 00:12:04 compute-0 ovn_controller[95047]: 2026-01-22T00:12:04Z|00471|binding|INFO|58acff7a-ba54-40d0-8427-a8ff28f15fd2: Claiming fa:16:3e:fb:80:7a 10.100.0.6
Jan 22 00:12:04 compute-0 nova_compute[182935]: 2026-01-22 00:12:04.269 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:04 compute-0 nova_compute[182935]: 2026-01-22 00:12:04.272 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:04 compute-0 nova_compute[182935]: 2026-01-22 00:12:04.277 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:04 compute-0 systemd-udevd[231255]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:12:04 compute-0 systemd-machined[154182]: New machine qemu-61-instance-00000075.
Jan 22 00:12:04 compute-0 NetworkManager[55139]: <info>  [1769040724.3098] device (tap58acff7a-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:12:04 compute-0 NetworkManager[55139]: <info>  [1769040724.3103] device (tap58acff7a-ba): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:12:04 compute-0 ovn_controller[95047]: 2026-01-22T00:12:04Z|00472|binding|INFO|Setting lport 58acff7a-ba54-40d0-8427-a8ff28f15fd2 ovn-installed in OVS
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.378 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:80:7a 10.100.0.6'], port_security=['fa:16:3e:fb:80:7a 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c2d6e47-df04-4d44-b308-f39e21535b4b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d452ef76-084d-4578-ab80-dfb49c9c8f9b, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=58acff7a-ba54-40d0-8427-a8ff28f15fd2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:12:04 compute-0 ovn_controller[95047]: 2026-01-22T00:12:04Z|00473|binding|INFO|Setting lport 58acff7a-ba54-40d0-8427-a8ff28f15fd2 up in Southbound
Jan 22 00:12:04 compute-0 nova_compute[182935]: 2026-01-22 00:12:04.378 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.379 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 58acff7a-ba54-40d0-8427-a8ff28f15fd2 in datapath 88a7330a-aaa1-424a-b4dc-f7500e450abb bound to our chassis
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.380 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88a7330a-aaa1-424a-b4dc-f7500e450abb
Jan 22 00:12:04 compute-0 systemd[1]: Started Virtual Machine qemu-61-instance-00000075.
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.392 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d19ed1af-ebd4-49c4-9183-97bb678648f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.393 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88a7330a-a1 in ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.396 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88a7330a-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.396 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c04fb662-29b9-43ed-b627-606faee82f40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.397 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[642634bf-9992-46c9-b813-a29b8e546129]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.412 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[08bcc0d1-b9c1-4dea-a5ab-917ee56edc77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.438 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b9c2dbe5-e1b5-4a14-a9ed-63b1bd2c57be]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.469 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8772e8dd-870d-41f6-93b8-a8f1ff38fcf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.474 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce16195-bd55-4828-a0ac-0f8f4b7d86b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 systemd-udevd[231258]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:12:04 compute-0 NetworkManager[55139]: <info>  [1769040724.4764] manager: (tap88a7330a-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/221)
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.506 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[738e955d-5a79-4c8e-b7be-860c75375b2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.511 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d535751a-f3b3-48ea-b74b-217e927fa780]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 NetworkManager[55139]: <info>  [1769040724.5315] device (tap88a7330a-a0): carrier: link connected
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.540 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[1c7fcc0c-d6a3-43e9-ae3a-80deff2f6aaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.557 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d034331a-8730-49f7-8b99-2227d1192896]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88a7330a-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:32:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522727, 'reachable_time': 37594, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231289, 'error': None, 'target': 'ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.573 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7abfb15e-5aae-4c0d-85ac-bf75884c41bf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6d:3252'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 522727, 'tstamp': 522727}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231290, 'error': None, 'target': 'ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.590 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8f755070-e014-4ba9-981e-860cbc8ca030]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88a7330a-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6d:32:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 180, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522727, 'reachable_time': 37594, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231291, 'error': None, 'target': 'ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.626 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d22d0ea3-f00a-4417-b4c4-6388452108fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.710 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[09462652-a725-42e4-a5c5-4794a2a502ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.712 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88a7330a-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.713 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.714 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88a7330a-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:12:04 compute-0 nova_compute[182935]: 2026-01-22 00:12:04.717 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:04 compute-0 kernel: tap88a7330a-a0: entered promiscuous mode
Jan 22 00:12:04 compute-0 NetworkManager[55139]: <info>  [1769040724.7189] manager: (tap88a7330a-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/222)
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.719 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88a7330a-a0, col_values=(('external_ids', {'iface-id': 'f63f34ac-9af7-4a13-911f-2c9f043a5c66'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:12:04 compute-0 nova_compute[182935]: 2026-01-22 00:12:04.720 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:04 compute-0 ovn_controller[95047]: 2026-01-22T00:12:04Z|00474|binding|INFO|Releasing lport f63f34ac-9af7-4a13-911f-2c9f043a5c66 from this chassis (sb_readonly=0)
Jan 22 00:12:04 compute-0 nova_compute[182935]: 2026-01-22 00:12:04.722 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.723 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88a7330a-aaa1-424a-b4dc-f7500e450abb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88a7330a-aaa1-424a-b4dc-f7500e450abb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.732 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[93661160-1dd2-4885-bd2f-91dc497f59fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:12:04 compute-0 nova_compute[182935]: 2026-01-22 00:12:04.733 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.734 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-88a7330a-aaa1-424a-b4dc-f7500e450abb
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/88a7330a-aaa1-424a-b4dc-f7500e450abb.pid.haproxy
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 88a7330a-aaa1-424a-b4dc-f7500e450abb
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:12:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:04.735 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'env', 'PROCESS_TAG=haproxy-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88a7330a-aaa1-424a-b4dc-f7500e450abb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:12:05 compute-0 podman[231323]: 2026-01-22 00:12:05.091946759 +0000 UTC m=+0.058074816 container create ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 00:12:05 compute-0 systemd[1]: Started libpod-conmon-ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2.scope.
Jan 22 00:12:05 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:12:05 compute-0 podman[231323]: 2026-01-22 00:12:05.06156556 +0000 UTC m=+0.027693637 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:12:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c829cd4c5da113e24294ca329231c67e0864c7abb12b9e2a64d7dc86f3756574/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:12:05 compute-0 podman[231323]: 2026-01-22 00:12:05.166844018 +0000 UTC m=+0.132972095 container init ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:12:05 compute-0 podman[231323]: 2026-01-22 00:12:05.172610443 +0000 UTC m=+0.138738500 container start ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:12:05 compute-0 podman[231341]: 2026-01-22 00:12:05.178835528 +0000 UTC m=+0.062328316 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:12:05 compute-0 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[231350]: [NOTICE]   (231373) : New worker (231375) forked
Jan 22 00:12:05 compute-0 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[231350]: [NOTICE]   (231373) : Loading success.
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.201 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040725.2005033, 3f8ad853-99b9-4999-bafc-9d63e7f9726f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.202 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] VM Started (Lifecycle Event)
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.478 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.483 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040725.2008288, 3f8ad853-99b9-4999-bafc-9d63e7f9726f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.483 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] VM Paused (Lifecycle Event)
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.582 182939 DEBUG nova.compute.manager [req-6d56f573-9359-46d1-b2b2-d131da5332a1 req-73da5aa3-0390-4dfd-8232-e4f8f95fa9f8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Received event network-vif-plugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.583 182939 DEBUG oslo_concurrency.lockutils [req-6d56f573-9359-46d1-b2b2-d131da5332a1 req-73da5aa3-0390-4dfd-8232-e4f8f95fa9f8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.583 182939 DEBUG oslo_concurrency.lockutils [req-6d56f573-9359-46d1-b2b2-d131da5332a1 req-73da5aa3-0390-4dfd-8232-e4f8f95fa9f8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.583 182939 DEBUG oslo_concurrency.lockutils [req-6d56f573-9359-46d1-b2b2-d131da5332a1 req-73da5aa3-0390-4dfd-8232-e4f8f95fa9f8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.583 182939 DEBUG nova.compute.manager [req-6d56f573-9359-46d1-b2b2-d131da5332a1 req-73da5aa3-0390-4dfd-8232-e4f8f95fa9f8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Processing event network-vif-plugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.584 182939 DEBUG nova.compute.manager [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.588 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.591 182939 INFO nova.virt.libvirt.driver [-] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Instance spawned successfully.
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.592 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.607 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.613 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040725.588374, 3f8ad853-99b9-4999-bafc-9d63e7f9726f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.613 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] VM Resumed (Lifecycle Event)
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.618 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.619 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.619 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.620 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.620 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.620 182939 DEBUG nova.virt.libvirt.driver [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.633 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.635 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.638 182939 DEBUG nova.network.neutron [req-ee119963-1e33-4c8c-8168-b8c3fd710b47 req-cc7dfbcf-8e79-4624-b86a-d8bf56fafc02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updated VIF entry in instance network info cache for port 58acff7a-ba54-40d0-8427-a8ff28f15fd2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:12:05 compute-0 nova_compute[182935]: 2026-01-22 00:12:05.638 182939 DEBUG nova.network.neutron [req-ee119963-1e33-4c8c-8168-b8c3fd710b47 req-cc7dfbcf-8e79-4624-b86a-d8bf56fafc02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updating instance_info_cache with network_info: [{"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:06 compute-0 nova_compute[182935]: 2026-01-22 00:12:06.007 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:12:06 compute-0 nova_compute[182935]: 2026-01-22 00:12:06.008 182939 DEBUG oslo_concurrency.lockutils [req-ee119963-1e33-4c8c-8168-b8c3fd710b47 req-cc7dfbcf-8e79-4624-b86a-d8bf56fafc02 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:12:06 compute-0 nova_compute[182935]: 2026-01-22 00:12:06.081 182939 INFO nova.compute.manager [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Took 11.51 seconds to spawn the instance on the hypervisor.
Jan 22 00:12:06 compute-0 nova_compute[182935]: 2026-01-22 00:12:06.082 182939 DEBUG nova.compute.manager [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:12:07 compute-0 nova_compute[182935]: 2026-01-22 00:12:07.276 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:07 compute-0 nova_compute[182935]: 2026-01-22 00:12:07.729 182939 DEBUG nova.compute.manager [req-1c3df375-0221-4e5c-bc7a-df908c992069 req-44ed97c3-6319-4f60-9123-cdf9d0703ed8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Received event network-vif-plugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:07 compute-0 nova_compute[182935]: 2026-01-22 00:12:07.729 182939 DEBUG oslo_concurrency.lockutils [req-1c3df375-0221-4e5c-bc7a-df908c992069 req-44ed97c3-6319-4f60-9123-cdf9d0703ed8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:07 compute-0 nova_compute[182935]: 2026-01-22 00:12:07.730 182939 DEBUG oslo_concurrency.lockutils [req-1c3df375-0221-4e5c-bc7a-df908c992069 req-44ed97c3-6319-4f60-9123-cdf9d0703ed8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:07 compute-0 nova_compute[182935]: 2026-01-22 00:12:07.730 182939 DEBUG oslo_concurrency.lockutils [req-1c3df375-0221-4e5c-bc7a-df908c992069 req-44ed97c3-6319-4f60-9123-cdf9d0703ed8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:07 compute-0 nova_compute[182935]: 2026-01-22 00:12:07.730 182939 DEBUG nova.compute.manager [req-1c3df375-0221-4e5c-bc7a-df908c992069 req-44ed97c3-6319-4f60-9123-cdf9d0703ed8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] No waiting events found dispatching network-vif-plugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:12:07 compute-0 nova_compute[182935]: 2026-01-22 00:12:07.731 182939 WARNING nova.compute.manager [req-1c3df375-0221-4e5c-bc7a-df908c992069 req-44ed97c3-6319-4f60-9123-cdf9d0703ed8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Received unexpected event network-vif-plugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 for instance with vm_state active and task_state None.
Jan 22 00:12:07 compute-0 nova_compute[182935]: 2026-01-22 00:12:07.859 182939 INFO nova.compute.manager [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Took 18.66 seconds to build instance.
Jan 22 00:12:08 compute-0 nova_compute[182935]: 2026-01-22 00:12:08.072 182939 DEBUG oslo_concurrency.lockutils [None req-c369b132-0808-45c0-8adf-baafd240da50 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.440s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:08 compute-0 nova_compute[182935]: 2026-01-22 00:12:08.173 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:08 compute-0 nova_compute[182935]: 2026-01-22 00:12:08.973 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:08.973 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:12:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:08.976 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:12:11 compute-0 podman[231385]: 2026-01-22 00:12:11.68586603 +0000 UTC m=+0.058731322 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:12:12 compute-0 nova_compute[182935]: 2026-01-22 00:12:12.277 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:13 compute-0 NetworkManager[55139]: <info>  [1769040733.1044] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Jan 22 00:12:13 compute-0 NetworkManager[55139]: <info>  [1769040733.1051] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Jan 22 00:12:13 compute-0 nova_compute[182935]: 2026-01-22 00:12:13.104 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:13 compute-0 nova_compute[182935]: 2026-01-22 00:12:13.167 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:13 compute-0 ovn_controller[95047]: 2026-01-22T00:12:13Z|00475|binding|INFO|Releasing lport f63f34ac-9af7-4a13-911f-2c9f043a5c66 from this chassis (sb_readonly=0)
Jan 22 00:12:13 compute-0 nova_compute[182935]: 2026-01-22 00:12:13.174 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:13 compute-0 nova_compute[182935]: 2026-01-22 00:12:13.176 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:14.978 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:12:16 compute-0 podman[231408]: 2026-01-22 00:12:16.698759885 +0000 UTC m=+0.067617539 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Jan 22 00:12:16 compute-0 podman[231407]: 2026-01-22 00:12:16.718459575 +0000 UTC m=+0.081478693 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:12:16 compute-0 nova_compute[182935]: 2026-01-22 00:12:16.859 182939 DEBUG nova.compute.manager [req-4aaf3409-ef35-412b-bfb2-2109c4fb9e03 req-30e10031-0278-49c3-9c70-09b9c0955bb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Received event network-changed-58acff7a-ba54-40d0-8427-a8ff28f15fd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:16 compute-0 nova_compute[182935]: 2026-01-22 00:12:16.860 182939 DEBUG nova.compute.manager [req-4aaf3409-ef35-412b-bfb2-2109c4fb9e03 req-30e10031-0278-49c3-9c70-09b9c0955bb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Refreshing instance network info cache due to event network-changed-58acff7a-ba54-40d0-8427-a8ff28f15fd2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:12:16 compute-0 nova_compute[182935]: 2026-01-22 00:12:16.860 182939 DEBUG oslo_concurrency.lockutils [req-4aaf3409-ef35-412b-bfb2-2109c4fb9e03 req-30e10031-0278-49c3-9c70-09b9c0955bb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:12:16 compute-0 nova_compute[182935]: 2026-01-22 00:12:16.860 182939 DEBUG oslo_concurrency.lockutils [req-4aaf3409-ef35-412b-bfb2-2109c4fb9e03 req-30e10031-0278-49c3-9c70-09b9c0955bb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:12:16 compute-0 nova_compute[182935]: 2026-01-22 00:12:16.860 182939 DEBUG nova.network.neutron [req-4aaf3409-ef35-412b-bfb2-2109c4fb9e03 req-30e10031-0278-49c3-9c70-09b9c0955bb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Refreshing network info cache for port 58acff7a-ba54-40d0-8427-a8ff28f15fd2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:12:17 compute-0 nova_compute[182935]: 2026-01-22 00:12:17.279 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:18 compute-0 nova_compute[182935]: 2026-01-22 00:12:18.177 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:19 compute-0 ovn_controller[95047]: 2026-01-22T00:12:19Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fb:80:7a 10.100.0.6
Jan 22 00:12:19 compute-0 ovn_controller[95047]: 2026-01-22T00:12:19Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fb:80:7a 10.100.0.6
Jan 22 00:12:20 compute-0 nova_compute[182935]: 2026-01-22 00:12:20.402 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:20 compute-0 sshd-session[231463]: Invalid user docker from 188.166.69.60 port 57246
Jan 22 00:12:20 compute-0 sshd-session[231463]: Connection closed by invalid user docker 188.166.69.60 port 57246 [preauth]
Jan 22 00:12:21 compute-0 nova_compute[182935]: 2026-01-22 00:12:21.650 182939 DEBUG nova.network.neutron [req-4aaf3409-ef35-412b-bfb2-2109c4fb9e03 req-30e10031-0278-49c3-9c70-09b9c0955bb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updated VIF entry in instance network info cache for port 58acff7a-ba54-40d0-8427-a8ff28f15fd2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:12:21 compute-0 nova_compute[182935]: 2026-01-22 00:12:21.650 182939 DEBUG nova.network.neutron [req-4aaf3409-ef35-412b-bfb2-2109c4fb9e03 req-30e10031-0278-49c3-9c70-09b9c0955bb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updating instance_info_cache with network_info: [{"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:21 compute-0 nova_compute[182935]: 2026-01-22 00:12:21.880 182939 DEBUG oslo_concurrency.lockutils [req-4aaf3409-ef35-412b-bfb2-2109c4fb9e03 req-30e10031-0278-49c3-9c70-09b9c0955bb8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:12:22 compute-0 nova_compute[182935]: 2026-01-22 00:12:22.282 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:23 compute-0 nova_compute[182935]: 2026-01-22 00:12:23.212 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.317 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'name': 'tempest-TestNetworkBasicOps-server-648879642', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000075', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '34b96b4037d24a0ea19383ca2477b2fd', 'user_id': '833f1e9dce90456ea55a443da6704907', 'hostId': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.318 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.333 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.335 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f450cdf8-2ebe-4ba9-92e4-48a6a66465eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-vda', 'timestamp': '2026-01-22T00:12:23.319215', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06805680-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.113590112, 'message_signature': '98e024cf7a4ac9dd74873c824cf1f685e8d14e743a4b7098df66adb30305558c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 
'3f8ad853-99b9-4999-bafc-9d63e7f9726f-sda', 'timestamp': '2026-01-22T00:12:23.319215', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '068079f8-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.113590112, 'message_signature': '02a41f434dcc30ed87480e12a1e6f08b889c3bb4a346993aee72a135bf96c4df'}]}, 'timestamp': '2026-01-22 00:12:23.335768', '_unique_id': 'b0b8ee9d2bbe4b92930d663f0bace928'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.341 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.343 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.344 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d2df8d9-5964-43a4-81b3-2c597f4e43e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-vda', 'timestamp': '2026-01-22T00:12:23.343838', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0681cf24-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.113590112, 'message_signature': 'a835e3a00ee7008959af3a34753872e186eefa2ab1c36d5d969bfdaabc04f9c1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 
'3f8ad853-99b9-4999-bafc-9d63e7f9726f-sda', 'timestamp': '2026-01-22T00:12:23.343838', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0681dd0c-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.113590112, 'message_signature': '6ca6a99475ddafcd4fa5c3bdaaeb5e14257f76a0619131519f4ff9b29cff2d28'}]}, 'timestamp': '2026-01-22 00:12:23.344596', '_unique_id': '47100278513e45fcb99cddddba1102ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.346 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.346 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.347 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-648879642>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-648879642>]
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.347 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.347 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.347 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-648879642>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-648879642>]
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.347 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.348 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.348 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-648879642>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-648879642>]
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.348 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.375 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.read.latency volume: 141450268 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.376 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.read.latency volume: 20455461 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '149890b7-c641-4c02-98f5-9772d335f3d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 141450268, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-vda', 'timestamp': '2026-01-22T00:12:23.348533', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0686b7c8-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': '3f1e79582732d89aec372177848dfdd37523f7a418ecf7e4d259859d82b65a16'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20455461, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 
'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-sda', 'timestamp': '2026-01-22T00:12:23.348533', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0686c902-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': '6fbe485c726c828f658d6af813d6a9cd33a971e89f989a25f6e4a2477018cb35'}]}, 'timestamp': '2026-01-22 00:12:23.376936', '_unique_id': '9a8d46912f1c4756a14eae866fd5f8a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.378 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.379 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.380 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.read.requests volume: 1086 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.380 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b98c5ff4-ceb9-4745-8c94-4a4d8c4bb65f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1086, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-vda', 'timestamp': '2026-01-22T00:12:23.380028', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0687543a-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': '577c4ebe01218ed6dbb58fb30262bc1f1326fa25273c0b977fa2493bb8689b58'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': 
None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-sda', 'timestamp': '2026-01-22T00:12:23.380028', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '06876196-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': 'e1eacbb5effbb98e055c7f1740cff0a582d4881a096e39f6dcb51e852695f55c'}]}, 'timestamp': '2026-01-22 00:12:23.380749', '_unique_id': 'c451e719216a49c0a5185165ff18f25e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.382 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.389 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3f8ad853-99b9-4999-bafc-9d63e7f9726f / tap58acff7a-ba inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.389 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be1edc77-771e-4752-b176-e2cd97a7180c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000075-3f8ad853-99b9-4999-bafc-9d63e7f9726f-tap58acff7a-ba', 'timestamp': '2026-01-22T00:12:23.382727', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'tap58acff7a-ba', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:80:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58acff7a-ba'}, 'message_id': '0688c68a-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.177034003, 'message_signature': 'd9b09d7c27caa142fae3a637bf52797abe4c21771bc42c3e72c56e64c0851696'}]}, 'timestamp': '2026-01-22 00:12:23.390052', '_unique_id': 'fc04110084aa4e5d8e298fcb4ea6e6c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.392 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.392 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.392 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-648879642>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-648879642>]
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.392 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.392 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.read.bytes volume: 30202368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22ed7734-d7e8-45ce-a2e5-1a5a2cc25a3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30202368, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-vda', 'timestamp': '2026-01-22T00:12:23.392883', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0689470e-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': '8c6450940495990c5d8761fa8a818e75ced5866a99a68ad980be1251f628dbdf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-sda', 'timestamp': '2026-01-22T00:12:23.392883', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '068952c6-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': 'a07169aecb17724e98aa3520219bf25f416a59bc565d7fbb4577708009fbec5d'}]}, 'timestamp': '2026-01-22 00:12:23.393455', '_unique_id': '1f2e1e78cc6248d68ed6866f4525eade'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.393 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.394 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '827b24b6-40b9-4062-b695-43143e05f49a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000075-3f8ad853-99b9-4999-bafc-9d63e7f9726f-tap58acff7a-ba', 'timestamp': '2026-01-22T00:12:23.395057', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'tap58acff7a-ba', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:80:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58acff7a-ba'}, 'message_id': '06899d4e-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.177034003, 'message_signature': 'e79bfb0238bddc087f8f0418bffdd84494a5938d7fe23bf7052ad51f8c3a41a4'}]}, 'timestamp': '2026-01-22 00:12:23.395379', '_unique_id': '0c3ea0d73113437998219c44b882b8cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.395 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.397 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.397 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bf07682-8d69-490c-b0ad-a19e70d8cb1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000075-3f8ad853-99b9-4999-bafc-9d63e7f9726f-tap58acff7a-ba', 'timestamp': '2026-01-22T00:12:23.397228', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'tap58acff7a-ba', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:80:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58acff7a-ba'}, 'message_id': '0689f262-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.177034003, 'message_signature': 'c6e2782efd5903a5f09c81bfd74970b0d9e3838974374998cda73fb4499e4743'}]}, 'timestamp': '2026-01-22 00:12:23.397551', '_unique_id': 'b44ce8a200eb4d809722184a2ac285ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.398 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/network.incoming.bytes volume: 1646 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70bffec3-589e-4726-bfd2-0ca5ee0a7c03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1646, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000075-3f8ad853-99b9-4999-bafc-9d63e7f9726f-tap58acff7a-ba', 'timestamp': '2026-01-22T00:12:23.398686', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'tap58acff7a-ba', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:80:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58acff7a-ba'}, 'message_id': '068a2868-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.177034003, 'message_signature': 'b7bb5e36a7e02cba94f010c3e15e6e09cbd1c76e7335d8ea74d04d2840d7b575'}]}, 'timestamp': '2026-01-22 00:12:23.398932', '_unique_id': 'fd6eb77cc0fb4857a1dd4548a2443e38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.400 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.400 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.write.bytes volume: 72753152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.400 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55a1c441-aad5-4b0b-8b75-b08c4849c61e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72753152, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-vda', 'timestamp': '2026-01-22T00:12:23.400097', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '068a5ee6-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': 'a6bbaa0a51ae27d4736632da29bc3c028e47cda04c797980860288bf2e180fa3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-sda', 'timestamp': '2026-01-22T00:12:23.400097', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '068a68dc-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': 'e3092840c194621dda94331ea161464af7760c48b82cbbdbfd411b5388992496'}]}, 'timestamp': '2026-01-22 00:12:23.400569', '_unique_id': 'cc553f890ef7417196bbd8284c6d01ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.write.latency volume: 4040733186 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.401 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b316b64-61ba-4da1-b400-acd684240b34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4040733186, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-vda', 'timestamp': '2026-01-22T00:12:23.401715', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '068aa086-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': '63c2a659fa8a7dc3f7b771ef6c83d6fe434f4d5a685abdae0736d7de0a876975'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-sda', 'timestamp': '2026-01-22T00:12:23.401715', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '068aa8b0-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': 'b6ddc2ef84c00a85182532067b0bb6fc9028b09b32419d75444d5434858cdaf9'}]}, 'timestamp': '2026-01-22 00:12:23.402200', '_unique_id': 'c8a4e61ea09843c1af00d74b2bbbb98d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.402 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.403 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.403 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '593c86d8-c5c7-4a3a-a421-e46fa2c60231', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000075-3f8ad853-99b9-4999-bafc-9d63e7f9726f-tap58acff7a-ba', 'timestamp': '2026-01-22T00:12:23.403497', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'tap58acff7a-ba', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:80:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58acff7a-ba'}, 'message_id': '068ae410-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.177034003, 'message_signature': 'bbc7364fc3939b0ba8190ef5efb4aa212811e24734a312e629f08da45a58be9e'}]}, 'timestamp': '2026-01-22 00:12:23.403755', '_unique_id': '806967e803234963890c2929caa8f12e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.404 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66c72cbd-3abf-4a9e-a02b-3a3e1ad301f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000075-3f8ad853-99b9-4999-bafc-9d63e7f9726f-tap58acff7a-ba', 'timestamp': '2026-01-22T00:12:23.404986', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'tap58acff7a-ba', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:80:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58acff7a-ba'}, 'message_id': '068b1df4-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.177034003, 'message_signature': '40cb54938e5fa7f12670bbc3504002b481de85b6bc1e24eb1d3606e2f1a11c58'}]}, 'timestamp': '2026-01-22 00:12:23.405217', '_unique_id': '7b5f83156bd14ad0b04372f1c589c074'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.406 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.406 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.allocation volume: 30023680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.406 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2f49e0b-1f27-4ceb-ae4d-0ce2d150005f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30023680, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-vda', 'timestamp': '2026-01-22T00:12:23.406450', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '068b5a1c-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.113590112, 'message_signature': '7d93a71d685abad22508f7c8bd3f14023cc3a8ff0a6bdf8e116c6ed503555985'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-sda', 'timestamp': '2026-01-22T00:12:23.406450', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '068b652a-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.113590112, 'message_signature': '99f1e27bf354d31b49e0732c31e542f41f264d433b11c4868dc7a8c1370916fe'}]}, 'timestamp': '2026-01-22 00:12:23.407064', '_unique_id': 'daa33254a7304a51a714f4daab11477d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.408 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.408 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e07433f6-2ad0-41e3-af87-4469cb989caf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000075-3f8ad853-99b9-4999-bafc-9d63e7f9726f-tap58acff7a-ba', 'timestamp': '2026-01-22T00:12:23.408311', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'tap58acff7a-ba', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:80:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58acff7a-ba'}, 'message_id': '068b9ff4-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.177034003, 'message_signature': '6d75f1e88dd4123de4b2d1c97ab9a73f2feda2f8008b7f42b64a7948a622d840'}]}, 'timestamp': '2026-01-22 00:12:23.408547', '_unique_id': 'db2aa41284a549d0a9061fc48149ecca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.409 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.425 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/cpu volume: 11790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8eaa9bc-e42e-4e0b-9e45-f721fbda5be2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11790000000, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'timestamp': '2026-01-22T00:12:23.409755', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '068e3926-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.219380901, 'message_signature': '7cfede81b3e21739e79b759f9f0b47a15be21fd36d84b126e21d8cd82a0cbea5'}]}, 'timestamp': '2026-01-22 00:12:23.425611', '_unique_id': 'afa8ee79244a415b82cd187ecbbd74ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd46eec41-8cea-4811-9101-cce283c411f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000075-3f8ad853-99b9-4999-bafc-9d63e7f9726f-tap58acff7a-ba', 'timestamp': '2026-01-22T00:12:23.427137', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'tap58acff7a-ba', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:80:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58acff7a-ba'}, 'message_id': '068e7f6c-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.177034003, 'message_signature': 'd044ae9dc61d435e6511e15979dda626bab75ce2997471021220f8020d2928c3'}]}, 'timestamp': '2026-01-22 00:12:23.427399', '_unique_id': 'da5f499276af41b29c85989c07db974b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.427 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.428 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.428 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cda328e5-0d1e-4aea-849f-a3e5e88b5996', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'timestamp': '2026-01-22T00:12:23.428560', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '068eb6d0-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.219380901, 'message_signature': '4d4e5fabd1d6cfb1793474da7ee01aa7f73d56cadc6de7633f2cb82211e88fc0'}]}, 'timestamp': '2026-01-22 00:12:23.428818', '_unique_id': 'e0d69e64e3394f8f839bfa0723f584f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.429 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2217436-0754-4752-8552-47772ab0f165', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000075-3f8ad853-99b9-4999-bafc-9d63e7f9726f-tap58acff7a-ba', 'timestamp': '2026-01-22T00:12:23.429945', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'tap58acff7a-ba', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:80:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58acff7a-ba'}, 'message_id': '068eece0-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.177034003, 'message_signature': '23120421a076b1ddbe4642c416cbc6003073d837f3ffaf0db7a2a78ea792e1ec'}]}, 'timestamp': '2026-01-22 00:12:23.430223', '_unique_id': '2fb3194ff3a24d49bace8ee67a3b8574'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.431 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.431 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/network.outgoing.bytes volume: 1396 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53ca16e2-a1b1-4152-ba9b-f807b32f8a19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1396, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': 'instance-00000075-3f8ad853-99b9-4999-bafc-9d63e7f9726f-tap58acff7a-ba', 'timestamp': '2026-01-22T00:12:23.431349', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'tap58acff7a-ba', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:80:7a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap58acff7a-ba'}, 'message_id': '068f23ae-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.177034003, 'message_signature': '23b07aeff2dc4e1abe06750becc166462107352e23d6495cbc7442dc9d780550'}]}, 'timestamp': '2026-01-22 00:12:23.431574', '_unique_id': 'd2211fb49fec4d3faf1e74f2aed317ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.write.requests volume: 298 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.432 12 DEBUG ceilometer.compute.pollsters [-] 3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df00202c-0279-4884-9bb2-1738525fbad4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 298, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-vda', 'timestamp': '2026-01-22T00:12:23.432724', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '068f5a04-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': '7f20d704725d59549474b50f7c54f0418258150ed3ff10a463523581b23960b7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_name': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_name': None, 'resource_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f-sda', 'timestamp': '2026-01-22T00:12:23.432724', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-648879642', 'name': 'instance-00000075', 'instance_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'instance_type': 'm1.nano', 'host': '8827de75cfe113d77b8a2463485f3b27d31eb9ba553e14a4d78b8c4f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '068f6210-f727-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5246.142850295, 'message_signature': '79ea3ffa984de8fc732b4852ea520a072bf502bc05ccbdf4f8154c0849d95cd7'}]}, 'timestamp': '2026-01-22 00:12:23.433159', '_unique_id': '4ad330140597414cad89b8e2454d8a4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:12:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:12:23.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:12:25 compute-0 nova_compute[182935]: 2026-01-22 00:12:25.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:26 compute-0 nova_compute[182935]: 2026-01-22 00:12:26.231 182939 INFO nova.compute.manager [None req-71097397-f376-4942-8dc6-b357ac06beb9 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Get console output
Jan 22 00:12:26 compute-0 nova_compute[182935]: 2026-01-22 00:12:26.321 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:12:27 compute-0 nova_compute[182935]: 2026-01-22 00:12:27.284 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:28 compute-0 nova_compute[182935]: 2026-01-22 00:12:28.214 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:29 compute-0 nova_compute[182935]: 2026-01-22 00:12:29.344 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:29 compute-0 nova_compute[182935]: 2026-01-22 00:12:29.427 182939 DEBUG nova.compute.manager [req-13f8982f-e704-470d-90c5-ffc867791334 req-6d18f045-f7d4-4f97-84e7-8356471f82b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Received event network-changed-58acff7a-ba54-40d0-8427-a8ff28f15fd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:12:29 compute-0 nova_compute[182935]: 2026-01-22 00:12:29.427 182939 DEBUG nova.compute.manager [req-13f8982f-e704-470d-90c5-ffc867791334 req-6d18f045-f7d4-4f97-84e7-8356471f82b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Refreshing instance network info cache due to event network-changed-58acff7a-ba54-40d0-8427-a8ff28f15fd2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:12:29 compute-0 nova_compute[182935]: 2026-01-22 00:12:29.427 182939 DEBUG oslo_concurrency.lockutils [req-13f8982f-e704-470d-90c5-ffc867791334 req-6d18f045-f7d4-4f97-84e7-8356471f82b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:12:29 compute-0 nova_compute[182935]: 2026-01-22 00:12:29.427 182939 DEBUG oslo_concurrency.lockutils [req-13f8982f-e704-470d-90c5-ffc867791334 req-6d18f045-f7d4-4f97-84e7-8356471f82b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:12:29 compute-0 nova_compute[182935]: 2026-01-22 00:12:29.428 182939 DEBUG nova.network.neutron [req-13f8982f-e704-470d-90c5-ffc867791334 req-6d18f045-f7d4-4f97-84e7-8356471f82b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Refreshing network info cache for port 58acff7a-ba54-40d0-8427-a8ff28f15fd2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:12:30 compute-0 podman[231465]: 2026-01-22 00:12:30.717534267 +0000 UTC m=+0.093252407 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 22 00:12:30 compute-0 podman[231466]: 2026-01-22 00:12:30.725869402 +0000 UTC m=+0.089417778 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:12:32 compute-0 nova_compute[182935]: 2026-01-22 00:12:32.286 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:33 compute-0 nova_compute[182935]: 2026-01-22 00:12:33.217 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:35 compute-0 podman[231516]: 2026-01-22 00:12:35.669971272 +0000 UTC m=+0.048753250 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:12:36 compute-0 nova_compute[182935]: 2026-01-22 00:12:36.800 182939 DEBUG nova.network.neutron [req-13f8982f-e704-470d-90c5-ffc867791334 req-6d18f045-f7d4-4f97-84e7-8356471f82b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updated VIF entry in instance network info cache for port 58acff7a-ba54-40d0-8427-a8ff28f15fd2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:12:36 compute-0 nova_compute[182935]: 2026-01-22 00:12:36.800 182939 DEBUG nova.network.neutron [req-13f8982f-e704-470d-90c5-ffc867791334 req-6d18f045-f7d4-4f97-84e7-8356471f82b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updating instance_info_cache with network_info: [{"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:37 compute-0 nova_compute[182935]: 2026-01-22 00:12:37.288 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:38 compute-0 nova_compute[182935]: 2026-01-22 00:12:38.095 182939 DEBUG oslo_concurrency.lockutils [req-13f8982f-e704-470d-90c5-ffc867791334 req-6d18f045-f7d4-4f97-84e7-8356471f82b0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:12:38 compute-0 nova_compute[182935]: 2026-01-22 00:12:38.220 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:38 compute-0 nova_compute[182935]: 2026-01-22 00:12:38.880 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:38 compute-0 nova_compute[182935]: 2026-01-22 00:12:38.881 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:12:38 compute-0 nova_compute[182935]: 2026-01-22 00:12:38.881 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:12:39 compute-0 nova_compute[182935]: 2026-01-22 00:12:39.555 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:12:39 compute-0 nova_compute[182935]: 2026-01-22 00:12:39.556 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:12:39 compute-0 nova_compute[182935]: 2026-01-22 00:12:39.556 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:12:39 compute-0 nova_compute[182935]: 2026-01-22 00:12:39.557 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3f8ad853-99b9-4999-bafc-9d63e7f9726f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:12:42 compute-0 nova_compute[182935]: 2026-01-22 00:12:42.289 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:42 compute-0 podman[231540]: 2026-01-22 00:12:42.680726275 +0000 UTC m=+0.052484787 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 00:12:42 compute-0 nova_compute[182935]: 2026-01-22 00:12:42.751 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.222 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.534 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updating instance_info_cache with network_info: [{"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.556 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.556 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.557 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.557 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.558 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.558 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.559 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.591 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.591 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.592 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.592 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.685 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.745 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.746 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.811 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.978 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.980 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5544MB free_disk=73.09841918945312GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.980 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:12:43 compute-0 nova_compute[182935]: 2026-01-22 00:12:43.981 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.075 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 3f8ad853-99b9-4999-bafc-9d63e7f9726f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.075 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.075 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.122 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.141 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.180 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.181 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.418 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.419 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.655 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:44 compute-0 nova_compute[182935]: 2026-01-22 00:12:44.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:12:47 compute-0 nova_compute[182935]: 2026-01-22 00:12:47.291 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:47 compute-0 podman[231568]: 2026-01-22 00:12:47.703234335 +0000 UTC m=+0.066534325 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 00:12:47 compute-0 podman[231569]: 2026-01-22 00:12:47.709930601 +0000 UTC m=+0.070054596 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:12:48 compute-0 nova_compute[182935]: 2026-01-22 00:12:48.267 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:49 compute-0 nova_compute[182935]: 2026-01-22 00:12:49.204 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:50.936 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:12:50 compute-0 nova_compute[182935]: 2026-01-22 00:12:50.936 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:12:50.937 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:12:52 compute-0 nova_compute[182935]: 2026-01-22 00:12:52.293 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:53 compute-0 nova_compute[182935]: 2026-01-22 00:12:53.270 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:57 compute-0 nova_compute[182935]: 2026-01-22 00:12:57.295 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:58 compute-0 nova_compute[182935]: 2026-01-22 00:12:58.273 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:58 compute-0 nova_compute[182935]: 2026-01-22 00:12:58.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:58 compute-0 nova_compute[182935]: 2026-01-22 00:12:58.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:12:58 compute-0 nova_compute[182935]: 2026-01-22 00:12:58.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:12:58 compute-0 nova_compute[182935]: 2026-01-22 00:12:58.919 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:12:59 compute-0 ovn_controller[95047]: 2026-01-22T00:12:59Z|00476|binding|INFO|Releasing lport f63f34ac-9af7-4a13-911f-2c9f043a5c66 from this chassis (sb_readonly=0)
Jan 22 00:12:59 compute-0 nova_compute[182935]: 2026-01-22 00:12:59.138 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:12:59 compute-0 ovn_controller[95047]: 2026-01-22T00:12:59Z|00477|binding|INFO|Releasing lport f63f34ac-9af7-4a13-911f-2c9f043a5c66 from this chassis (sb_readonly=0)
Jan 22 00:12:59 compute-0 nova_compute[182935]: 2026-01-22 00:12:59.292 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:00.939 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:13:01 compute-0 podman[231609]: 2026-01-22 00:13:01.688737201 +0000 UTC m=+0.054873573 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:13:01 compute-0 podman[231608]: 2026-01-22 00:13:01.723531732 +0000 UTC m=+0.089821048 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:13:02 compute-0 nova_compute[182935]: 2026-01-22 00:13:02.297 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:03.209 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:03.210 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:03.211 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:03 compute-0 nova_compute[182935]: 2026-01-22 00:13:03.275 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:04 compute-0 sshd-session[231654]: Invalid user docker from 188.166.69.60 port 59944
Jan 22 00:13:04 compute-0 sshd-session[231654]: Connection closed by invalid user docker 188.166.69.60 port 59944 [preauth]
Jan 22 00:13:06 compute-0 podman[231656]: 2026-01-22 00:13:06.668694927 +0000 UTC m=+0.045904272 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:13:07 compute-0 nova_compute[182935]: 2026-01-22 00:13:07.299 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:08 compute-0 nova_compute[182935]: 2026-01-22 00:13:08.326 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:12 compute-0 nova_compute[182935]: 2026-01-22 00:13:12.301 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:13 compute-0 nova_compute[182935]: 2026-01-22 00:13:13.328 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:13 compute-0 podman[231680]: 2026-01-22 00:13:13.698901105 +0000 UTC m=+0.059597152 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:13:17 compute-0 nova_compute[182935]: 2026-01-22 00:13:17.303 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:18 compute-0 nova_compute[182935]: 2026-01-22 00:13:18.340 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:18 compute-0 podman[231702]: 2026-01-22 00:13:18.687663935 +0000 UTC m=+0.060477752 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 00:13:18 compute-0 podman[231701]: 2026-01-22 00:13:18.705001871 +0000 UTC m=+0.081171456 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 00:13:22 compute-0 nova_compute[182935]: 2026-01-22 00:13:22.306 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:23 compute-0 nova_compute[182935]: 2026-01-22 00:13:23.342 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:25 compute-0 nova_compute[182935]: 2026-01-22 00:13:25.446 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:25 compute-0 NetworkManager[55139]: <info>  [1769040805.4472] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Jan 22 00:13:25 compute-0 NetworkManager[55139]: <info>  [1769040805.4484] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Jan 22 00:13:25 compute-0 nova_compute[182935]: 2026-01-22 00:13:25.510 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:25 compute-0 ovn_controller[95047]: 2026-01-22T00:13:25Z|00478|binding|INFO|Releasing lport f63f34ac-9af7-4a13-911f-2c9f043a5c66 from this chassis (sb_readonly=0)
Jan 22 00:13:25 compute-0 nova_compute[182935]: 2026-01-22 00:13:25.522 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:27 compute-0 nova_compute[182935]: 2026-01-22 00:13:27.307 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:27 compute-0 nova_compute[182935]: 2026-01-22 00:13:27.926 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:28 compute-0 nova_compute[182935]: 2026-01-22 00:13:28.345 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:31 compute-0 nova_compute[182935]: 2026-01-22 00:13:31.613 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:31.615 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:13:31 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:31.616 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:13:32 compute-0 nova_compute[182935]: 2026-01-22 00:13:32.309 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:32 compute-0 podman[231754]: 2026-01-22 00:13:32.684628549 +0000 UTC m=+0.050875529 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:13:32 compute-0 podman[231753]: 2026-01-22 00:13:32.718460489 +0000 UTC m=+0.084696869 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 00:13:33 compute-0 nova_compute[182935]: 2026-01-22 00:13:33.347 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:35 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:35.618 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:13:36 compute-0 nova_compute[182935]: 2026-01-22 00:13:36.514 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:37 compute-0 nova_compute[182935]: 2026-01-22 00:13:37.312 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:37 compute-0 podman[231804]: 2026-01-22 00:13:37.713656181 +0000 UTC m=+0.088509468 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:13:38 compute-0 nova_compute[182935]: 2026-01-22 00:13:38.350 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:39 compute-0 nova_compute[182935]: 2026-01-22 00:13:39.919 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:39 compute-0 nova_compute[182935]: 2026-01-22 00:13:39.920 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:13:39 compute-0 nova_compute[182935]: 2026-01-22 00:13:39.920 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:13:40 compute-0 nova_compute[182935]: 2026-01-22 00:13:40.675 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:13:40 compute-0 nova_compute[182935]: 2026-01-22 00:13:40.676 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:13:40 compute-0 nova_compute[182935]: 2026-01-22 00:13:40.676 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:13:40 compute-0 nova_compute[182935]: 2026-01-22 00:13:40.676 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3f8ad853-99b9-4999-bafc-9d63e7f9726f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.198 182939 DEBUG oslo_concurrency.lockutils [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.198 182939 DEBUG oslo_concurrency.lockutils [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.202 182939 DEBUG oslo_concurrency.lockutils [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.202 182939 DEBUG oslo_concurrency.lockutils [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.202 182939 DEBUG oslo_concurrency.lockutils [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.219 182939 INFO nova.compute.manager [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Terminating instance
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.234 182939 DEBUG nova.compute.manager [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:13:41 compute-0 kernel: tap58acff7a-ba (unregistering): left promiscuous mode
Jan 22 00:13:41 compute-0 NetworkManager[55139]: <info>  [1769040821.2654] device (tap58acff7a-ba): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:13:41 compute-0 ovn_controller[95047]: 2026-01-22T00:13:41Z|00479|binding|INFO|Releasing lport 58acff7a-ba54-40d0-8427-a8ff28f15fd2 from this chassis (sb_readonly=0)
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.275 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:41 compute-0 ovn_controller[95047]: 2026-01-22T00:13:41Z|00480|binding|INFO|Setting lport 58acff7a-ba54-40d0-8427-a8ff28f15fd2 down in Southbound
Jan 22 00:13:41 compute-0 ovn_controller[95047]: 2026-01-22T00:13:41Z|00481|binding|INFO|Removing iface tap58acff7a-ba ovn-installed in OVS
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.288 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.291 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.311 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:80:7a 10.100.0.6'], port_security=['fa:16:3e:fb:80:7a 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '3f8ad853-99b9-4999-bafc-9d63e7f9726f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c2d6e47-df04-4d44-b308-f39e21535b4b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d452ef76-084d-4578-ab80-dfb49c9c8f9b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=58acff7a-ba54-40d0-8427-a8ff28f15fd2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.312 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 58acff7a-ba54-40d0-8427-a8ff28f15fd2 in datapath 88a7330a-aaa1-424a-b4dc-f7500e450abb unbound from our chassis
Jan 22 00:13:41 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000075.scope: Deactivated successfully.
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.313 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88a7330a-aaa1-424a-b4dc-f7500e450abb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:13:41 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000075.scope: Consumed 16.799s CPU time.
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.315 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fb1cf163-a78d-4659-84ff-e28b4212c732]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.316 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb namespace which is not needed anymore
Jan 22 00:13:41 compute-0 systemd-machined[154182]: Machine qemu-61-instance-00000075 terminated.
Jan 22 00:13:41 compute-0 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[231350]: [NOTICE]   (231373) : haproxy version is 2.8.14-c23fe91
Jan 22 00:13:41 compute-0 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[231350]: [NOTICE]   (231373) : path to executable is /usr/sbin/haproxy
Jan 22 00:13:41 compute-0 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[231350]: [WARNING]  (231373) : Exiting Master process...
Jan 22 00:13:41 compute-0 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[231350]: [ALERT]    (231373) : Current worker (231375) exited with code 143 (Terminated)
Jan 22 00:13:41 compute-0 neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb[231350]: [WARNING]  (231373) : All workers exited. Exiting... (0)
Jan 22 00:13:41 compute-0 systemd[1]: libpod-ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2.scope: Deactivated successfully.
Jan 22 00:13:41 compute-0 podman[231853]: 2026-01-22 00:13:41.448798631 +0000 UTC m=+0.044566301 container died ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 22 00:13:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2-userdata-shm.mount: Deactivated successfully.
Jan 22 00:13:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-c829cd4c5da113e24294ca329231c67e0864c7abb12b9e2a64d7dc86f3756574-merged.mount: Deactivated successfully.
Jan 22 00:13:41 compute-0 podman[231853]: 2026-01-22 00:13:41.493294681 +0000 UTC m=+0.089062371 container cleanup ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:13:41 compute-0 systemd[1]: libpod-conmon-ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2.scope: Deactivated successfully.
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.503 182939 INFO nova.virt.libvirt.driver [-] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Instance destroyed successfully.
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.504 182939 DEBUG nova.objects.instance [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 3f8ad853-99b9-4999-bafc-9d63e7f9726f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.524 182939 DEBUG nova.virt.libvirt.vif [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:11:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-648879642',display_name='tempest-TestNetworkBasicOps-server-648879642',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-648879642',id=117,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfmPsjlhriOW7dRWovArAk9oG0vlTV4UF00kpkAcU8b+GiH2HuvDQwwI391+p3h3ZkbsOGbPDPBtDG3YybgfwiG7zKi8EOlbFfoqTDO+fE0iPWic901YZUPREuM/hSbHA==',key_name='tempest-TestNetworkBasicOps-1936956188',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:12:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-jzui0m77',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:12:06Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=3f8ad853-99b9-4999-bafc-9d63e7f9726f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.525 182939 DEBUG nova.network.os_vif_util [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.526 182939 DEBUG nova.network.os_vif_util [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fb:80:7a,bridge_name='br-int',has_traffic_filtering=True,id=58acff7a-ba54-40d0-8427-a8ff28f15fd2,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58acff7a-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.526 182939 DEBUG os_vif [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fb:80:7a,bridge_name='br-int',has_traffic_filtering=True,id=58acff7a-ba54-40d0-8427-a8ff28f15fd2,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58acff7a-ba') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.528 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.529 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap58acff7a-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.530 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.533 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.539 182939 INFO os_vif [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fb:80:7a,bridge_name='br-int',has_traffic_filtering=True,id=58acff7a-ba54-40d0-8427-a8ff28f15fd2,network=Network(88a7330a-aaa1-424a-b4dc-f7500e450abb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap58acff7a-ba')
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.540 182939 INFO nova.virt.libvirt.driver [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Deleting instance files /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f_del
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.541 182939 INFO nova.virt.libvirt.driver [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Deletion of /var/lib/nova/instances/3f8ad853-99b9-4999-bafc-9d63e7f9726f_del complete
Jan 22 00:13:41 compute-0 podman[231897]: 2026-01-22 00:13:41.584548691 +0000 UTC m=+0.068997083 container remove ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.590 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[58674b9c-c78e-46b2-81c4-add3e47d92af]: (4, ('Thu Jan 22 12:13:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb (ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2)\nec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2\nThu Jan 22 12:13:41 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb (ec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2)\nec2f9fcba1d5738dd0319100a8b426c898e490689a181cecf49f6b790a0d64b2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.592 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce093a4-d574-44e2-b408-87a9b572de6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.593 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88a7330a-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.595 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:41 compute-0 kernel: tap88a7330a-a0: left promiscuous mode
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.606 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.609 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0dba25fb-8178-4568-8d3f-a258ff91c546]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.625 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dfe695ce-4945-4808-881a-d86e20a683d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.626 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[20d61003-e06b-42d1-918c-19355305f04d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.644 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb57fec-e91e-4449-ba35-32a9e991855d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 522720, 'reachable_time': 16271, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231912, 'error': None, 'target': 'ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.647 182939 INFO nova.compute.manager [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.648 182939 DEBUG oslo.service.loopingcall [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.648 182939 DEBUG nova.compute.manager [-] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.649 182939 DEBUG nova.network.neutron [-] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:13:41 compute-0 systemd[1]: run-netns-ovnmeta\x2d88a7330a\x2daaa1\x2d424a\x2db4dc\x2df7500e450abb.mount: Deactivated successfully.
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.649 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88a7330a-aaa1-424a-b4dc-f7500e450abb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:13:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:13:41.650 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[64554574-8114-47f4-adc8-554cf4747afb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.801 182939 DEBUG nova.compute.manager [req-41ef496d-c1d8-46d4-bff5-96fe6976e9e5 req-77762025-8015-402b-910e-9d6fa0c61706 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Received event network-vif-unplugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.802 182939 DEBUG oslo_concurrency.lockutils [req-41ef496d-c1d8-46d4-bff5-96fe6976e9e5 req-77762025-8015-402b-910e-9d6fa0c61706 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.802 182939 DEBUG oslo_concurrency.lockutils [req-41ef496d-c1d8-46d4-bff5-96fe6976e9e5 req-77762025-8015-402b-910e-9d6fa0c61706 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.802 182939 DEBUG oslo_concurrency.lockutils [req-41ef496d-c1d8-46d4-bff5-96fe6976e9e5 req-77762025-8015-402b-910e-9d6fa0c61706 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.803 182939 DEBUG nova.compute.manager [req-41ef496d-c1d8-46d4-bff5-96fe6976e9e5 req-77762025-8015-402b-910e-9d6fa0c61706 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] No waiting events found dispatching network-vif-unplugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:13:41 compute-0 nova_compute[182935]: 2026-01-22 00:13:41.804 182939 DEBUG nova.compute.manager [req-41ef496d-c1d8-46d4-bff5-96fe6976e9e5 req-77762025-8015-402b-910e-9d6fa0c61706 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Received event network-vif-unplugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:13:42 compute-0 nova_compute[182935]: 2026-01-22 00:13:42.314 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:42 compute-0 nova_compute[182935]: 2026-01-22 00:13:42.871 182939 DEBUG nova.network.neutron [-] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:13:42 compute-0 nova_compute[182935]: 2026-01-22 00:13:42.907 182939 INFO nova.compute.manager [-] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Took 1.26 seconds to deallocate network for instance.
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.025 182939 DEBUG nova.compute.manager [req-ae4130cd-bea5-4ab3-867e-e5aa02ce8633 req-aca48edc-5f8f-45ff-873d-2f15a4a78cd2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Received event network-vif-deleted-58acff7a-ba54-40d0-8427-a8ff28f15fd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.027 182939 DEBUG oslo_concurrency.lockutils [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.027 182939 DEBUG oslo_concurrency.lockutils [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.097 182939 DEBUG nova.compute.provider_tree [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.134 182939 DEBUG nova.scheduler.client.report [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.180 182939 DEBUG oslo_concurrency.lockutils [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.231 182939 INFO nova.scheduler.client.report [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 3f8ad853-99b9-4999-bafc-9d63e7f9726f
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.371 182939 DEBUG oslo_concurrency.lockutils [None req-9655c95b-5993-450a-9c2e-22c8fd356ca2 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.674 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updating instance_info_cache with network_info: [{"id": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "address": "fa:16:3e:fb:80:7a", "network": {"id": "88a7330a-aaa1-424a-b4dc-f7500e450abb", "bridge": "br-int", "label": "tempest-network-smoke--616986641", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap58acff7a-ba", "ovs_interfaceid": "58acff7a-ba54-40d0-8427-a8ff28f15fd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.696 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-3f8ad853-99b9-4999-bafc-9d63e7f9726f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.696 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.696 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.696 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.697 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.697 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.715 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.716 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.716 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.716 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:13:43 compute-0 podman[231915]: 2026-01-22 00:13:43.820852157 +0000 UTC m=+0.058205581 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.895 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.896 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5735MB free_disk=73.12788391113281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.897 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.897 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.971 182939 DEBUG nova.compute.manager [req-688c42b8-f968-4292-b090-24f58a96d91e req-289fb53b-d1b8-4bc1-b9ff-8c53d7db7530 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Received event network-vif-plugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.971 182939 DEBUG oslo_concurrency.lockutils [req-688c42b8-f968-4292-b090-24f58a96d91e req-289fb53b-d1b8-4bc1-b9ff-8c53d7db7530 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.972 182939 DEBUG oslo_concurrency.lockutils [req-688c42b8-f968-4292-b090-24f58a96d91e req-289fb53b-d1b8-4bc1-b9ff-8c53d7db7530 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.972 182939 DEBUG oslo_concurrency.lockutils [req-688c42b8-f968-4292-b090-24f58a96d91e req-289fb53b-d1b8-4bc1-b9ff-8c53d7db7530 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3f8ad853-99b9-4999-bafc-9d63e7f9726f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.972 182939 DEBUG nova.compute.manager [req-688c42b8-f968-4292-b090-24f58a96d91e req-289fb53b-d1b8-4bc1-b9ff-8c53d7db7530 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] No waiting events found dispatching network-vif-plugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.972 182939 WARNING nova.compute.manager [req-688c42b8-f968-4292-b090-24f58a96d91e req-289fb53b-d1b8-4bc1-b9ff-8c53d7db7530 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Received unexpected event network-vif-plugged-58acff7a-ba54-40d0-8427-a8ff28f15fd2 for instance with vm_state deleted and task_state None.
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.974 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.974 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:13:43 compute-0 nova_compute[182935]: 2026-01-22 00:13:43.995 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:13:44 compute-0 nova_compute[182935]: 2026-01-22 00:13:44.015 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:13:44 compute-0 nova_compute[182935]: 2026-01-22 00:13:44.042 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:44 compute-0 nova_compute[182935]: 2026-01-22 00:13:44.054 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:13:44 compute-0 nova_compute[182935]: 2026-01-22 00:13:44.055 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:13:44 compute-0 nova_compute[182935]: 2026-01-22 00:13:44.151 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:44 compute-0 nova_compute[182935]: 2026-01-22 00:13:44.211 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:44 compute-0 nova_compute[182935]: 2026-01-22 00:13:44.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:45 compute-0 nova_compute[182935]: 2026-01-22 00:13:45.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:46 compute-0 sshd-session[231936]: Invalid user docker from 188.166.69.60 port 50204
Jan 22 00:13:46 compute-0 sshd-session[231936]: Connection closed by invalid user docker 188.166.69.60 port 50204 [preauth]
Jan 22 00:13:46 compute-0 nova_compute[182935]: 2026-01-22 00:13:46.531 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:47 compute-0 nova_compute[182935]: 2026-01-22 00:13:47.316 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:49 compute-0 podman[231938]: 2026-01-22 00:13:49.681916405 +0000 UTC m=+0.055649170 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6)
Jan 22 00:13:49 compute-0 podman[231939]: 2026-01-22 00:13:49.682005497 +0000 UTC m=+0.053441568 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:13:50 compute-0 nova_compute[182935]: 2026-01-22 00:13:50.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:51 compute-0 nova_compute[182935]: 2026-01-22 00:13:51.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:52 compute-0 nova_compute[182935]: 2026-01-22 00:13:52.319 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:53 compute-0 nova_compute[182935]: 2026-01-22 00:13:53.791 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:13:56 compute-0 nova_compute[182935]: 2026-01-22 00:13:56.502 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040821.5010028, 3f8ad853-99b9-4999-bafc-9d63e7f9726f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:13:56 compute-0 nova_compute[182935]: 2026-01-22 00:13:56.503 182939 INFO nova.compute.manager [-] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] VM Stopped (Lifecycle Event)
Jan 22 00:13:56 compute-0 nova_compute[182935]: 2026-01-22 00:13:56.539 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:56 compute-0 nova_compute[182935]: 2026-01-22 00:13:56.771 182939 DEBUG nova.compute.manager [None req-2953c9b1-79e1-405d-90a2-83d07473a871 - - - - - -] [instance: 3f8ad853-99b9-4999-bafc-9d63e7f9726f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:13:57 compute-0 nova_compute[182935]: 2026-01-22 00:13:57.322 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:13:59 compute-0 nova_compute[182935]: 2026-01-22 00:13:59.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:01 compute-0 nova_compute[182935]: 2026-01-22 00:14:01.555 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:02 compute-0 nova_compute[182935]: 2026-01-22 00:14:02.323 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:03.211 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:03.211 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:03.212 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:03 compute-0 podman[231977]: 2026-01-22 00:14:03.725888564 +0000 UTC m=+0.102318990 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:14:03 compute-0 podman[231976]: 2026-01-22 00:14:03.755073636 +0000 UTC m=+0.131568764 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 22 00:14:06 compute-0 nova_compute[182935]: 2026-01-22 00:14:06.557 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:07 compute-0 nova_compute[182935]: 2026-01-22 00:14:07.325 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:08 compute-0 podman[232025]: 2026-01-22 00:14:08.677184203 +0000 UTC m=+0.051126946 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:14:11 compute-0 nova_compute[182935]: 2026-01-22 00:14:11.559 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:12 compute-0 nova_compute[182935]: 2026-01-22 00:14:12.330 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:14 compute-0 nova_compute[182935]: 2026-01-22 00:14:14.437 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:14 compute-0 nova_compute[182935]: 2026-01-22 00:14:14.438 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:14 compute-0 nova_compute[182935]: 2026-01-22 00:14:14.473 182939 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:14:14 compute-0 podman[232051]: 2026-01-22 00:14:14.672690821 +0000 UTC m=+0.044541522 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 22 00:14:14 compute-0 nova_compute[182935]: 2026-01-22 00:14:14.844 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:14 compute-0 nova_compute[182935]: 2026-01-22 00:14:14.844 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:14 compute-0 nova_compute[182935]: 2026-01-22 00:14:14.855 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:14:14 compute-0 nova_compute[182935]: 2026-01-22 00:14:14.855 182939 INFO nova.compute.claims [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.130 182939 DEBUG nova.compute.provider_tree [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.166 182939 DEBUG nova.scheduler.client.report [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.229 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.230 182939 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.375 182939 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.375 182939 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.436 182939 INFO nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.491 182939 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.699 182939 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.701 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.702 182939 INFO nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Creating image(s)
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.703 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "/var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.704 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "/var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.705 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "/var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.732 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.793 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.795 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.795 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.807 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.872 182939 DEBUG nova.policy [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00a7d470e36045deabd5584bd3a9c73e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.878 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.879 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.915 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.916 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.917 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.987 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.988 182939 DEBUG nova.virt.disk.api [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Checking if we can resize image /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:14:15 compute-0 nova_compute[182935]: 2026-01-22 00:14:15.989 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:16 compute-0 nova_compute[182935]: 2026-01-22 00:14:16.047 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:16 compute-0 nova_compute[182935]: 2026-01-22 00:14:16.048 182939 DEBUG nova.virt.disk.api [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Cannot resize image /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:14:16 compute-0 nova_compute[182935]: 2026-01-22 00:14:16.048 182939 DEBUG nova.objects.instance [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'migration_context' on Instance uuid a3a09b65-9f43-4029-b1e4-dd463c6e4964 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:14:16 compute-0 nova_compute[182935]: 2026-01-22 00:14:16.598 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:17 compute-0 nova_compute[182935]: 2026-01-22 00:14:17.331 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:17 compute-0 nova_compute[182935]: 2026-01-22 00:14:17.903 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:14:17 compute-0 nova_compute[182935]: 2026-01-22 00:14:17.904 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Ensure instance console log exists: /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:14:17 compute-0 nova_compute[182935]: 2026-01-22 00:14:17.906 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:17 compute-0 nova_compute[182935]: 2026-01-22 00:14:17.906 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:17 compute-0 nova_compute[182935]: 2026-01-22 00:14:17.907 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:20 compute-0 podman[232084]: 2026-01-22 00:14:20.683524947 +0000 UTC m=+0.057788530 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter)
Jan 22 00:14:20 compute-0 podman[232085]: 2026-01-22 00:14:20.71064669 +0000 UTC m=+0.081013313 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 00:14:21 compute-0 nova_compute[182935]: 2026-01-22 00:14:21.601 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:21.915 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:14:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:21.916 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:14:21 compute-0 nova_compute[182935]: 2026-01-22 00:14:21.917 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:22 compute-0 nova_compute[182935]: 2026-01-22 00:14:22.165 182939 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Successfully created port: 8729a102-f8f0-4c90-9f82-e055c76fc104 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:14:22 compute-0 nova_compute[182935]: 2026-01-22 00:14:22.332 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:14:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:14:25 compute-0 nova_compute[182935]: 2026-01-22 00:14:25.365 182939 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Successfully updated port: 8729a102-f8f0-4c90-9f82-e055c76fc104 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:14:25 compute-0 nova_compute[182935]: 2026-01-22 00:14:25.618 182939 DEBUG nova.compute.manager [req-f29ba52d-4d84-4aaa-a2e9-f41ea4117e72 req-42ba3c12-d174-491b-b801-fedc347208a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Received event network-changed-8729a102-f8f0-4c90-9f82-e055c76fc104 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:14:25 compute-0 nova_compute[182935]: 2026-01-22 00:14:25.619 182939 DEBUG nova.compute.manager [req-f29ba52d-4d84-4aaa-a2e9-f41ea4117e72 req-42ba3c12-d174-491b-b801-fedc347208a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Refreshing instance network info cache due to event network-changed-8729a102-f8f0-4c90-9f82-e055c76fc104. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:14:25 compute-0 nova_compute[182935]: 2026-01-22 00:14:25.619 182939 DEBUG oslo_concurrency.lockutils [req-f29ba52d-4d84-4aaa-a2e9-f41ea4117e72 req-42ba3c12-d174-491b-b801-fedc347208a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-a3a09b65-9f43-4029-b1e4-dd463c6e4964" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:14:25 compute-0 nova_compute[182935]: 2026-01-22 00:14:25.619 182939 DEBUG oslo_concurrency.lockutils [req-f29ba52d-4d84-4aaa-a2e9-f41ea4117e72 req-42ba3c12-d174-491b-b801-fedc347208a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-a3a09b65-9f43-4029-b1e4-dd463c6e4964" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:14:25 compute-0 nova_compute[182935]: 2026-01-22 00:14:25.619 182939 DEBUG nova.network.neutron [req-f29ba52d-4d84-4aaa-a2e9-f41ea4117e72 req-42ba3c12-d174-491b-b801-fedc347208a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Refreshing network info cache for port 8729a102-f8f0-4c90-9f82-e055c76fc104 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:14:25 compute-0 nova_compute[182935]: 2026-01-22 00:14:25.622 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "refresh_cache-a3a09b65-9f43-4029-b1e4-dd463c6e4964" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:14:26 compute-0 nova_compute[182935]: 2026-01-22 00:14:26.055 182939 DEBUG nova.network.neutron [req-f29ba52d-4d84-4aaa-a2e9-f41ea4117e72 req-42ba3c12-d174-491b-b801-fedc347208a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:14:26 compute-0 nova_compute[182935]: 2026-01-22 00:14:26.651 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:27 compute-0 nova_compute[182935]: 2026-01-22 00:14:27.314 182939 DEBUG nova.network.neutron [req-f29ba52d-4d84-4aaa-a2e9-f41ea4117e72 req-42ba3c12-d174-491b-b801-fedc347208a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:14:27 compute-0 nova_compute[182935]: 2026-01-22 00:14:27.333 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:27 compute-0 nova_compute[182935]: 2026-01-22 00:14:27.500 182939 DEBUG oslo_concurrency.lockutils [req-f29ba52d-4d84-4aaa-a2e9-f41ea4117e72 req-42ba3c12-d174-491b-b801-fedc347208a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-a3a09b65-9f43-4029-b1e4-dd463c6e4964" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:14:27 compute-0 nova_compute[182935]: 2026-01-22 00:14:27.501 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquired lock "refresh_cache-a3a09b65-9f43-4029-b1e4-dd463c6e4964" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:14:27 compute-0 nova_compute[182935]: 2026-01-22 00:14:27.501 182939 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:14:27 compute-0 nova_compute[182935]: 2026-01-22 00:14:27.994 182939 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:14:28 compute-0 sshd-session[232124]: Invalid user docker from 188.166.69.60 port 55150
Jan 22 00:14:28 compute-0 sshd-session[232124]: Connection closed by invalid user docker 188.166.69.60 port 55150 [preauth]
Jan 22 00:14:30 compute-0 nova_compute[182935]: 2026-01-22 00:14:30.778 182939 DEBUG nova.network.neutron [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Updating instance_info_cache with network_info: [{"id": "8729a102-f8f0-4c90-9f82-e055c76fc104", "address": "fa:16:3e:91:e7:e8", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8729a102-f8", "ovs_interfaceid": "8729a102-f8f0-4c90-9f82-e055c76fc104", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:14:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:30.918 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.098 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Releasing lock "refresh_cache-a3a09b65-9f43-4029-b1e4-dd463c6e4964" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.099 182939 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Instance network_info: |[{"id": "8729a102-f8f0-4c90-9f82-e055c76fc104", "address": "fa:16:3e:91:e7:e8", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8729a102-f8", "ovs_interfaceid": "8729a102-f8f0-4c90-9f82-e055c76fc104", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.102 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Start _get_guest_xml network_info=[{"id": "8729a102-f8f0-4c90-9f82-e055c76fc104", "address": "fa:16:3e:91:e7:e8", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8729a102-f8", "ovs_interfaceid": "8729a102-f8f0-4c90-9f82-e055c76fc104", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.107 182939 WARNING nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.114 182939 DEBUG nova.virt.libvirt.host [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.115 182939 DEBUG nova.virt.libvirt.host [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.118 182939 DEBUG nova.virt.libvirt.host [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.118 182939 DEBUG nova.virt.libvirt.host [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.120 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.120 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.121 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.121 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.121 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.122 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.122 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.122 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.123 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.123 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.124 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.124 182939 DEBUG nova.virt.hardware [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.129 182939 DEBUG nova.virt.libvirt.vif [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:14:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1102130362',display_name='tempest-tempest.common.compute-instance-1102130362-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1102130362-1',id=123,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-u2tu2ftn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempest-Multiple
CreateTestJSON-620854064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:14:15Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=a3a09b65-9f43-4029-b1e4-dd463c6e4964,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8729a102-f8f0-4c90-9f82-e055c76fc104", "address": "fa:16:3e:91:e7:e8", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8729a102-f8", "ovs_interfaceid": "8729a102-f8f0-4c90-9f82-e055c76fc104", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.130 182939 DEBUG nova.network.os_vif_util [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "8729a102-f8f0-4c90-9f82-e055c76fc104", "address": "fa:16:3e:91:e7:e8", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8729a102-f8", "ovs_interfaceid": "8729a102-f8f0-4c90-9f82-e055c76fc104", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.131 182939 DEBUG nova.network.os_vif_util [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:e7:e8,bridge_name='br-int',has_traffic_filtering=True,id=8729a102-f8f0-4c90-9f82-e055c76fc104,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8729a102-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.132 182939 DEBUG nova.objects.instance [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'pci_devices' on Instance uuid a3a09b65-9f43-4029-b1e4-dd463c6e4964 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.320 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:14:31 compute-0 nova_compute[182935]:   <uuid>a3a09b65-9f43-4029-b1e4-dd463c6e4964</uuid>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   <name>instance-0000007b</name>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <nova:name>tempest-tempest.common.compute-instance-1102130362-1</nova:name>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:14:31</nova:creationTime>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:14:31 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:14:31 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:14:31 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:14:31 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:14:31 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:14:31 compute-0 nova_compute[182935]:         <nova:user uuid="00a7d470e36045deabd5584bd3a9c73e">tempest-MultipleCreateTestJSON-620854064-project-member</nova:user>
Jan 22 00:14:31 compute-0 nova_compute[182935]:         <nova:project uuid="f02fc2085f6340ffa895cb894fdf5882">tempest-MultipleCreateTestJSON-620854064</nova:project>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:14:31 compute-0 nova_compute[182935]:         <nova:port uuid="8729a102-f8f0-4c90-9f82-e055c76fc104">
Jan 22 00:14:31 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <system>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <entry name="serial">a3a09b65-9f43-4029-b1e4-dd463c6e4964</entry>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <entry name="uuid">a3a09b65-9f43-4029-b1e4-dd463c6e4964</entry>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     </system>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   <os>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   </os>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   <features>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   </features>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk.config"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:91:e7:e8"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <target dev="tap8729a102-f8"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/console.log" append="off"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <video>
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     </video>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:14:31 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:14:31 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:14:31 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:14:31 compute-0 nova_compute[182935]: </domain>
Jan 22 00:14:31 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.321 182939 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Preparing to wait for external event network-vif-plugged-8729a102-f8f0-4c90-9f82-e055c76fc104 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.322 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.322 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.322 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.323 182939 DEBUG nova.virt.libvirt.vif [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:14:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1102130362',display_name='tempest-tempest.common.compute-instance-1102130362-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1102130362-1',id=123,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-u2tu2ftn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempes
t-MultipleCreateTestJSON-620854064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:14:15Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=a3a09b65-9f43-4029-b1e4-dd463c6e4964,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8729a102-f8f0-4c90-9f82-e055c76fc104", "address": "fa:16:3e:91:e7:e8", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8729a102-f8", "ovs_interfaceid": "8729a102-f8f0-4c90-9f82-e055c76fc104", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.323 182939 DEBUG nova.network.os_vif_util [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "8729a102-f8f0-4c90-9f82-e055c76fc104", "address": "fa:16:3e:91:e7:e8", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8729a102-f8", "ovs_interfaceid": "8729a102-f8f0-4c90-9f82-e055c76fc104", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.323 182939 DEBUG nova.network.os_vif_util [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:e7:e8,bridge_name='br-int',has_traffic_filtering=True,id=8729a102-f8f0-4c90-9f82-e055c76fc104,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8729a102-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.324 182939 DEBUG os_vif [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:e7:e8,bridge_name='br-int',has_traffic_filtering=True,id=8729a102-f8f0-4c90-9f82-e055c76fc104,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8729a102-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.324 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.325 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.325 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.328 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.329 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8729a102-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.329 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8729a102-f8, col_values=(('external_ids', {'iface-id': '8729a102-f8f0-4c90-9f82-e055c76fc104', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:e7:e8', 'vm-uuid': 'a3a09b65-9f43-4029-b1e4-dd463c6e4964'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.331 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:31 compute-0 NetworkManager[55139]: <info>  [1769040871.3322] manager: (tap8729a102-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.332 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.337 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.338 182939 INFO os_vif [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:e7:e8,bridge_name='br-int',has_traffic_filtering=True,id=8729a102-f8f0-4c90-9f82-e055c76fc104,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8729a102-f8')
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.559 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.559 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.559 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No VIF found with MAC fa:16:3e:91:e7:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:14:31 compute-0 nova_compute[182935]: 2026-01-22 00:14:31.560 182939 INFO nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Using config drive
Jan 22 00:14:32 compute-0 nova_compute[182935]: 2026-01-22 00:14:32.350 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:32 compute-0 nova_compute[182935]: 2026-01-22 00:14:32.662 182939 INFO nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Creating config drive at /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk.config
Jan 22 00:14:32 compute-0 nova_compute[182935]: 2026-01-22 00:14:32.667 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7nvgd1w5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:14:32 compute-0 nova_compute[182935]: 2026-01-22 00:14:32.801 182939 DEBUG oslo_concurrency.processutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7nvgd1w5" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:14:32 compute-0 kernel: tap8729a102-f8: entered promiscuous mode
Jan 22 00:14:32 compute-0 NetworkManager[55139]: <info>  [1769040872.8659] manager: (tap8729a102-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/228)
Jan 22 00:14:32 compute-0 ovn_controller[95047]: 2026-01-22T00:14:32Z|00482|binding|INFO|Claiming lport 8729a102-f8f0-4c90-9f82-e055c76fc104 for this chassis.
Jan 22 00:14:32 compute-0 ovn_controller[95047]: 2026-01-22T00:14:32Z|00483|binding|INFO|8729a102-f8f0-4c90-9f82-e055c76fc104: Claiming fa:16:3e:91:e7:e8 10.100.0.12
Jan 22 00:14:32 compute-0 nova_compute[182935]: 2026-01-22 00:14:32.868 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:32 compute-0 nova_compute[182935]: 2026-01-22 00:14:32.871 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:32 compute-0 nova_compute[182935]: 2026-01-22 00:14:32.875 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.886 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:e7:e8 10.100.0.12'], port_security=['fa:16:3e:91:e7:e8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a3a09b65-9f43-4029-b1e4-dd463c6e4964', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c19848fe-a435-4c66-8190-94e8e9e1b266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'neutron:revision_number': '2', 'neutron:security_group_ids': '01430d09-4466-4c63-8f42-d6bde77fcc79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=639fe658-8c59-48e8-bb7b-52cdb7487f54, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=8729a102-f8f0-4c90-9f82-e055c76fc104) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.887 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 8729a102-f8f0-4c90-9f82-e055c76fc104 in datapath c19848fe-a435-4c66-8190-94e8e9e1b266 bound to our chassis
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.888 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c19848fe-a435-4c66-8190-94e8e9e1b266
Jan 22 00:14:32 compute-0 systemd-udevd[232145]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:14:32 compute-0 systemd-machined[154182]: New machine qemu-62-instance-0000007b.
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.900 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b58026b1-e6b2-438e-a964-4822d963288d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.901 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc19848fe-a1 in ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.904 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc19848fe-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.904 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ee552120-bd5a-4d80-bacb-7c297ce463bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.905 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[677e00de-ef89-4e79-b6c8-d1c21887c255]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:32 compute-0 NetworkManager[55139]: <info>  [1769040872.9076] device (tap8729a102-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:14:32 compute-0 NetworkManager[55139]: <info>  [1769040872.9082] device (tap8729a102-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.916 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[d774cac5-7e36-4b8a-8e22-d12e9f9dcd00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:32 compute-0 nova_compute[182935]: 2026-01-22 00:14:32.926 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:32 compute-0 ovn_controller[95047]: 2026-01-22T00:14:32Z|00484|binding|INFO|Setting lport 8729a102-f8f0-4c90-9f82-e055c76fc104 ovn-installed in OVS
Jan 22 00:14:32 compute-0 ovn_controller[95047]: 2026-01-22T00:14:32Z|00485|binding|INFO|Setting lport 8729a102-f8f0-4c90-9f82-e055c76fc104 up in Southbound
Jan 22 00:14:32 compute-0 nova_compute[182935]: 2026-01-22 00:14:32.931 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:32 compute-0 systemd[1]: Started Virtual Machine qemu-62-instance-0000007b.
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.941 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8524ed3a-6eea-44cb-b7cc-56cfdab8e2e9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.972 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[ac1c94dd-fda1-437c-a3e7-cc3087e09a6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:32 compute-0 NetworkManager[55139]: <info>  [1769040872.9810] manager: (tapc19848fe-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/229)
Jan 22 00:14:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:32.979 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[efc60f96-329a-406b-b18e-52dd122c74d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.012 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ff4899-46dc-436b-96e2-c2cc860f4341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.016 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c67171cd-1ed6-4131-a601-60363eeee247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:33 compute-0 NetworkManager[55139]: <info>  [1769040873.0360] device (tapc19848fe-a0): carrier: link connected
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.040 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[183f4b0a-a438-43e9-a788-c0f461c1495e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.060 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e4f77713-479e-4832-afbd-23849faf9682]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc19848fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5c:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537577, 'reachable_time': 37959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232179, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.074 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e5f0218d-c9a1-4d7e-8ed8-c0b94c6522ae]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:5cb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537577, 'tstamp': 537577}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232180, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.090 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0c7722-d393-404f-a2eb-b2ea7c088337]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc19848fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5c:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537577, 'reachable_time': 37959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232181, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.116 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac5ffe7-f1e6-4b70-9567-de751588cda5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.179 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[87aa5723-28d8-4041-9a21-2270161efbb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.180 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc19848fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.181 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.181 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc19848fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.183 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:33 compute-0 NetworkManager[55139]: <info>  [1769040873.1843] manager: (tapc19848fe-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Jan 22 00:14:33 compute-0 kernel: tapc19848fe-a0: entered promiscuous mode
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.185 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.189 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc19848fe-a0, col_values=(('external_ids', {'iface-id': 'ba768391-9e0e-4cf0-83c5-526ca3a05a58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.190 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:33 compute-0 ovn_controller[95047]: 2026-01-22T00:14:33Z|00486|binding|INFO|Releasing lport ba768391-9e0e-4cf0-83c5-526ca3a05a58 from this chassis (sb_readonly=0)
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.193 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.194 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[073bf792-3340-4f15-82ac-e5312674c487]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.195 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-c19848fe-a435-4c66-8190-94e8e9e1b266
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID c19848fe-a435-4c66-8190-94e8e9e1b266
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:14:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:33.195 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'env', 'PROCESS_TAG=haproxy-c19848fe-a435-4c66-8190-94e8e9e1b266', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c19848fe-a435-4c66-8190-94e8e9e1b266.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.201 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.240 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040873.239919, a3a09b65-9f43-4029-b1e4-dd463c6e4964 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.241 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] VM Started (Lifecycle Event)
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.277 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.282 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040873.2410817, a3a09b65-9f43-4029-b1e4-dd463c6e4964 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.282 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] VM Paused (Lifecycle Event)
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.373 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.376 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:14:33 compute-0 nova_compute[182935]: 2026-01-22 00:14:33.437 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:14:33 compute-0 podman[232220]: 2026-01-22 00:14:33.539155843 +0000 UTC m=+0.050035589 container create 3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:14:33 compute-0 systemd[1]: Started libpod-conmon-3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb.scope.
Jan 22 00:14:33 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:14:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0198c32a3088ea4929db247082cac350e6d64fdd5a6ae4a7a3734989aa30dea7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:14:33 compute-0 podman[232220]: 2026-01-22 00:14:33.510981405 +0000 UTC m=+0.021861171 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:14:33 compute-0 podman[232220]: 2026-01-22 00:14:33.893083847 +0000 UTC m=+0.403963633 container init 3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 00:14:33 compute-0 podman[232220]: 2026-01-22 00:14:33.902505737 +0000 UTC m=+0.413385503 container start 3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:14:33 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232237]: [NOTICE]   (232241) : New worker (232243) forked
Jan 22 00:14:33 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232237]: [NOTICE]   (232241) : Loading success.
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.021 182939 DEBUG nova.compute.manager [req-edb8e8a9-04a7-49f0-86a2-9d5f21c4b4b3 req-890baccf-b82a-434d-8ac6-ea8628c0246f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Received event network-vif-plugged-8729a102-f8f0-4c90-9f82-e055c76fc104 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.022 182939 DEBUG oslo_concurrency.lockutils [req-edb8e8a9-04a7-49f0-86a2-9d5f21c4b4b3 req-890baccf-b82a-434d-8ac6-ea8628c0246f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.023 182939 DEBUG oslo_concurrency.lockutils [req-edb8e8a9-04a7-49f0-86a2-9d5f21c4b4b3 req-890baccf-b82a-434d-8ac6-ea8628c0246f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.023 182939 DEBUG oslo_concurrency.lockutils [req-edb8e8a9-04a7-49f0-86a2-9d5f21c4b4b3 req-890baccf-b82a-434d-8ac6-ea8628c0246f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.023 182939 DEBUG nova.compute.manager [req-edb8e8a9-04a7-49f0-86a2-9d5f21c4b4b3 req-890baccf-b82a-434d-8ac6-ea8628c0246f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Processing event network-vif-plugged-8729a102-f8f0-4c90-9f82-e055c76fc104 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.024 182939 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.028 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040874.0282736, a3a09b65-9f43-4029-b1e4-dd463c6e4964 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.028 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] VM Resumed (Lifecycle Event)
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.031 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.037 182939 INFO nova.virt.libvirt.driver [-] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Instance spawned successfully.
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.038 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.059 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.066 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.070 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.070 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.071 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.071 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.072 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.072 182939 DEBUG nova.virt.libvirt.driver [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.115 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.187 182939 INFO nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Took 18.49 seconds to spawn the instance on the hypervisor.
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.187 182939 DEBUG nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.358 182939 INFO nova.compute.manager [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Took 19.70 seconds to build instance.
Jan 22 00:14:34 compute-0 nova_compute[182935]: 2026-01-22 00:14:34.403 182939 DEBUG oslo_concurrency.lockutils [None req-32069043-92dc-4888-8b4d-8dfd7a4b0e89 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.965s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:34 compute-0 podman[232253]: 2026-01-22 00:14:34.693732301 +0000 UTC m=+0.060272758 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:14:34 compute-0 podman[232252]: 2026-01-22 00:14:34.724790606 +0000 UTC m=+0.096596597 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 00:14:36 compute-0 nova_compute[182935]: 2026-01-22 00:14:36.330 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:36 compute-0 nova_compute[182935]: 2026-01-22 00:14:36.472 182939 DEBUG nova.compute.manager [req-9c5643ae-a24c-4043-aa69-047a5fb0c874 req-5360f3df-7c63-48da-ba0d-19705db29ed9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Received event network-vif-plugged-8729a102-f8f0-4c90-9f82-e055c76fc104 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:14:36 compute-0 nova_compute[182935]: 2026-01-22 00:14:36.472 182939 DEBUG oslo_concurrency.lockutils [req-9c5643ae-a24c-4043-aa69-047a5fb0c874 req-5360f3df-7c63-48da-ba0d-19705db29ed9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:36 compute-0 nova_compute[182935]: 2026-01-22 00:14:36.473 182939 DEBUG oslo_concurrency.lockutils [req-9c5643ae-a24c-4043-aa69-047a5fb0c874 req-5360f3df-7c63-48da-ba0d-19705db29ed9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:36 compute-0 nova_compute[182935]: 2026-01-22 00:14:36.473 182939 DEBUG oslo_concurrency.lockutils [req-9c5643ae-a24c-4043-aa69-047a5fb0c874 req-5360f3df-7c63-48da-ba0d-19705db29ed9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:36 compute-0 nova_compute[182935]: 2026-01-22 00:14:36.474 182939 DEBUG nova.compute.manager [req-9c5643ae-a24c-4043-aa69-047a5fb0c874 req-5360f3df-7c63-48da-ba0d-19705db29ed9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] No waiting events found dispatching network-vif-plugged-8729a102-f8f0-4c90-9f82-e055c76fc104 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:14:36 compute-0 nova_compute[182935]: 2026-01-22 00:14:36.474 182939 WARNING nova.compute.manager [req-9c5643ae-a24c-4043-aa69-047a5fb0c874 req-5360f3df-7c63-48da-ba0d-19705db29ed9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Received unexpected event network-vif-plugged-8729a102-f8f0-4c90-9f82-e055c76fc104 for instance with vm_state active and task_state None.
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.352 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.545 182939 DEBUG oslo_concurrency.lockutils [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.546 182939 DEBUG oslo_concurrency.lockutils [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.546 182939 DEBUG oslo_concurrency.lockutils [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.547 182939 DEBUG oslo_concurrency.lockutils [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.547 182939 DEBUG oslo_concurrency.lockutils [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.561 182939 INFO nova.compute.manager [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Terminating instance
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.575 182939 DEBUG nova.compute.manager [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:14:37 compute-0 kernel: tap8729a102-f8 (unregistering): left promiscuous mode
Jan 22 00:14:37 compute-0 NetworkManager[55139]: <info>  [1769040877.6011] device (tap8729a102-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:14:37 compute-0 ovn_controller[95047]: 2026-01-22T00:14:37Z|00487|binding|INFO|Releasing lport 8729a102-f8f0-4c90-9f82-e055c76fc104 from this chassis (sb_readonly=0)
Jan 22 00:14:37 compute-0 ovn_controller[95047]: 2026-01-22T00:14:37Z|00488|binding|INFO|Setting lport 8729a102-f8f0-4c90-9f82-e055c76fc104 down in Southbound
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.614 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:37 compute-0 ovn_controller[95047]: 2026-01-22T00:14:37Z|00489|binding|INFO|Removing iface tap8729a102-f8 ovn-installed in OVS
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.617 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.637 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:e7:e8 10.100.0.12'], port_security=['fa:16:3e:91:e7:e8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a3a09b65-9f43-4029-b1e4-dd463c6e4964', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c19848fe-a435-4c66-8190-94e8e9e1b266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01430d09-4466-4c63-8f42-d6bde77fcc79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=639fe658-8c59-48e8-bb7b-52cdb7487f54, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=8729a102-f8f0-4c90-9f82-e055c76fc104) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.639 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 8729a102-f8f0-4c90-9f82-e055c76fc104 in datapath c19848fe-a435-4c66-8190-94e8e9e1b266 unbound from our chassis
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.640 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c19848fe-a435-4c66-8190-94e8e9e1b266, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.642 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1da1246e-4940-48f9-ada1-c4b437a4f173]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.643 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 namespace which is not needed anymore
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.647 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:37 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007b.scope: Deactivated successfully.
Jan 22 00:14:37 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007b.scope: Consumed 3.876s CPU time.
Jan 22 00:14:37 compute-0 systemd-machined[154182]: Machine qemu-62-instance-0000007b terminated.
Jan 22 00:14:37 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232237]: [NOTICE]   (232241) : haproxy version is 2.8.14-c23fe91
Jan 22 00:14:37 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232237]: [NOTICE]   (232241) : path to executable is /usr/sbin/haproxy
Jan 22 00:14:37 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232237]: [WARNING]  (232241) : Exiting Master process...
Jan 22 00:14:37 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232237]: [ALERT]    (232241) : Current worker (232243) exited with code 143 (Terminated)
Jan 22 00:14:37 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232237]: [WARNING]  (232241) : All workers exited. Exiting... (0)
Jan 22 00:14:37 compute-0 systemd[1]: libpod-3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb.scope: Deactivated successfully.
Jan 22 00:14:37 compute-0 podman[232327]: 2026-01-22 00:14:37.778914137 +0000 UTC m=+0.047009530 container died 3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.804 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.808 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb-userdata-shm.mount: Deactivated successfully.
Jan 22 00:14:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-0198c32a3088ea4929db247082cac350e6d64fdd5a6ae4a7a3734989aa30dea7-merged.mount: Deactivated successfully.
Jan 22 00:14:37 compute-0 podman[232327]: 2026-01-22 00:14:37.833477631 +0000 UTC m=+0.101573014 container cleanup 3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 00:14:37 compute-0 systemd[1]: libpod-conmon-3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb.scope: Deactivated successfully.
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.850 182939 INFO nova.virt.libvirt.driver [-] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Instance destroyed successfully.
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.851 182939 DEBUG nova.objects.instance [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'resources' on Instance uuid a3a09b65-9f43-4029-b1e4-dd463c6e4964 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.888 182939 DEBUG nova.virt.libvirt.vif [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:14:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1102130362',display_name='tempest-tempest.common.compute-instance-1102130362-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1102130362-1',id=123,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:14:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-u2tu2ftn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempest-MultipleCreateTestJSON-620854064-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:14:34Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=a3a09b65-9f43-4029-b1e4-dd463c6e4964,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8729a102-f8f0-4c90-9f82-e055c76fc104", "address": "fa:16:3e:91:e7:e8", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8729a102-f8", "ovs_interfaceid": "8729a102-f8f0-4c90-9f82-e055c76fc104", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.889 182939 DEBUG nova.network.os_vif_util [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "8729a102-f8f0-4c90-9f82-e055c76fc104", "address": "fa:16:3e:91:e7:e8", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8729a102-f8", "ovs_interfaceid": "8729a102-f8f0-4c90-9f82-e055c76fc104", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.890 182939 DEBUG nova.network.os_vif_util [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:e7:e8,bridge_name='br-int',has_traffic_filtering=True,id=8729a102-f8f0-4c90-9f82-e055c76fc104,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8729a102-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.891 182939 DEBUG os_vif [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:e7:e8,bridge_name='br-int',has_traffic_filtering=True,id=8729a102-f8f0-4c90-9f82-e055c76fc104,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8729a102-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.893 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.893 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8729a102-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.895 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.896 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.899 182939 INFO os_vif [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:e7:e8,bridge_name='br-int',has_traffic_filtering=True,id=8729a102-f8f0-4c90-9f82-e055c76fc104,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8729a102-f8')
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.899 182939 INFO nova.virt.libvirt.driver [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Deleting instance files /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964_del
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.900 182939 INFO nova.virt.libvirt.driver [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Deletion of /var/lib/nova/instances/a3a09b65-9f43-4029-b1e4-dd463c6e4964_del complete
Jan 22 00:14:37 compute-0 podman[232375]: 2026-01-22 00:14:37.907265083 +0000 UTC m=+0.044279315 container remove 3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.913 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e6612aaf-37b1-4e36-94a5-d0a77ba4d052]: (4, ('Thu Jan 22 12:14:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 (3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb)\n3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb\nThu Jan 22 12:14:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 (3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb)\n3af405dba3ee7cf1485b919050826796688ec1750a46d50029851f0a8135bffb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.914 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[739ee70f-52c9-40b7-8bd9-a22e66bc364a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.915 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc19848fe-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.917 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:37 compute-0 kernel: tapc19848fe-a0: left promiscuous mode
Jan 22 00:14:37 compute-0 nova_compute[182935]: 2026-01-22 00:14:37.931 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.933 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7e6009ae-bf7d-4b79-b71a-40317e268d83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.951 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8f43659f-8cfd-4234-a5b6-72ccffb504db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.952 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d925cd6a-d8f7-4640-bbc9-1d8e4b5417b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.967 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[37986ed7-435e-4766-ad2a-bb893c2aadb9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537570, 'reachable_time': 22931, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232390, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.970 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:14:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:14:37.970 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[bd84045b-07a4-488e-b436-9c41c84990a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:14:37 compute-0 systemd[1]: run-netns-ovnmeta\x2dc19848fe\x2da435\x2d4c66\x2d8190\x2d94e8e9e1b266.mount: Deactivated successfully.
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.026 182939 INFO nova.compute.manager [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.027 182939 DEBUG oslo.service.loopingcall [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.027 182939 DEBUG nova.compute.manager [-] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.027 182939 DEBUG nova.network.neutron [-] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.691 182939 DEBUG nova.compute.manager [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Received event network-vif-unplugged-8729a102-f8f0-4c90-9f82-e055c76fc104 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.692 182939 DEBUG oslo_concurrency.lockutils [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.692 182939 DEBUG oslo_concurrency.lockutils [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.692 182939 DEBUG oslo_concurrency.lockutils [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.693 182939 DEBUG nova.compute.manager [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] No waiting events found dispatching network-vif-unplugged-8729a102-f8f0-4c90-9f82-e055c76fc104 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.693 182939 DEBUG nova.compute.manager [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Received event network-vif-unplugged-8729a102-f8f0-4c90-9f82-e055c76fc104 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.694 182939 DEBUG nova.compute.manager [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Received event network-vif-plugged-8729a102-f8f0-4c90-9f82-e055c76fc104 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.694 182939 DEBUG oslo_concurrency.lockutils [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.694 182939 DEBUG oslo_concurrency.lockutils [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.695 182939 DEBUG oslo_concurrency.lockutils [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.695 182939 DEBUG nova.compute.manager [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] No waiting events found dispatching network-vif-plugged-8729a102-f8f0-4c90-9f82-e055c76fc104 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:14:38 compute-0 nova_compute[182935]: 2026-01-22 00:14:38.695 182939 WARNING nova.compute.manager [req-f8bf0fae-4c58-4b59-a6a1-f6666f58a754 req-55743873-e390-402b-a2bc-73b8595822a8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Received unexpected event network-vif-plugged-8729a102-f8f0-4c90-9f82-e055c76fc104 for instance with vm_state active and task_state deleting.
Jan 22 00:14:39 compute-0 nova_compute[182935]: 2026-01-22 00:14:39.162 182939 DEBUG nova.network.neutron [-] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:14:39 compute-0 nova_compute[182935]: 2026-01-22 00:14:39.218 182939 INFO nova.compute.manager [-] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Took 1.19 seconds to deallocate network for instance.
Jan 22 00:14:39 compute-0 nova_compute[182935]: 2026-01-22 00:14:39.335 182939 DEBUG oslo_concurrency.lockutils [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:39 compute-0 nova_compute[182935]: 2026-01-22 00:14:39.336 182939 DEBUG oslo_concurrency.lockutils [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:39 compute-0 nova_compute[182935]: 2026-01-22 00:14:39.627 182939 DEBUG nova.compute.manager [req-38f55a4d-b150-4556-993e-28b2d9a38b8f req-fc041c29-d7bd-4a5b-a681-ede5297164dd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Received event network-vif-deleted-8729a102-f8f0-4c90-9f82-e055c76fc104 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:14:39 compute-0 podman[232391]: 2026-01-22 00:14:39.713155529 +0000 UTC m=+0.086017989 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:14:39 compute-0 nova_compute[182935]: 2026-01-22 00:14:39.984 182939 DEBUG nova.scheduler.client.report [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.023 182939 DEBUG nova.scheduler.client.report [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.023 182939 DEBUG nova.compute.provider_tree [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.054 182939 DEBUG nova.scheduler.client.report [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.095 182939 DEBUG nova.scheduler.client.report [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.167 182939 DEBUG nova.compute.provider_tree [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.192 182939 DEBUG nova.scheduler.client.report [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.221 182939 DEBUG oslo_concurrency.lockutils [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.885s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.257 182939 INFO nova.scheduler.client.report [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Deleted allocations for instance a3a09b65-9f43-4029-b1e4-dd463c6e4964
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.405 182939 DEBUG oslo_concurrency.lockutils [None req-ee570736-6035-414e-a8be-1bce8587fbe2 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "a3a09b65-9f43-4029-b1e4-dd463c6e4964" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.859s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.860 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.860 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.861 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:40 compute-0 nova_compute[182935]: 2026-01-22 00:14:40.861 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:14:41 compute-0 nova_compute[182935]: 2026-01-22 00:14:41.032 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:14:41 compute-0 nova_compute[182935]: 2026-01-22 00:14:41.035 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5712MB free_disk=73.12738800048828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:14:41 compute-0 nova_compute[182935]: 2026-01-22 00:14:41.035 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:41 compute-0 nova_compute[182935]: 2026-01-22 00:14:41.036 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:41 compute-0 nova_compute[182935]: 2026-01-22 00:14:41.146 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:14:41 compute-0 nova_compute[182935]: 2026-01-22 00:14:41.147 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:14:41 compute-0 nova_compute[182935]: 2026-01-22 00:14:41.183 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:14:41 compute-0 nova_compute[182935]: 2026-01-22 00:14:41.208 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:14:41 compute-0 nova_compute[182935]: 2026-01-22 00:14:41.232 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:14:41 compute-0 nova_compute[182935]: 2026-01-22 00:14:41.233 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:14:42 compute-0 nova_compute[182935]: 2026-01-22 00:14:42.233 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:42 compute-0 nova_compute[182935]: 2026-01-22 00:14:42.234 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:14:42 compute-0 nova_compute[182935]: 2026-01-22 00:14:42.234 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:14:42 compute-0 nova_compute[182935]: 2026-01-22 00:14:42.269 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:14:42 compute-0 nova_compute[182935]: 2026-01-22 00:14:42.392 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:42 compute-0 nova_compute[182935]: 2026-01-22 00:14:42.896 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:43 compute-0 nova_compute[182935]: 2026-01-22 00:14:43.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:43 compute-0 nova_compute[182935]: 2026-01-22 00:14:43.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:43 compute-0 nova_compute[182935]: 2026-01-22 00:14:43.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:14:45 compute-0 podman[232418]: 2026-01-22 00:14:45.69551541 +0000 UTC m=+0.060795591 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Jan 22 00:14:45 compute-0 nova_compute[182935]: 2026-01-22 00:14:45.790 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:45 compute-0 nova_compute[182935]: 2026-01-22 00:14:45.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:47 compute-0 nova_compute[182935]: 2026-01-22 00:14:47.393 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:47 compute-0 nova_compute[182935]: 2026-01-22 00:14:47.898 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:51 compute-0 podman[232438]: 2026-01-22 00:14:51.700252943 +0000 UTC m=+0.068549192 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:14:51 compute-0 podman[232437]: 2026-01-22 00:14:51.711559477 +0000 UTC m=+0.082408495 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=openstack_network_exporter, version=9.6, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Jan 22 00:14:51 compute-0 nova_compute[182935]: 2026-01-22 00:14:51.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:52 compute-0 nova_compute[182935]: 2026-01-22 00:14:52.395 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:52 compute-0 nova_compute[182935]: 2026-01-22 00:14:52.848 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040877.8472035, a3a09b65-9f43-4029-b1e4-dd463c6e4964 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:14:52 compute-0 nova_compute[182935]: 2026-01-22 00:14:52.848 182939 INFO nova.compute.manager [-] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] VM Stopped (Lifecycle Event)
Jan 22 00:14:52 compute-0 nova_compute[182935]: 2026-01-22 00:14:52.946 182939 DEBUG nova.compute.manager [None req-e7b57497-0459-464f-a8b8-bf75b8a8f27c - - - - - -] [instance: a3a09b65-9f43-4029-b1e4-dd463c6e4964] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:14:52 compute-0 nova_compute[182935]: 2026-01-22 00:14:52.947 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:57 compute-0 nova_compute[182935]: 2026-01-22 00:14:57.397 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:57 compute-0 nova_compute[182935]: 2026-01-22 00:14:57.589 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "f75b1ffc-1260-4530-b589-8d5036f53a54" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:57 compute-0 nova_compute[182935]: 2026-01-22 00:14:57.590 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "f75b1ffc-1260-4530-b589-8d5036f53a54" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:57 compute-0 nova_compute[182935]: 2026-01-22 00:14:57.941 182939 DEBUG nova.compute.manager [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:14:57 compute-0 nova_compute[182935]: 2026-01-22 00:14:57.995 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:14:59 compute-0 nova_compute[182935]: 2026-01-22 00:14:59.346 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:59 compute-0 nova_compute[182935]: 2026-01-22 00:14:59.347 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:59 compute-0 nova_compute[182935]: 2026-01-22 00:14:59.359 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:14:59 compute-0 nova_compute[182935]: 2026-01-22 00:14:59.359 182939 INFO nova.compute.claims [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:14:59 compute-0 nova_compute[182935]: 2026-01-22 00:14:59.463 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:14:59 compute-0 nova_compute[182935]: 2026-01-22 00:14:59.463 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:14:59 compute-0 nova_compute[182935]: 2026-01-22 00:14:59.593 182939 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:14:59 compute-0 nova_compute[182935]: 2026-01-22 00:14:59.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:14:59 compute-0 nova_compute[182935]: 2026-01-22 00:14:59.808 182939 DEBUG nova.compute.provider_tree [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.041 182939 DEBUG nova.scheduler.client.report [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.182 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.191 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.192 182939 DEBUG nova.compute.manager [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.195 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.202 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.202 182939 INFO nova.compute.claims [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.405 182939 DEBUG nova.compute.manager [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.422 182939 INFO nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.453 182939 DEBUG nova.compute.manager [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.472 182939 DEBUG nova.compute.provider_tree [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.493 182939 DEBUG nova.scheduler.client.report [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.526 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.527 182939 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.645 182939 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.646 182939 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.667 182939 DEBUG nova.compute.manager [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.668 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.669 182939 INFO nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Creating image(s)
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.669 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.670 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.670 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.683 182939 INFO nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.686 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.714 182939 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.783 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.785 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.785 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.798 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.857 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.858 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.906 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.907 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.907 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.932 182939 DEBUG nova.policy [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00a7d470e36045deabd5584bd3a9c73e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.967 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.968 182939 DEBUG nova.virt.disk.api [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Checking if we can resize image /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:15:00 compute-0 nova_compute[182935]: 2026-01-22 00:15:00.969 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.036 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.037 182939 DEBUG nova.virt.disk.api [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Cannot resize image /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.038 182939 DEBUG nova.objects.instance [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lazy-loading 'migration_context' on Instance uuid f75b1ffc-1260-4530-b589-8d5036f53a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.292 182939 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.293 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.294 182939 INFO nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Creating image(s)
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.294 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "/var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.295 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "/var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.295 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "/var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.308 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.308 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Ensure instance console log exists: /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.308 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.309 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.309 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.311 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.312 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.339 182939 WARNING nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.346 182939 DEBUG nova.virt.libvirt.host [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.347 182939 DEBUG nova.virt.libvirt.host [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.350 182939 DEBUG nova.virt.libvirt.host [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.351 182939 DEBUG nova.virt.libvirt.host [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.352 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.352 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.353 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.353 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.353 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.354 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.354 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.354 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.354 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.355 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.355 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.355 182939 DEBUG nova.virt.hardware [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.360 182939 DEBUG nova.objects.instance [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lazy-loading 'pci_devices' on Instance uuid f75b1ffc-1260-4530-b589-8d5036f53a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.387 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:15:01 compute-0 nova_compute[182935]:   <uuid>f75b1ffc-1260-4530-b589-8d5036f53a54</uuid>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   <name>instance-00000080</name>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerShowV254Test-server-1890884869</nova:name>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:15:01</nova:creationTime>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:15:01 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:15:01 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:15:01 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:15:01 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:15:01 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:15:01 compute-0 nova_compute[182935]:         <nova:user uuid="3ce6bd0cf3584a949e6c70eda3442ebd">tempest-ServerShowV254Test-1444323064-project-member</nova:user>
Jan 22 00:15:01 compute-0 nova_compute[182935]:         <nova:project uuid="e6d261bcfda9475db1fa317bd2df68f4">tempest-ServerShowV254Test-1444323064</nova:project>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <system>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <entry name="serial">f75b1ffc-1260-4530-b589-8d5036f53a54</entry>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <entry name="uuid">f75b1ffc-1260-4530-b589-8d5036f53a54</entry>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     </system>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   <os>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   </os>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   <features>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   </features>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.config"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/console.log" append="off"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <video>
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     </video>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:15:01 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:15:01 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:15:01 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:15:01 compute-0 nova_compute[182935]: </domain>
Jan 22 00:15:01 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.394 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.394 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.395 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.406 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.463 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.465 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.502 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.503 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.503 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.561 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.562 182939 DEBUG nova.virt.disk.api [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Checking if we can resize image /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.562 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.622 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.624 182939 DEBUG nova.virt.disk.api [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Cannot resize image /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.624 182939 DEBUG nova.objects.instance [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'migration_context' on Instance uuid eba259ea-7c59-4a4a-9f0d-cb645fcc5eae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.707 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.708 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.708 182939 INFO nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Using config drive
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.744 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.745 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Ensure instance console log exists: /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.745 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.745 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:01 compute-0 nova_compute[182935]: 2026-01-22 00:15:01.746 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.385 182939 INFO nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Creating config drive at /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.config
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.390 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjifnxu1n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.410 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.521 182939 DEBUG oslo_concurrency.processutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjifnxu1n" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:02 compute-0 systemd-machined[154182]: New machine qemu-63-instance-00000080.
Jan 22 00:15:02 compute-0 systemd[1]: Started Virtual Machine qemu-63-instance-00000080.
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.901 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040902.9006143, f75b1ffc-1260-4530-b589-8d5036f53a54 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.903 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] VM Resumed (Lifecycle Event)
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.906 182939 DEBUG nova.compute.manager [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.907 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.912 182939 INFO nova.virt.libvirt.driver [-] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Instance spawned successfully.
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.913 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.982 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.987 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:15:02 compute-0 nova_compute[182935]: 2026-01-22 00:15:02.997 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:03 compute-0 nova_compute[182935]: 2026-01-22 00:15:03.014 182939 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Successfully created port: b40275b3-8698-4cfd-8377-b4cd53eda629 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:15:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:03.212 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:03.213 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:03.213 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:03 compute-0 nova_compute[182935]: 2026-01-22 00:15:03.278 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:03 compute-0 nova_compute[182935]: 2026-01-22 00:15:03.279 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:03 compute-0 nova_compute[182935]: 2026-01-22 00:15:03.280 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:03 compute-0 nova_compute[182935]: 2026-01-22 00:15:03.281 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:03 compute-0 nova_compute[182935]: 2026-01-22 00:15:03.282 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:03 compute-0 nova_compute[182935]: 2026-01-22 00:15:03.282 182939 DEBUG nova.virt.libvirt.driver [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:03 compute-0 nova_compute[182935]: 2026-01-22 00:15:03.340 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:15:03 compute-0 nova_compute[182935]: 2026-01-22 00:15:03.341 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040902.9023278, f75b1ffc-1260-4530-b589-8d5036f53a54 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:03 compute-0 nova_compute[182935]: 2026-01-22 00:15:03.342 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] VM Started (Lifecycle Event)
Jan 22 00:15:05 compute-0 nova_compute[182935]: 2026-01-22 00:15:05.194 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:05 compute-0 nova_compute[182935]: 2026-01-22 00:15:05.199 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:15:05 compute-0 podman[232535]: 2026-01-22 00:15:05.717132721 +0000 UTC m=+0.081488663 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:15:05 compute-0 podman[232534]: 2026-01-22 00:15:05.745798541 +0000 UTC m=+0.103574470 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:15:07 compute-0 nova_compute[182935]: 2026-01-22 00:15:07.139 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:15:07 compute-0 nova_compute[182935]: 2026-01-22 00:15:07.375 182939 INFO nova.compute.manager [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Took 6.71 seconds to spawn the instance on the hypervisor.
Jan 22 00:15:07 compute-0 nova_compute[182935]: 2026-01-22 00:15:07.376 182939 DEBUG nova.compute.manager [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:07 compute-0 nova_compute[182935]: 2026-01-22 00:15:07.402 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:08 compute-0 nova_compute[182935]: 2026-01-22 00:15:08.000 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:08 compute-0 nova_compute[182935]: 2026-01-22 00:15:08.426 182939 INFO nova.compute.manager [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Took 9.26 seconds to build instance.
Jan 22 00:15:08 compute-0 nova_compute[182935]: 2026-01-22 00:15:08.453 182939 DEBUG oslo_concurrency.lockutils [None req-ab3e801a-2c36-4846-8616-4353ef6a3456 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "f75b1ffc-1260-4530-b589-8d5036f53a54" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.863s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:08 compute-0 nova_compute[182935]: 2026-01-22 00:15:08.795 182939 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Successfully updated port: b40275b3-8698-4cfd-8377-b4cd53eda629 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:15:08 compute-0 nova_compute[182935]: 2026-01-22 00:15:08.826 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "refresh_cache-eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:15:08 compute-0 nova_compute[182935]: 2026-01-22 00:15:08.827 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquired lock "refresh_cache-eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:15:08 compute-0 nova_compute[182935]: 2026-01-22 00:15:08.827 182939 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:15:08 compute-0 nova_compute[182935]: 2026-01-22 00:15:08.938 182939 DEBUG nova.compute.manager [req-1265e28f-86b3-4f0c-b821-06c4a20552a5 req-5b66b833-d2c4-48ad-98ef-a212b4873b38 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Received event network-changed-b40275b3-8698-4cfd-8377-b4cd53eda629 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:08 compute-0 nova_compute[182935]: 2026-01-22 00:15:08.939 182939 DEBUG nova.compute.manager [req-1265e28f-86b3-4f0c-b821-06c4a20552a5 req-5b66b833-d2c4-48ad-98ef-a212b4873b38 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Refreshing instance network info cache due to event network-changed-b40275b3-8698-4cfd-8377-b4cd53eda629. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:15:08 compute-0 nova_compute[182935]: 2026-01-22 00:15:08.939 182939 DEBUG oslo_concurrency.lockutils [req-1265e28f-86b3-4f0c-b821-06c4a20552a5 req-5b66b833-d2c4-48ad-98ef-a212b4873b38 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:15:09 compute-0 nova_compute[182935]: 2026-01-22 00:15:09.037 182939 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.202 182939 INFO nova.compute.manager [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Rebuilding instance
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.442 182939 DEBUG nova.network.neutron [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Updating instance_info_cache with network_info: [{"id": "b40275b3-8698-4cfd-8377-b4cd53eda629", "address": "fa:16:3e:c1:c5:ff", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40275b3-86", "ovs_interfaceid": "b40275b3-8698-4cfd-8377-b4cd53eda629", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.473 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Releasing lock "refresh_cache-eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.473 182939 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Instance network_info: |[{"id": "b40275b3-8698-4cfd-8377-b4cd53eda629", "address": "fa:16:3e:c1:c5:ff", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40275b3-86", "ovs_interfaceid": "b40275b3-8698-4cfd-8377-b4cd53eda629", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.474 182939 DEBUG oslo_concurrency.lockutils [req-1265e28f-86b3-4f0c-b821-06c4a20552a5 req-5b66b833-d2c4-48ad-98ef-a212b4873b38 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.475 182939 DEBUG nova.network.neutron [req-1265e28f-86b3-4f0c-b821-06c4a20552a5 req-5b66b833-d2c4-48ad-98ef-a212b4873b38 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Refreshing network info cache for port b40275b3-8698-4cfd-8377-b4cd53eda629 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.477 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Start _get_guest_xml network_info=[{"id": "b40275b3-8698-4cfd-8377-b4cd53eda629", "address": "fa:16:3e:c1:c5:ff", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40275b3-86", "ovs_interfaceid": "b40275b3-8698-4cfd-8377-b4cd53eda629", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.482 182939 WARNING nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.489 182939 DEBUG nova.virt.libvirt.host [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.490 182939 DEBUG nova.virt.libvirt.host [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.505 182939 DEBUG nova.virt.libvirt.host [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.507 182939 DEBUG nova.virt.libvirt.host [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.509 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.510 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.511 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.511 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.512 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.513 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.514 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.514 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.515 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.515 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.516 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.517 182939 DEBUG nova.virt.hardware [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.524 182939 DEBUG nova.virt.libvirt.vif [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1628567258',display_name='tempest-MultipleCreateTestJSON-server-1628567258-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1628567258-2',id=127,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-ivpvh0b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempest-MultipleCreateTestJSON-620854064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:00Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=eba259ea-7c59-4a4a-9f0d-cb645fcc5eae,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b40275b3-8698-4cfd-8377-b4cd53eda629", "address": "fa:16:3e:c1:c5:ff", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40275b3-86", "ovs_interfaceid": "b40275b3-8698-4cfd-8377-b4cd53eda629", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.524 182939 DEBUG nova.network.os_vif_util [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "b40275b3-8698-4cfd-8377-b4cd53eda629", "address": "fa:16:3e:c1:c5:ff", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40275b3-86", "ovs_interfaceid": "b40275b3-8698-4cfd-8377-b4cd53eda629", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.526 182939 DEBUG nova.network.os_vif_util [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:c5:ff,bridge_name='br-int',has_traffic_filtering=True,id=b40275b3-8698-4cfd-8377-b4cd53eda629,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40275b3-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.527 182939 DEBUG nova.objects.instance [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'pci_devices' on Instance uuid eba259ea-7c59-4a4a-9f0d-cb645fcc5eae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.547 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:15:10 compute-0 nova_compute[182935]:   <uuid>eba259ea-7c59-4a4a-9f0d-cb645fcc5eae</uuid>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   <name>instance-0000007f</name>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <nova:name>tempest-MultipleCreateTestJSON-server-1628567258-2</nova:name>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:15:10</nova:creationTime>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:15:10 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:15:10 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:15:10 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:15:10 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:15:10 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:15:10 compute-0 nova_compute[182935]:         <nova:user uuid="00a7d470e36045deabd5584bd3a9c73e">tempest-MultipleCreateTestJSON-620854064-project-member</nova:user>
Jan 22 00:15:10 compute-0 nova_compute[182935]:         <nova:project uuid="f02fc2085f6340ffa895cb894fdf5882">tempest-MultipleCreateTestJSON-620854064</nova:project>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:15:10 compute-0 nova_compute[182935]:         <nova:port uuid="b40275b3-8698-4cfd-8377-b4cd53eda629">
Jan 22 00:15:10 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <system>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <entry name="serial">eba259ea-7c59-4a4a-9f0d-cb645fcc5eae</entry>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <entry name="uuid">eba259ea-7c59-4a4a-9f0d-cb645fcc5eae</entry>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     </system>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   <os>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   </os>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   <features>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   </features>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk.config"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:c1:c5:ff"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <target dev="tapb40275b3-86"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/console.log" append="off"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <video>
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     </video>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:15:10 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:15:10 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:15:10 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:15:10 compute-0 nova_compute[182935]: </domain>
Jan 22 00:15:10 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.555 182939 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Preparing to wait for external event network-vif-plugged-b40275b3-8698-4cfd-8377-b4cd53eda629 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.555 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.556 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.556 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.557 182939 DEBUG nova.virt.libvirt.vif [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1628567258',display_name='tempest-MultipleCreateTestJSON-server-1628567258-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1628567258-2',id=127,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-ivpvh0b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempest-MultipleCreateTestJSON-620854064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:15:00Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=eba259ea-7c59-4a4a-9f0d-cb645fcc5eae,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b40275b3-8698-4cfd-8377-b4cd53eda629", "address": "fa:16:3e:c1:c5:ff", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40275b3-86", "ovs_interfaceid": "b40275b3-8698-4cfd-8377-b4cd53eda629", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.558 182939 DEBUG nova.network.os_vif_util [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "b40275b3-8698-4cfd-8377-b4cd53eda629", "address": "fa:16:3e:c1:c5:ff", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40275b3-86", "ovs_interfaceid": "b40275b3-8698-4cfd-8377-b4cd53eda629", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.559 182939 DEBUG nova.network.os_vif_util [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:c5:ff,bridge_name='br-int',has_traffic_filtering=True,id=b40275b3-8698-4cfd-8377-b4cd53eda629,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40275b3-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.560 182939 DEBUG os_vif [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:c5:ff,bridge_name='br-int',has_traffic_filtering=True,id=b40275b3-8698-4cfd-8377-b4cd53eda629,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40275b3-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.560 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.561 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.562 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.567 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.567 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb40275b3-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.568 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb40275b3-86, col_values=(('external_ids', {'iface-id': 'b40275b3-8698-4cfd-8377-b4cd53eda629', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:c5:ff', 'vm-uuid': 'eba259ea-7c59-4a4a-9f0d-cb645fcc5eae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.614 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:10 compute-0 NetworkManager[55139]: <info>  [1769040910.6158] manager: (tapb40275b3-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.620 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.623 182939 INFO os_vif [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:c5:ff,bridge_name='br-int',has_traffic_filtering=True,id=b40275b3-8698-4cfd-8377-b4cd53eda629,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40275b3-86')
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.625 182939 DEBUG nova.compute.manager [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:10 compute-0 podman[232583]: 2026-01-22 00:15:10.692493431 +0000 UTC m=+0.052953098 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.713 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.714 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.714 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] No VIF found with MAC fa:16:3e:c1:c5:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.714 182939 INFO nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Using config drive
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.782 182939 DEBUG nova.objects.instance [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lazy-loading 'pci_requests' on Instance uuid f75b1ffc-1260-4530-b589-8d5036f53a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.799 182939 DEBUG nova.objects.instance [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lazy-loading 'pci_devices' on Instance uuid f75b1ffc-1260-4530-b589-8d5036f53a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.819 182939 DEBUG nova.objects.instance [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lazy-loading 'resources' on Instance uuid f75b1ffc-1260-4530-b589-8d5036f53a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.834 182939 DEBUG nova.objects.instance [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lazy-loading 'migration_context' on Instance uuid f75b1ffc-1260-4530-b589-8d5036f53a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.852 182939 DEBUG nova.objects.instance [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:15:10 compute-0 nova_compute[182935]: 2026-01-22 00:15:10.856 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.246 182939 INFO nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Creating config drive at /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk.config
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.252 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl2mjf_1n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.382 182939 DEBUG oslo_concurrency.processutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl2mjf_1n" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:11 compute-0 NetworkManager[55139]: <info>  [1769040911.4384] manager: (tapb40275b3-86): new Tun device (/org/freedesktop/NetworkManager/Devices/232)
Jan 22 00:15:11 compute-0 kernel: tapb40275b3-86: entered promiscuous mode
Jan 22 00:15:11 compute-0 ovn_controller[95047]: 2026-01-22T00:15:11Z|00490|binding|INFO|Claiming lport b40275b3-8698-4cfd-8377-b4cd53eda629 for this chassis.
Jan 22 00:15:11 compute-0 ovn_controller[95047]: 2026-01-22T00:15:11Z|00491|binding|INFO|b40275b3-8698-4cfd-8377-b4cd53eda629: Claiming fa:16:3e:c1:c5:ff 10.100.0.7
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.444 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:11 compute-0 sshd-session[232611]: Invalid user docker from 188.166.69.60 port 53372
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.455 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:c5:ff 10.100.0.7'], port_security=['fa:16:3e:c1:c5:ff 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'eba259ea-7c59-4a4a-9f0d-cb645fcc5eae', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c19848fe-a435-4c66-8190-94e8e9e1b266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'neutron:revision_number': '2', 'neutron:security_group_ids': '01430d09-4466-4c63-8f42-d6bde77fcc79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=639fe658-8c59-48e8-bb7b-52cdb7487f54, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=b40275b3-8698-4cfd-8377-b4cd53eda629) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.457 104408 INFO neutron.agent.ovn.metadata.agent [-] Port b40275b3-8698-4cfd-8377-b4cd53eda629 in datapath c19848fe-a435-4c66-8190-94e8e9e1b266 bound to our chassis
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.459 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c19848fe-a435-4c66-8190-94e8e9e1b266
Jan 22 00:15:11 compute-0 ovn_controller[95047]: 2026-01-22T00:15:11Z|00492|binding|INFO|Setting lport b40275b3-8698-4cfd-8377-b4cd53eda629 ovn-installed in OVS
Jan 22 00:15:11 compute-0 ovn_controller[95047]: 2026-01-22T00:15:11Z|00493|binding|INFO|Setting lport b40275b3-8698-4cfd-8377-b4cd53eda629 up in Southbound
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.472 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3b406744-7dda-4b04-bd87-61a3736f7d9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.473 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc19848fe-a1 in ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.472 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.475 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc19848fe-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.475 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f722badc-87e7-4be9-a280-7c4754e7f2de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.475 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.476 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cf40e1ec-4af3-40e7-9e61-144b9dcbed89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 systemd-machined[154182]: New machine qemu-64-instance-0000007f.
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.487 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[1d40487c-fd0b-4be8-b893-3590e78d4cc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 systemd[1]: Started Virtual Machine qemu-64-instance-0000007f.
Jan 22 00:15:11 compute-0 systemd-udevd[232633]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.512 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[255d0588-6647-449d-9dac-d1cc0e3e0e19]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 NetworkManager[55139]: <info>  [1769040911.5223] device (tapb40275b3-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:15:11 compute-0 NetworkManager[55139]: <info>  [1769040911.5233] device (tapb40275b3-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:15:11 compute-0 sshd-session[232611]: Connection closed by invalid user docker 188.166.69.60 port 53372 [preauth]
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.545 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1a3f5a-5bba-4322-b420-491cc7e2bad2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 NetworkManager[55139]: <info>  [1769040911.5534] manager: (tapc19848fe-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/233)
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.554 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d22b74df-05d2-4089-adb3-39296125f852]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.587 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9b2d41f9-1f84-42b5-b087-acd8f90444a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.593 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[588f729a-30db-47c3-afbb-adfa33d22c31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 NetworkManager[55139]: <info>  [1769040911.6157] device (tapc19848fe-a0): carrier: link connected
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.623 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[791d3c06-c1cc-431e-8718-212ae6cdfb6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.647 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[89f7bb00-dabe-44b0-accb-a44f16bdc057]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc19848fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5c:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541435, 'reachable_time': 22604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232663, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.666 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[17996047-0c37-4d15-a989-61a380fd5492]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:5cb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 541435, 'tstamp': 541435}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232664, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.688 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[01df7df1-96bf-48c2-9c7e-f841de9e267c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc19848fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:5c:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541435, 'reachable_time': 22604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232665, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.726 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7803ead2-a326-406f-bda4-18af1ba035aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.810 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[40f1f100-794a-4f51-9702-0253b3009160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.812 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc19848fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.812 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.812 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc19848fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.814 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:11 compute-0 NetworkManager[55139]: <info>  [1769040911.8153] manager: (tapc19848fe-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/234)
Jan 22 00:15:11 compute-0 kernel: tapc19848fe-a0: entered promiscuous mode
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.822 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc19848fe-a0, col_values=(('external_ids', {'iface-id': 'ba768391-9e0e-4cf0-83c5-526ca3a05a58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:11 compute-0 ovn_controller[95047]: 2026-01-22T00:15:11Z|00494|binding|INFO|Releasing lport ba768391-9e0e-4cf0-83c5-526ca3a05a58 from this chassis (sb_readonly=0)
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.824 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.825 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.826 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040911.8261914, eba259ea-7c59-4a4a-9f0d-cb645fcc5eae => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.827 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] VM Started (Lifecycle Event)
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.827 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc630e5-0682-4273-8c81-a24dd88a6449]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.829 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-c19848fe-a435-4c66-8190-94e8e9e1b266
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/c19848fe-a435-4c66-8190-94e8e9e1b266.pid.haproxy
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID c19848fe-a435-4c66-8190-94e8e9e1b266
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:15:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:11.831 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'env', 'PROCESS_TAG=haproxy-c19848fe-a435-4c66-8190-94e8e9e1b266', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c19848fe-a435-4c66-8190-94e8e9e1b266.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.837 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.859 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.865 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040911.826424, eba259ea-7c59-4a4a-9f0d-cb645fcc5eae => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.865 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] VM Paused (Lifecycle Event)
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.894 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.897 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:15:11 compute-0 nova_compute[182935]: 2026-01-22 00:15:11.919 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.118 182939 DEBUG nova.compute.manager [req-790e1044-6468-4f59-9dfe-21f9aaf75c08 req-99ed7b03-2a63-4f4f-a26c-f3cb530235c8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Received event network-vif-plugged-b40275b3-8698-4cfd-8377-b4cd53eda629 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.119 182939 DEBUG oslo_concurrency.lockutils [req-790e1044-6468-4f59-9dfe-21f9aaf75c08 req-99ed7b03-2a63-4f4f-a26c-f3cb530235c8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.119 182939 DEBUG oslo_concurrency.lockutils [req-790e1044-6468-4f59-9dfe-21f9aaf75c08 req-99ed7b03-2a63-4f4f-a26c-f3cb530235c8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.119 182939 DEBUG oslo_concurrency.lockutils [req-790e1044-6468-4f59-9dfe-21f9aaf75c08 req-99ed7b03-2a63-4f4f-a26c-f3cb530235c8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.120 182939 DEBUG nova.compute.manager [req-790e1044-6468-4f59-9dfe-21f9aaf75c08 req-99ed7b03-2a63-4f4f-a26c-f3cb530235c8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Processing event network-vif-plugged-b40275b3-8698-4cfd-8377-b4cd53eda629 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.120 182939 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.124 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040912.124097, eba259ea-7c59-4a4a-9f0d-cb645fcc5eae => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.124 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] VM Resumed (Lifecycle Event)
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.129 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.132 182939 INFO nova.virt.libvirt.driver [-] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Instance spawned successfully.
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.133 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.166 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.169 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.176 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.176 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.177 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.177 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.178 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.178 182939 DEBUG nova.virt.libvirt.driver [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.204 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.256 182939 INFO nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Took 10.96 seconds to spawn the instance on the hypervisor.
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.257 182939 DEBUG nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:12 compute-0 podman[232704]: 2026-01-22 00:15:12.287597124 +0000 UTC m=+0.113840759 container create 22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:15:12 compute-0 podman[232704]: 2026-01-22 00:15:12.201061773 +0000 UTC m=+0.027305428 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:15:12 compute-0 systemd[1]: Started libpod-conmon-22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063.scope.
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.359 182939 INFO nova.compute.manager [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Took 12.59 seconds to build instance.
Jan 22 00:15:12 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:15:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7057abe8c270c94786709360c1889d23a9cb75382fd04407a982aa1c5d53c37/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.386 182939 DEBUG oslo_concurrency.lockutils [None req-3cd3470c-fe37-4c48-88cb-fe198fd9e938 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.404 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:12 compute-0 podman[232704]: 2026-01-22 00:15:12.419609836 +0000 UTC m=+0.245853471 container init 22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:15:12 compute-0 podman[232704]: 2026-01-22 00:15:12.425551106 +0000 UTC m=+0.251794741 container start 22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:15:12 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232719]: [NOTICE]   (232723) : New worker (232725) forked
Jan 22 00:15:12 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232719]: [NOTICE]   (232723) : Loading success.
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.912 182939 DEBUG nova.network.neutron [req-1265e28f-86b3-4f0c-b821-06c4a20552a5 req-5b66b833-d2c4-48ad-98ef-a212b4873b38 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Updated VIF entry in instance network info cache for port b40275b3-8698-4cfd-8377-b4cd53eda629. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:15:12 compute-0 nova_compute[182935]: 2026-01-22 00:15:12.913 182939 DEBUG nova.network.neutron [req-1265e28f-86b3-4f0c-b821-06c4a20552a5 req-5b66b833-d2c4-48ad-98ef-a212b4873b38 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Updating instance_info_cache with network_info: [{"id": "b40275b3-8698-4cfd-8377-b4cd53eda629", "address": "fa:16:3e:c1:c5:ff", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40275b3-86", "ovs_interfaceid": "b40275b3-8698-4cfd-8377-b4cd53eda629", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:15:13 compute-0 nova_compute[182935]: 2026-01-22 00:15:13.004 182939 DEBUG oslo_concurrency.lockutils [req-1265e28f-86b3-4f0c-b821-06c4a20552a5 req-5b66b833-d2c4-48ad-98ef-a212b4873b38 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:15:14 compute-0 nova_compute[182935]: 2026-01-22 00:15:14.345 182939 DEBUG nova.compute.manager [req-7c7a28ad-b34d-438d-9c8b-51ab2a8eee6e req-d0e38143-c708-4e0e-a28d-c192e7d56e01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Received event network-vif-plugged-b40275b3-8698-4cfd-8377-b4cd53eda629 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:14 compute-0 nova_compute[182935]: 2026-01-22 00:15:14.347 182939 DEBUG oslo_concurrency.lockutils [req-7c7a28ad-b34d-438d-9c8b-51ab2a8eee6e req-d0e38143-c708-4e0e-a28d-c192e7d56e01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:14 compute-0 nova_compute[182935]: 2026-01-22 00:15:14.347 182939 DEBUG oslo_concurrency.lockutils [req-7c7a28ad-b34d-438d-9c8b-51ab2a8eee6e req-d0e38143-c708-4e0e-a28d-c192e7d56e01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:14 compute-0 nova_compute[182935]: 2026-01-22 00:15:14.348 182939 DEBUG oslo_concurrency.lockutils [req-7c7a28ad-b34d-438d-9c8b-51ab2a8eee6e req-d0e38143-c708-4e0e-a28d-c192e7d56e01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:14 compute-0 nova_compute[182935]: 2026-01-22 00:15:14.348 182939 DEBUG nova.compute.manager [req-7c7a28ad-b34d-438d-9c8b-51ab2a8eee6e req-d0e38143-c708-4e0e-a28d-c192e7d56e01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] No waiting events found dispatching network-vif-plugged-b40275b3-8698-4cfd-8377-b4cd53eda629 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:15:14 compute-0 nova_compute[182935]: 2026-01-22 00:15:14.348 182939 WARNING nova.compute.manager [req-7c7a28ad-b34d-438d-9c8b-51ab2a8eee6e req-d0e38143-c708-4e0e-a28d-c192e7d56e01 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Received unexpected event network-vif-plugged-b40275b3-8698-4cfd-8377-b4cd53eda629 for instance with vm_state active and task_state None.
Jan 22 00:15:15 compute-0 nova_compute[182935]: 2026-01-22 00:15:15.655 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.187 182939 DEBUG oslo_concurrency.lockutils [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.188 182939 DEBUG oslo_concurrency.lockutils [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.189 182939 DEBUG oslo_concurrency.lockutils [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.189 182939 DEBUG oslo_concurrency.lockutils [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.189 182939 DEBUG oslo_concurrency.lockutils [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.215 182939 INFO nova.compute.manager [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Terminating instance
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.225 182939 DEBUG nova.compute.manager [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:15:16 compute-0 kernel: tapb40275b3-86 (unregistering): left promiscuous mode
Jan 22 00:15:16 compute-0 NetworkManager[55139]: <info>  [1769040916.2472] device (tapb40275b3-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.256 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:16 compute-0 ovn_controller[95047]: 2026-01-22T00:15:16Z|00495|binding|INFO|Releasing lport b40275b3-8698-4cfd-8377-b4cd53eda629 from this chassis (sb_readonly=0)
Jan 22 00:15:16 compute-0 ovn_controller[95047]: 2026-01-22T00:15:16Z|00496|binding|INFO|Setting lport b40275b3-8698-4cfd-8377-b4cd53eda629 down in Southbound
Jan 22 00:15:16 compute-0 ovn_controller[95047]: 2026-01-22T00:15:16Z|00497|binding|INFO|Removing iface tapb40275b3-86 ovn-installed in OVS
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.259 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.265 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:c5:ff 10.100.0.7'], port_security=['fa:16:3e:c1:c5:ff 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'eba259ea-7c59-4a4a-9f0d-cb645fcc5eae', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c19848fe-a435-4c66-8190-94e8e9e1b266', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f02fc2085f6340ffa895cb894fdf5882', 'neutron:revision_number': '4', 'neutron:security_group_ids': '01430d09-4466-4c63-8f42-d6bde77fcc79', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=639fe658-8c59-48e8-bb7b-52cdb7487f54, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=b40275b3-8698-4cfd-8377-b4cd53eda629) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.266 104408 INFO neutron.agent.ovn.metadata.agent [-] Port b40275b3-8698-4cfd-8377-b4cd53eda629 in datapath c19848fe-a435-4c66-8190-94e8e9e1b266 unbound from our chassis
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.268 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c19848fe-a435-4c66-8190-94e8e9e1b266, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.269 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a80ad3b8-9910-4f2e-bafe-10e828eec7c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.270 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 namespace which is not needed anymore
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.274 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:16 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000007f.scope: Deactivated successfully.
Jan 22 00:15:16 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000007f.scope: Consumed 4.400s CPU time.
Jan 22 00:15:16 compute-0 systemd-machined[154182]: Machine qemu-64-instance-0000007f terminated.
Jan 22 00:15:16 compute-0 podman[232745]: 2026-01-22 00:15:16.329673542 +0000 UTC m=+0.058263842 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:15:16 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232719]: [NOTICE]   (232723) : haproxy version is 2.8.14-c23fe91
Jan 22 00:15:16 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232719]: [NOTICE]   (232723) : path to executable is /usr/sbin/haproxy
Jan 22 00:15:16 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232719]: [WARNING]  (232723) : Exiting Master process...
Jan 22 00:15:16 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232719]: [ALERT]    (232723) : Current worker (232725) exited with code 143 (Terminated)
Jan 22 00:15:16 compute-0 neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266[232719]: [WARNING]  (232723) : All workers exited. Exiting... (0)
Jan 22 00:15:16 compute-0 systemd[1]: libpod-22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063.scope: Deactivated successfully.
Jan 22 00:15:16 compute-0 podman[232787]: 2026-01-22 00:15:16.398148441 +0000 UTC m=+0.043921866 container died 22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:15:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063-userdata-shm.mount: Deactivated successfully.
Jan 22 00:15:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-e7057abe8c270c94786709360c1889d23a9cb75382fd04407a982aa1c5d53c37-merged.mount: Deactivated successfully.
Jan 22 00:15:16 compute-0 podman[232787]: 2026-01-22 00:15:16.431772586 +0000 UTC m=+0.077545971 container cleanup 22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:15:16 compute-0 systemd[1]: libpod-conmon-22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063.scope: Deactivated successfully.
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.449 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.453 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.498 182939 INFO nova.virt.libvirt.driver [-] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Instance destroyed successfully.
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.499 182939 DEBUG nova.objects.instance [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lazy-loading 'resources' on Instance uuid eba259ea-7c59-4a4a-9f0d-cb645fcc5eae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:16 compute-0 podman[232822]: 2026-01-22 00:15:16.511451347 +0000 UTC m=+0.050188223 container remove 22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.515 182939 DEBUG nova.virt.libvirt.vif [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-1628567258',display_name='tempest-MultipleCreateTestJSON-server-1628567258-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-1628567258-2',id=127,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-22T00:15:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f02fc2085f6340ffa895cb894fdf5882',ramdisk_id='',reservation_id='r-ivpvh0b6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-620854064',owner_user_name='tempest-MultipleCreateTestJSON-620854064-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:15:12Z,user_data=None,user_id='00a7d470e36045deabd5584bd3a9c73e',uuid=eba259ea-7c59-4a4a-9f0d-cb645fcc5eae,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b40275b3-8698-4cfd-8377-b4cd53eda629", "address": "fa:16:3e:c1:c5:ff", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40275b3-86", "ovs_interfaceid": "b40275b3-8698-4cfd-8377-b4cd53eda629", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.516 182939 DEBUG nova.network.os_vif_util [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converting VIF {"id": "b40275b3-8698-4cfd-8377-b4cd53eda629", "address": "fa:16:3e:c1:c5:ff", "network": {"id": "c19848fe-a435-4c66-8190-94e8e9e1b266", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1960670064-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f02fc2085f6340ffa895cb894fdf5882", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb40275b3-86", "ovs_interfaceid": "b40275b3-8698-4cfd-8377-b4cd53eda629", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.517 182939 DEBUG nova.network.os_vif_util [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:c5:ff,bridge_name='br-int',has_traffic_filtering=True,id=b40275b3-8698-4cfd-8377-b4cd53eda629,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40275b3-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.517 182939 DEBUG os_vif [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:c5:ff,bridge_name='br-int',has_traffic_filtering=True,id=b40275b3-8698-4cfd-8377-b4cd53eda629,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40275b3-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.517 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8d03a7-9237-4097-ac35-0fd5e64c7769]: (4, ('Thu Jan 22 12:15:16 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 (22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063)\n22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063\nThu Jan 22 12:15:16 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 (22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063)\n22f1cc2f45656c5e95c2dd61b2c70e20cc54ef8b93f79d737acc74399b2a0063\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.519 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[74ae2ead-c1bc-4dbc-a087-94d0c8730c2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.520 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.520 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb40275b3-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.520 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc19848fe-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.522 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:16 compute-0 kernel: tapc19848fe-a0: left promiscuous mode
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.525 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.537 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.540 182939 INFO os_vif [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:c5:ff,bridge_name='br-int',has_traffic_filtering=True,id=b40275b3-8698-4cfd-8377-b4cd53eda629,network=Network(c19848fe-a435-4c66-8190-94e8e9e1b266),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb40275b3-86')
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.540 182939 INFO nova.virt.libvirt.driver [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Deleting instance files /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae_del
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.541 182939 INFO nova.virt.libvirt.driver [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Deletion of /var/lib/nova/instances/eba259ea-7c59-4a4a-9f0d-cb645fcc5eae_del complete
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.541 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bc1cb9e5-4843-4182-bc83-c0a8d040b235]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.560 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc42e06-f689-4939-8387-408e15f7394f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.561 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a4bc1731-4a84-438f-9dec-1cd515fb07c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.586 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe71fda-c2c7-4eb5-ba69-0cb9dc36adaa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541427, 'reachable_time': 42425, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232851, 'error': None, 'target': 'ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.589 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c19848fe-a435-4c66-8190-94e8e9e1b266 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:15:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:16.589 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[f34e65e2-6019-42eb-b8c8-8c415735d569]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:15:16 compute-0 systemd[1]: run-netns-ovnmeta\x2dc19848fe\x2da435\x2d4c66\x2d8190\x2d94e8e9e1b266.mount: Deactivated successfully.
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.613 182939 INFO nova.compute.manager [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.613 182939 DEBUG oslo.service.loopingcall [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.614 182939 DEBUG nova.compute.manager [-] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:15:16 compute-0 nova_compute[182935]: 2026-01-22 00:15:16.614 182939 DEBUG nova.network.neutron [-] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:15:17 compute-0 nova_compute[182935]: 2026-01-22 00:15:17.106 182939 DEBUG nova.compute.manager [req-6d1baac7-2019-4572-8488-80351f3f6f46 req-cf654fe9-9b6b-4956-87e5-650fa04d83fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Received event network-vif-unplugged-b40275b3-8698-4cfd-8377-b4cd53eda629 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:17 compute-0 nova_compute[182935]: 2026-01-22 00:15:17.107 182939 DEBUG oslo_concurrency.lockutils [req-6d1baac7-2019-4572-8488-80351f3f6f46 req-cf654fe9-9b6b-4956-87e5-650fa04d83fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:17 compute-0 nova_compute[182935]: 2026-01-22 00:15:17.109 182939 DEBUG oslo_concurrency.lockutils [req-6d1baac7-2019-4572-8488-80351f3f6f46 req-cf654fe9-9b6b-4956-87e5-650fa04d83fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:17 compute-0 nova_compute[182935]: 2026-01-22 00:15:17.110 182939 DEBUG oslo_concurrency.lockutils [req-6d1baac7-2019-4572-8488-80351f3f6f46 req-cf654fe9-9b6b-4956-87e5-650fa04d83fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:17 compute-0 nova_compute[182935]: 2026-01-22 00:15:17.110 182939 DEBUG nova.compute.manager [req-6d1baac7-2019-4572-8488-80351f3f6f46 req-cf654fe9-9b6b-4956-87e5-650fa04d83fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] No waiting events found dispatching network-vif-unplugged-b40275b3-8698-4cfd-8377-b4cd53eda629 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:15:17 compute-0 nova_compute[182935]: 2026-01-22 00:15:17.110 182939 DEBUG nova.compute.manager [req-6d1baac7-2019-4572-8488-80351f3f6f46 req-cf654fe9-9b6b-4956-87e5-650fa04d83fd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Received event network-vif-unplugged-b40275b3-8698-4cfd-8377-b4cd53eda629 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:15:17 compute-0 nova_compute[182935]: 2026-01-22 00:15:17.405 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:17 compute-0 nova_compute[182935]: 2026-01-22 00:15:17.992 182939 DEBUG nova.network.neutron [-] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:15:18 compute-0 nova_compute[182935]: 2026-01-22 00:15:18.013 182939 INFO nova.compute.manager [-] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Took 1.40 seconds to deallocate network for instance.
Jan 22 00:15:18 compute-0 nova_compute[182935]: 2026-01-22 00:15:18.161 182939 DEBUG oslo_concurrency.lockutils [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:18 compute-0 nova_compute[182935]: 2026-01-22 00:15:18.162 182939 DEBUG oslo_concurrency.lockutils [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:18 compute-0 nova_compute[182935]: 2026-01-22 00:15:18.250 182939 DEBUG nova.compute.provider_tree [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:15:18 compute-0 nova_compute[182935]: 2026-01-22 00:15:18.271 182939 DEBUG nova.scheduler.client.report [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:15:18 compute-0 nova_compute[182935]: 2026-01-22 00:15:18.313 182939 DEBUG oslo_concurrency.lockutils [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:18 compute-0 nova_compute[182935]: 2026-01-22 00:15:18.339 182939 INFO nova.scheduler.client.report [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Deleted allocations for instance eba259ea-7c59-4a4a-9f0d-cb645fcc5eae
Jan 22 00:15:18 compute-0 nova_compute[182935]: 2026-01-22 00:15:18.431 182939 DEBUG oslo_concurrency.lockutils [None req-27ffa23f-85cf-483c-b3ca-d4a299bc3fac 00a7d470e36045deabd5584bd3a9c73e f02fc2085f6340ffa895cb894fdf5882 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:19 compute-0 nova_compute[182935]: 2026-01-22 00:15:19.204 182939 DEBUG nova.compute.manager [req-35733afe-eb6a-4170-9080-1764e4e76070 req-50d69d10-63fa-470a-a2ad-65809ca368a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Received event network-vif-plugged-b40275b3-8698-4cfd-8377-b4cd53eda629 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:19 compute-0 nova_compute[182935]: 2026-01-22 00:15:19.205 182939 DEBUG oslo_concurrency.lockutils [req-35733afe-eb6a-4170-9080-1764e4e76070 req-50d69d10-63fa-470a-a2ad-65809ca368a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:19 compute-0 nova_compute[182935]: 2026-01-22 00:15:19.205 182939 DEBUG oslo_concurrency.lockutils [req-35733afe-eb6a-4170-9080-1764e4e76070 req-50d69d10-63fa-470a-a2ad-65809ca368a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:19 compute-0 nova_compute[182935]: 2026-01-22 00:15:19.205 182939 DEBUG oslo_concurrency.lockutils [req-35733afe-eb6a-4170-9080-1764e4e76070 req-50d69d10-63fa-470a-a2ad-65809ca368a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "eba259ea-7c59-4a4a-9f0d-cb645fcc5eae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:19 compute-0 nova_compute[182935]: 2026-01-22 00:15:19.205 182939 DEBUG nova.compute.manager [req-35733afe-eb6a-4170-9080-1764e4e76070 req-50d69d10-63fa-470a-a2ad-65809ca368a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] No waiting events found dispatching network-vif-plugged-b40275b3-8698-4cfd-8377-b4cd53eda629 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:15:19 compute-0 nova_compute[182935]: 2026-01-22 00:15:19.206 182939 WARNING nova.compute.manager [req-35733afe-eb6a-4170-9080-1764e4e76070 req-50d69d10-63fa-470a-a2ad-65809ca368a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Received unexpected event network-vif-plugged-b40275b3-8698-4cfd-8377-b4cd53eda629 for instance with vm_state deleted and task_state None.
Jan 22 00:15:19 compute-0 nova_compute[182935]: 2026-01-22 00:15:19.206 182939 DEBUG nova.compute.manager [req-35733afe-eb6a-4170-9080-1764e4e76070 req-50d69d10-63fa-470a-a2ad-65809ca368a1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Received event network-vif-deleted-b40275b3-8698-4cfd-8377-b4cd53eda629 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:15:20 compute-0 nova_compute[182935]: 2026-01-22 00:15:20.899 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:15:21 compute-0 nova_compute[182935]: 2026-01-22 00:15:21.523 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:21 compute-0 nova_compute[182935]: 2026-01-22 00:15:21.946 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:22 compute-0 nova_compute[182935]: 2026-01-22 00:15:22.445 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:22.506 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:15:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:22.506 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:15:22 compute-0 nova_compute[182935]: 2026-01-22 00:15:22.507 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:22 compute-0 podman[232852]: 2026-01-22 00:15:22.683700861 +0000 UTC m=+0.056416838 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 00:15:22 compute-0 podman[232853]: 2026-01-22 00:15:22.692069516 +0000 UTC m=+0.061415955 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:15:23 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000080.scope: Deactivated successfully.
Jan 22 00:15:23 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000080.scope: Consumed 13.632s CPU time.
Jan 22 00:15:23 compute-0 systemd-machined[154182]: Machine qemu-63-instance-00000080 terminated.
Jan 22 00:15:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:15:23.508 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:15:23 compute-0 nova_compute[182935]: 2026-01-22 00:15:23.919 182939 INFO nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Instance shutdown successfully after 13 seconds.
Jan 22 00:15:23 compute-0 nova_compute[182935]: 2026-01-22 00:15:23.925 182939 INFO nova.virt.libvirt.driver [-] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Instance destroyed successfully.
Jan 22 00:15:23 compute-0 nova_compute[182935]: 2026-01-22 00:15:23.931 182939 INFO nova.virt.libvirt.driver [-] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Instance destroyed successfully.
Jan 22 00:15:23 compute-0 nova_compute[182935]: 2026-01-22 00:15:23.931 182939 INFO nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Deleting instance files /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54_del
Jan 22 00:15:23 compute-0 nova_compute[182935]: 2026-01-22 00:15:23.932 182939 INFO nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Deletion of /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54_del complete
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.171 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.172 182939 INFO nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Creating image(s)
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.173 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.173 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.174 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.188 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.291 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.293 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.294 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.315 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.380 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.382 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.425 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.427 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.428 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.509 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.511 182939 DEBUG nova.virt.disk.api [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Checking if we can resize image /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.511 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.588 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.589 182939 DEBUG nova.virt.disk.api [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Cannot resize image /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.590 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.590 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Ensure instance console log exists: /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.591 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.591 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.591 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.593 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.598 182939 WARNING nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.604 182939 DEBUG nova.virt.libvirt.host [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.604 182939 DEBUG nova.virt.libvirt.host [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.607 182939 DEBUG nova.virt.libvirt.host [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.608 182939 DEBUG nova.virt.libvirt.host [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.610 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.610 182939 DEBUG nova.virt.hardware [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.610 182939 DEBUG nova.virt.hardware [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.611 182939 DEBUG nova.virt.hardware [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.611 182939 DEBUG nova.virt.hardware [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.611 182939 DEBUG nova.virt.hardware [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.611 182939 DEBUG nova.virt.hardware [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.611 182939 DEBUG nova.virt.hardware [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.612 182939 DEBUG nova.virt.hardware [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.612 182939 DEBUG nova.virt.hardware [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.612 182939 DEBUG nova.virt.hardware [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.612 182939 DEBUG nova.virt.hardware [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.612 182939 DEBUG nova.objects.instance [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f75b1ffc-1260-4530-b589-8d5036f53a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.635 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:15:24 compute-0 nova_compute[182935]:   <uuid>f75b1ffc-1260-4530-b589-8d5036f53a54</uuid>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   <name>instance-00000080</name>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerShowV254Test-server-1890884869</nova:name>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:15:24</nova:creationTime>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:15:24 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:15:24 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:15:24 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:15:24 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:15:24 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:15:24 compute-0 nova_compute[182935]:         <nova:user uuid="3ce6bd0cf3584a949e6c70eda3442ebd">tempest-ServerShowV254Test-1444323064-project-member</nova:user>
Jan 22 00:15:24 compute-0 nova_compute[182935]:         <nova:project uuid="e6d261bcfda9475db1fa317bd2df68f4">tempest-ServerShowV254Test-1444323064</nova:project>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="3e1dda74-3c6a-4d29-8792-32134d1c36c5"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <system>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <entry name="serial">f75b1ffc-1260-4530-b589-8d5036f53a54</entry>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <entry name="uuid">f75b1ffc-1260-4530-b589-8d5036f53a54</entry>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     </system>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   <os>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   </os>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   <features>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   </features>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.config"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/console.log" append="off"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <video>
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     </video>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:15:24 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:15:24 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:15:24 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:15:24 compute-0 nova_compute[182935]: </domain>
Jan 22 00:15:24 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.919 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.920 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.920 182939 INFO nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Using config drive
Jan 22 00:15:24 compute-0 nova_compute[182935]: 2026-01-22 00:15:24.935 182939 DEBUG nova.objects.instance [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lazy-loading 'ec2_ids' on Instance uuid f75b1ffc-1260-4530-b589-8d5036f53a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:25 compute-0 nova_compute[182935]: 2026-01-22 00:15:25.137 182939 INFO nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Creating config drive at /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.config
Jan 22 00:15:25 compute-0 nova_compute[182935]: 2026-01-22 00:15:25.141 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt5wsibya execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:25 compute-0 nova_compute[182935]: 2026-01-22 00:15:25.265 182939 DEBUG oslo_concurrency.processutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt5wsibya" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:25 compute-0 systemd-machined[154182]: New machine qemu-65-instance-00000080.
Jan 22 00:15:25 compute-0 systemd[1]: Started Virtual Machine qemu-65-instance-00000080.
Jan 22 00:15:25 compute-0 nova_compute[182935]: 2026-01-22 00:15:25.928 182939 DEBUG nova.compute.manager [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:15:25 compute-0 nova_compute[182935]: 2026-01-22 00:15:25.928 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:15:25 compute-0 nova_compute[182935]: 2026-01-22 00:15:25.930 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for f75b1ffc-1260-4530-b589-8d5036f53a54 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:15:25 compute-0 nova_compute[182935]: 2026-01-22 00:15:25.930 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040925.9295342, f75b1ffc-1260-4530-b589-8d5036f53a54 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:25 compute-0 nova_compute[182935]: 2026-01-22 00:15:25.930 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] VM Resumed (Lifecycle Event)
Jan 22 00:15:25 compute-0 nova_compute[182935]: 2026-01-22 00:15:25.936 182939 INFO nova.virt.libvirt.driver [-] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Instance spawned successfully.
Jan 22 00:15:25 compute-0 nova_compute[182935]: 2026-01-22 00:15:25.937 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.065 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.069 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.069 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.070 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.071 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.071 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.072 182939 DEBUG nova.virt.libvirt.driver [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.079 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.133 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.134 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040925.9295928, f75b1ffc-1260-4530-b589-8d5036f53a54 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.134 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] VM Started (Lifecycle Event)
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.165 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.170 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.176 182939 DEBUG nova.compute.manager [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.208 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.281 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.281 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.282 182939 DEBUG nova.objects.instance [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.387 182939 DEBUG oslo_concurrency.lockutils [None req-cd26db56-285a-486c-8e05-ae61622fa46e 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:26 compute-0 nova_compute[182935]: 2026-01-22 00:15:26.525 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:27 compute-0 nova_compute[182935]: 2026-01-22 00:15:27.446 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:27 compute-0 nova_compute[182935]: 2026-01-22 00:15:27.609 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "f75b1ffc-1260-4530-b589-8d5036f53a54" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:27 compute-0 nova_compute[182935]: 2026-01-22 00:15:27.609 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "f75b1ffc-1260-4530-b589-8d5036f53a54" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:27 compute-0 nova_compute[182935]: 2026-01-22 00:15:27.610 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "f75b1ffc-1260-4530-b589-8d5036f53a54-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:27 compute-0 nova_compute[182935]: 2026-01-22 00:15:27.610 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "f75b1ffc-1260-4530-b589-8d5036f53a54-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:27 compute-0 nova_compute[182935]: 2026-01-22 00:15:27.610 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "f75b1ffc-1260-4530-b589-8d5036f53a54-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:27 compute-0 nova_compute[182935]: 2026-01-22 00:15:27.626 182939 INFO nova.compute.manager [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Terminating instance
Jan 22 00:15:27 compute-0 nova_compute[182935]: 2026-01-22 00:15:27.646 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "refresh_cache-f75b1ffc-1260-4530-b589-8d5036f53a54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:15:27 compute-0 nova_compute[182935]: 2026-01-22 00:15:27.647 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquired lock "refresh_cache-f75b1ffc-1260-4530-b589-8d5036f53a54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:15:27 compute-0 nova_compute[182935]: 2026-01-22 00:15:27.648 182939 DEBUG nova.network.neutron [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:15:27 compute-0 nova_compute[182935]: 2026-01-22 00:15:27.898 182939 DEBUG nova.network.neutron [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:15:29 compute-0 nova_compute[182935]: 2026-01-22 00:15:29.888 182939 DEBUG nova.network.neutron [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.168 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Releasing lock "refresh_cache-f75b1ffc-1260-4530-b589-8d5036f53a54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.168 182939 DEBUG nova.compute.manager [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:15:30 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000080.scope: Deactivated successfully.
Jan 22 00:15:30 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000080.scope: Consumed 4.872s CPU time.
Jan 22 00:15:30 compute-0 systemd-machined[154182]: Machine qemu-65-instance-00000080 terminated.
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.434 182939 INFO nova.virt.libvirt.driver [-] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Instance destroyed successfully.
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.435 182939 DEBUG nova.objects.instance [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lazy-loading 'resources' on Instance uuid f75b1ffc-1260-4530-b589-8d5036f53a54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.454 182939 INFO nova.virt.libvirt.driver [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Deleting instance files /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54_del
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.455 182939 INFO nova.virt.libvirt.driver [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Deletion of /var/lib/nova/instances/f75b1ffc-1260-4530-b589-8d5036f53a54_del complete
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.530 182939 INFO nova.compute.manager [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.531 182939 DEBUG oslo.service.loopingcall [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.532 182939 DEBUG nova.compute.manager [-] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.532 182939 DEBUG nova.network.neutron [-] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.884 182939 DEBUG nova.network.neutron [-] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.916 182939 DEBUG nova.network.neutron [-] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:15:30 compute-0 nova_compute[182935]: 2026-01-22 00:15:30.938 182939 INFO nova.compute.manager [-] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Took 0.41 seconds to deallocate network for instance.
Jan 22 00:15:31 compute-0 nova_compute[182935]: 2026-01-22 00:15:31.019 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:31 compute-0 nova_compute[182935]: 2026-01-22 00:15:31.020 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:31 compute-0 nova_compute[182935]: 2026-01-22 00:15:31.101 182939 DEBUG nova.compute.provider_tree [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:15:31 compute-0 nova_compute[182935]: 2026-01-22 00:15:31.141 182939 DEBUG nova.scheduler.client.report [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:15:31 compute-0 nova_compute[182935]: 2026-01-22 00:15:31.176 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:31 compute-0 nova_compute[182935]: 2026-01-22 00:15:31.211 182939 INFO nova.scheduler.client.report [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Deleted allocations for instance f75b1ffc-1260-4530-b589-8d5036f53a54
Jan 22 00:15:31 compute-0 nova_compute[182935]: 2026-01-22 00:15:31.313 182939 DEBUG oslo_concurrency.lockutils [None req-0f6082ce-6ec5-49bc-9aff-ee7c1ad6425b 3ce6bd0cf3584a949e6c70eda3442ebd e6d261bcfda9475db1fa317bd2df68f4 - - default default] Lock "f75b1ffc-1260-4530-b589-8d5036f53a54" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:31 compute-0 nova_compute[182935]: 2026-01-22 00:15:31.496 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040916.4957054, eba259ea-7c59-4a4a-9f0d-cb645fcc5eae => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:31 compute-0 nova_compute[182935]: 2026-01-22 00:15:31.497 182939 INFO nova.compute.manager [-] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] VM Stopped (Lifecycle Event)
Jan 22 00:15:31 compute-0 nova_compute[182935]: 2026-01-22 00:15:31.527 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:31 compute-0 nova_compute[182935]: 2026-01-22 00:15:31.533 182939 DEBUG nova.compute.manager [None req-a87ce555-cc13-44e8-a8ac-a6d6932e5bdf - - - - - -] [instance: eba259ea-7c59-4a4a-9f0d-cb645fcc5eae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:32 compute-0 nova_compute[182935]: 2026-01-22 00:15:32.448 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:36 compute-0 nova_compute[182935]: 2026-01-22 00:15:36.529 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:36 compute-0 podman[232952]: 2026-01-22 00:15:36.696269469 +0000 UTC m=+0.058088957 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:15:36 compute-0 podman[232951]: 2026-01-22 00:15:36.77936581 +0000 UTC m=+0.142091489 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 00:15:37 compute-0 nova_compute[182935]: 2026-01-22 00:15:37.450 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:38 compute-0 nova_compute[182935]: 2026-01-22 00:15:38.666 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:38 compute-0 nova_compute[182935]: 2026-01-22 00:15:38.667 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:38 compute-0 nova_compute[182935]: 2026-01-22 00:15:38.712 182939 DEBUG nova.compute.manager [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:15:38 compute-0 nova_compute[182935]: 2026-01-22 00:15:38.834 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:38 compute-0 nova_compute[182935]: 2026-01-22 00:15:38.835 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:38 compute-0 nova_compute[182935]: 2026-01-22 00:15:38.841 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:15:38 compute-0 nova_compute[182935]: 2026-01-22 00:15:38.841 182939 INFO nova.compute.claims [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:15:38 compute-0 nova_compute[182935]: 2026-01-22 00:15:38.988 182939 DEBUG nova.compute.provider_tree [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.004 182939 DEBUG nova.scheduler.client.report [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.076 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.077 182939 DEBUG nova.compute.manager [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.161 182939 DEBUG nova.compute.manager [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.180 182939 INFO nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.204 182939 DEBUG nova.compute.manager [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.383 182939 DEBUG nova.compute.manager [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.385 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.386 182939 INFO nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Creating image(s)
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.387 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.388 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.389 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.414 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.515 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.517 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.519 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.546 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.623 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.624 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.808 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk 1073741824" returned: 0 in 0.184s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.809 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.809 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.893 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.894 182939 DEBUG nova.virt.disk.api [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Checking if we can resize image /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.895 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.948 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.950 182939 DEBUG nova.virt.disk.api [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Cannot resize image /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.950 182939 DEBUG nova.objects.instance [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lazy-loading 'migration_context' on Instance uuid 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.970 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.970 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Ensure instance console log exists: /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.971 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.971 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.971 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.973 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.976 182939 WARNING nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.981 182939 DEBUG nova.virt.libvirt.host [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.982 182939 DEBUG nova.virt.libvirt.host [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.985 182939 DEBUG nova.virt.libvirt.host [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.985 182939 DEBUG nova.virt.libvirt.host [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.986 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.987 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.987 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.987 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.987 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.988 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.988 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.988 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.988 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.988 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.989 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.989 182939 DEBUG nova.virt.hardware [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:15:39 compute-0 nova_compute[182935]: 2026-01-22 00:15:39.992 182939 DEBUG nova.objects.instance [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.017 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:15:40 compute-0 nova_compute[182935]:   <uuid>0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6</uuid>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   <name>instance-00000083</name>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerShowV257Test-server-1110820162</nova:name>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:15:39</nova:creationTime>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:15:40 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:15:40 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:15:40 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:15:40 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:15:40 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:15:40 compute-0 nova_compute[182935]:         <nova:user uuid="7b3e1ca620ce40dea19faa9efb3a1476">tempest-ServerShowV257Test-1069192355-project-member</nova:user>
Jan 22 00:15:40 compute-0 nova_compute[182935]:         <nova:project uuid="5f10d4e59914438b995b44069d2d0127">tempest-ServerShowV257Test-1069192355</nova:project>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <system>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <entry name="serial">0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6</entry>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <entry name="uuid">0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6</entry>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     </system>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   <os>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   </os>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   <features>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   </features>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.config"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/console.log" append="off"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <video>
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     </video>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:15:40 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:15:40 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:15:40 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:15:40 compute-0 nova_compute[182935]: </domain>
Jan 22 00:15:40 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.070 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.071 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.071 182939 INFO nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Using config drive
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.437 182939 INFO nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Creating config drive at /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.config
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.442 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiucf_xsw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.574 182939 DEBUG oslo_concurrency.processutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiucf_xsw" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:40 compute-0 systemd-machined[154182]: New machine qemu-66-instance-00000083.
Jan 22 00:15:40 compute-0 systemd[1]: Started Virtual Machine qemu-66-instance-00000083.
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.823 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.823 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.824 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.824 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.920 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.978 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:40 compute-0 nova_compute[182935]: 2026-01-22 00:15:40.979 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.045 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:15:41 compute-0 podman[233045]: 2026-01-22 00:15:41.109322369 +0000 UTC m=+0.052672671 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.116 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040941.1159885, 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.117 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] VM Resumed (Lifecycle Event)
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.119 182939 DEBUG nova.compute.manager [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.120 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.124 182939 INFO nova.virt.libvirt.driver [-] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance spawned successfully.
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.125 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.141 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.145 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.149 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.149 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.149 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.150 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.150 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.151 182939 DEBUG nova.virt.libvirt.driver [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.180 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.180 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040941.1169467, 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.181 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] VM Started (Lifecycle Event)
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.212 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.215 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.222 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.223 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5703MB free_disk=73.12661361694336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.223 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.224 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.249 182939 INFO nova.compute.manager [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Took 1.86 seconds to spawn the instance on the hypervisor.
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.249 182939 DEBUG nova.compute.manager [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.256 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.530 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.551 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.552 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.552 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.573 182939 INFO nova.compute.manager [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Took 2.78 seconds to build instance.
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.591 182939 DEBUG oslo_concurrency.lockutils [None req-9e49b309-5a82-4f63-bb24-dc297afce016 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.924s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.661 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.685 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.711 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:15:41 compute-0 nova_compute[182935]: 2026-01-22 00:15:41.711 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:15:42 compute-0 nova_compute[182935]: 2026-01-22 00:15:42.451 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:42 compute-0 nova_compute[182935]: 2026-01-22 00:15:42.712 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:42 compute-0 nova_compute[182935]: 2026-01-22 00:15:42.713 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:15:42 compute-0 nova_compute[182935]: 2026-01-22 00:15:42.713 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:15:42 compute-0 nova_compute[182935]: 2026-01-22 00:15:42.956 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:15:42 compute-0 nova_compute[182935]: 2026-01-22 00:15:42.956 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:15:42 compute-0 nova_compute[182935]: 2026-01-22 00:15:42.957 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:15:42 compute-0 nova_compute[182935]: 2026-01-22 00:15:42.957 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:43 compute-0 nova_compute[182935]: 2026-01-22 00:15:43.216 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:15:43 compute-0 nova_compute[182935]: 2026-01-22 00:15:43.757 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:15:43 compute-0 nova_compute[182935]: 2026-01-22 00:15:43.777 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:15:43 compute-0 nova_compute[182935]: 2026-01-22 00:15:43.777 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:15:43 compute-0 nova_compute[182935]: 2026-01-22 00:15:43.778 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:44 compute-0 nova_compute[182935]: 2026-01-22 00:15:44.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:44 compute-0 nova_compute[182935]: 2026-01-22 00:15:44.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:15:44 compute-0 nova_compute[182935]: 2026-01-22 00:15:44.964 182939 INFO nova.compute.manager [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Rebuilding instance
Jan 22 00:15:45 compute-0 nova_compute[182935]: 2026-01-22 00:15:45.432 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040930.4316332, f75b1ffc-1260-4530-b589-8d5036f53a54 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:15:45 compute-0 nova_compute[182935]: 2026-01-22 00:15:45.433 182939 INFO nova.compute.manager [-] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] VM Stopped (Lifecycle Event)
Jan 22 00:15:45 compute-0 nova_compute[182935]: 2026-01-22 00:15:45.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:45 compute-0 sshd-session[233073]: Received disconnect from 45.148.10.151 port 51472:11:  [preauth]
Jan 22 00:15:45 compute-0 sshd-session[233073]: Disconnected from authenticating user root 45.148.10.151 port 51472 [preauth]
Jan 22 00:15:46 compute-0 nova_compute[182935]: 2026-01-22 00:15:46.531 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:46 compute-0 podman[233075]: 2026-01-22 00:15:46.712772113 +0000 UTC m=+0.073133978 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Jan 22 00:15:47 compute-0 nova_compute[182935]: 2026-01-22 00:15:47.454 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:47 compute-0 nova_compute[182935]: 2026-01-22 00:15:47.790 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:47 compute-0 nova_compute[182935]: 2026-01-22 00:15:47.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:48 compute-0 nova_compute[182935]: 2026-01-22 00:15:48.638 182939 DEBUG nova.compute.manager [None req-6dcdf612-9bf1-4d97-8768-56c7e615e044 - - - - - -] [instance: f75b1ffc-1260-4530-b589-8d5036f53a54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:48 compute-0 nova_compute[182935]: 2026-01-22 00:15:48.923 182939 DEBUG nova.compute.manager [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:15:49 compute-0 nova_compute[182935]: 2026-01-22 00:15:49.394 182939 DEBUG nova.objects.instance [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:50 compute-0 nova_compute[182935]: 2026-01-22 00:15:50.173 182939 DEBUG nova.objects.instance [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:50 compute-0 nova_compute[182935]: 2026-01-22 00:15:50.194 182939 DEBUG nova.objects.instance [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lazy-loading 'resources' on Instance uuid 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:50 compute-0 nova_compute[182935]: 2026-01-22 00:15:50.208 182939 DEBUG nova.objects.instance [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lazy-loading 'migration_context' on Instance uuid 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:15:50 compute-0 nova_compute[182935]: 2026-01-22 00:15:50.234 182939 DEBUG nova.objects.instance [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:15:50 compute-0 nova_compute[182935]: 2026-01-22 00:15:50.239 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:15:51 compute-0 nova_compute[182935]: 2026-01-22 00:15:51.532 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:52 compute-0 nova_compute[182935]: 2026-01-22 00:15:52.514 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:53 compute-0 podman[233111]: 2026-01-22 00:15:53.698911152 +0000 UTC m=+0.062644424 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:15:53 compute-0 podman[233110]: 2026-01-22 00:15:53.720004144 +0000 UTC m=+0.087715099 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Jan 22 00:15:53 compute-0 nova_compute[182935]: 2026-01-22 00:15:53.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:54 compute-0 nova_compute[182935]: 2026-01-22 00:15:54.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:15:56 compute-0 sshd-session[233153]: Invalid user docker from 188.166.69.60 port 58782
Jan 22 00:15:56 compute-0 nova_compute[182935]: 2026-01-22 00:15:56.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:15:56 compute-0 sshd-session[233153]: Connection closed by invalid user docker 188.166.69.60 port 58782 [preauth]
Jan 22 00:15:57 compute-0 nova_compute[182935]: 2026-01-22 00:15:57.515 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:00 compute-0 nova_compute[182935]: 2026-01-22 00:16:00.284 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:16:01 compute-0 nova_compute[182935]: 2026-01-22 00:16:01.536 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:01 compute-0 nova_compute[182935]: 2026-01-22 00:16:01.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:02 compute-0 nova_compute[182935]: 2026-01-22 00:16:02.554 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:02 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000083.scope: Deactivated successfully.
Jan 22 00:16:02 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000083.scope: Consumed 13.670s CPU time.
Jan 22 00:16:02 compute-0 systemd-machined[154182]: Machine qemu-66-instance-00000083 terminated.
Jan 22 00:16:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:16:03.213 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:16:03.214 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:16:03.214 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:03 compute-0 nova_compute[182935]: 2026-01-22 00:16:03.299 182939 INFO nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance shutdown successfully after 13 seconds.
Jan 22 00:16:03 compute-0 nova_compute[182935]: 2026-01-22 00:16:03.305 182939 INFO nova.virt.libvirt.driver [-] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance destroyed successfully.
Jan 22 00:16:03 compute-0 nova_compute[182935]: 2026-01-22 00:16:03.309 182939 INFO nova.virt.libvirt.driver [-] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance destroyed successfully.
Jan 22 00:16:03 compute-0 nova_compute[182935]: 2026-01-22 00:16:03.310 182939 INFO nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Deleting instance files /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6_del
Jan 22 00:16:03 compute-0 nova_compute[182935]: 2026-01-22 00:16:03.311 182939 INFO nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Deletion of /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6_del complete
Jan 22 00:16:03 compute-0 nova_compute[182935]: 2026-01-22 00:16:03.978 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:16:03.983 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:16:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:16:03.985 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.038 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.039 182939 INFO nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Creating image(s)
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.040 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.040 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.042 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.071 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.137 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.138 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.139 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.156 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.214 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.217 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.266 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.269 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.270 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.333 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.335 182939 DEBUG nova.virt.disk.api [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Checking if we can resize image /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.335 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.400 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.401 182939 DEBUG nova.virt.disk.api [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Cannot resize image /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.401 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.401 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Ensure instance console log exists: /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.402 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.402 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.402 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.403 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.409 182939 WARNING nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.427 182939 DEBUG nova.virt.libvirt.host [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.429 182939 DEBUG nova.virt.libvirt.host [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.441 182939 DEBUG nova.virt.libvirt.host [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.442 182939 DEBUG nova.virt.libvirt.host [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.443 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.444 182939 DEBUG nova.virt.hardware [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.444 182939 DEBUG nova.virt.hardware [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.444 182939 DEBUG nova.virt.hardware [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.444 182939 DEBUG nova.virt.hardware [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.444 182939 DEBUG nova.virt.hardware [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.445 182939 DEBUG nova.virt.hardware [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.445 182939 DEBUG nova.virt.hardware [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.445 182939 DEBUG nova.virt.hardware [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.445 182939 DEBUG nova.virt.hardware [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.445 182939 DEBUG nova.virt.hardware [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.445 182939 DEBUG nova.virt.hardware [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.446 182939 DEBUG nova.objects.instance [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.468 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:16:04 compute-0 nova_compute[182935]:   <uuid>0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6</uuid>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   <name>instance-00000083</name>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <nova:name>tempest-ServerShowV257Test-server-1110820162</nova:name>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:16:04</nova:creationTime>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:16:04 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:16:04 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:16:04 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:16:04 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:16:04 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:16:04 compute-0 nova_compute[182935]:         <nova:user uuid="7b3e1ca620ce40dea19faa9efb3a1476">tempest-ServerShowV257Test-1069192355-project-member</nova:user>
Jan 22 00:16:04 compute-0 nova_compute[182935]:         <nova:project uuid="5f10d4e59914438b995b44069d2d0127">tempest-ServerShowV257Test-1069192355</nova:project>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="3e1dda74-3c6a-4d29-8792-32134d1c36c5"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <nova:ports/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <system>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <entry name="serial">0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6</entry>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <entry name="uuid">0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6</entry>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     </system>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   <os>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   </os>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   <features>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   </features>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.config"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/console.log" append="off"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <video>
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     </video>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:16:04 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:16:04 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:16:04 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:16:04 compute-0 nova_compute[182935]: </domain>
Jan 22 00:16:04 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.560 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.561 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.562 182939 INFO nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Using config drive
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.604 182939 DEBUG nova.objects.instance [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:04 compute-0 nova_compute[182935]: 2026-01-22 00:16:04.765 182939 DEBUG nova.objects.instance [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lazy-loading 'keypairs' on Instance uuid 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:04 compute-0 ovn_controller[95047]: 2026-01-22T00:16:04Z|00498|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.197 182939 INFO nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Creating config drive at /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.config
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.207 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4gwe1paa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.359 182939 DEBUG oslo_concurrency.processutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4gwe1paa" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:16:05 compute-0 systemd-machined[154182]: New machine qemu-67-instance-00000083.
Jan 22 00:16:05 compute-0 systemd[1]: Started Virtual Machine qemu-67-instance-00000083.
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.713 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.713 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040965.7124333, 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.713 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] VM Resumed (Lifecycle Event)
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.716 182939 DEBUG nova.compute.manager [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.716 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.719 182939 INFO nova.virt.libvirt.driver [-] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance spawned successfully.
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.720 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.765 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.768 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.787 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.788 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.788 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.788 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.789 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.789 182939 DEBUG nova.virt.libvirt.driver [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.874 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.874 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769040965.7140357, 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.875 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] VM Started (Lifecycle Event)
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.904 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.908 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.973 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:16:05 compute-0 nova_compute[182935]: 2026-01-22 00:16:05.978 182939 DEBUG nova.compute.manager [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:06 compute-0 nova_compute[182935]: 2026-01-22 00:16:06.200 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:06 compute-0 nova_compute[182935]: 2026-01-22 00:16:06.200 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:06 compute-0 nova_compute[182935]: 2026-01-22 00:16:06.200 182939 DEBUG nova.objects.instance [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:16:06 compute-0 nova_compute[182935]: 2026-01-22 00:16:06.430 182939 DEBUG oslo_concurrency.lockutils [None req-f33b1f11-4320-41b6-b824-a94c0479abf4 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:06 compute-0 nova_compute[182935]: 2026-01-22 00:16:06.537 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:07 compute-0 nova_compute[182935]: 2026-01-22 00:16:07.597 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:07 compute-0 podman[233208]: 2026-01-22 00:16:07.716021797 +0000 UTC m=+0.077572882 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:16:07 compute-0 podman[233207]: 2026-01-22 00:16:07.770837756 +0000 UTC m=+0.137512521 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:16:09 compute-0 nova_compute[182935]: 2026-01-22 00:16:09.799 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:09 compute-0 nova_compute[182935]: 2026-01-22 00:16:09.800 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:09 compute-0 nova_compute[182935]: 2026-01-22 00:16:09.800 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:09 compute-0 nova_compute[182935]: 2026-01-22 00:16:09.801 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:09 compute-0 nova_compute[182935]: 2026-01-22 00:16:09.801 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:09 compute-0 nova_compute[182935]: 2026-01-22 00:16:09.818 182939 INFO nova.compute.manager [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Terminating instance
Jan 22 00:16:09 compute-0 nova_compute[182935]: 2026-01-22 00:16:09.828 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "refresh_cache-0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:16:09 compute-0 nova_compute[182935]: 2026-01-22 00:16:09.829 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquired lock "refresh_cache-0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:16:09 compute-0 nova_compute[182935]: 2026-01-22 00:16:09.829 182939 DEBUG nova.network.neutron [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:16:11 compute-0 nova_compute[182935]: 2026-01-22 00:16:11.538 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:11 compute-0 nova_compute[182935]: 2026-01-22 00:16:11.565 182939 DEBUG nova.network.neutron [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:16:11 compute-0 podman[233255]: 2026-01-22 00:16:11.67650115 +0000 UTC m=+0.051371331 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:16:12 compute-0 nova_compute[182935]: 2026-01-22 00:16:12.599 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:13 compute-0 nova_compute[182935]: 2026-01-22 00:16:13.655 182939 DEBUG nova.network.neutron [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:13 compute-0 nova_compute[182935]: 2026-01-22 00:16:13.687 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Releasing lock "refresh_cache-0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:16:13 compute-0 nova_compute[182935]: 2026-01-22 00:16:13.688 182939 DEBUG nova.compute.manager [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:16:13 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000083.scope: Deactivated successfully.
Jan 22 00:16:13 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000083.scope: Consumed 8.254s CPU time.
Jan 22 00:16:13 compute-0 systemd-machined[154182]: Machine qemu-67-instance-00000083 terminated.
Jan 22 00:16:13 compute-0 nova_compute[182935]: 2026-01-22 00:16:13.929 182939 INFO nova.virt.libvirt.driver [-] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance destroyed successfully.
Jan 22 00:16:13 compute-0 nova_compute[182935]: 2026-01-22 00:16:13.931 182939 DEBUG nova.objects.instance [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lazy-loading 'resources' on Instance uuid 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:16:13 compute-0 nova_compute[182935]: 2026-01-22 00:16:13.966 182939 INFO nova.virt.libvirt.driver [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Deleting instance files /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6_del
Jan 22 00:16:13 compute-0 nova_compute[182935]: 2026-01-22 00:16:13.966 182939 INFO nova.virt.libvirt.driver [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Deletion of /var/lib/nova/instances/0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6_del complete
Jan 22 00:16:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:16:13.988 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:14 compute-0 nova_compute[182935]: 2026-01-22 00:16:14.413 182939 INFO nova.compute.manager [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Took 0.72 seconds to destroy the instance on the hypervisor.
Jan 22 00:16:14 compute-0 nova_compute[182935]: 2026-01-22 00:16:14.414 182939 DEBUG oslo.service.loopingcall [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:16:14 compute-0 nova_compute[182935]: 2026-01-22 00:16:14.414 182939 DEBUG nova.compute.manager [-] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:16:14 compute-0 nova_compute[182935]: 2026-01-22 00:16:14.415 182939 DEBUG nova.network.neutron [-] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:16:14 compute-0 nova_compute[182935]: 2026-01-22 00:16:14.701 182939 DEBUG nova.network.neutron [-] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:16:14 compute-0 nova_compute[182935]: 2026-01-22 00:16:14.734 182939 DEBUG nova.network.neutron [-] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:16:14 compute-0 nova_compute[182935]: 2026-01-22 00:16:14.764 182939 INFO nova.compute.manager [-] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Took 0.35 seconds to deallocate network for instance.
Jan 22 00:16:14 compute-0 nova_compute[182935]: 2026-01-22 00:16:14.886 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:14 compute-0 nova_compute[182935]: 2026-01-22 00:16:14.887 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:14 compute-0 nova_compute[182935]: 2026-01-22 00:16:14.975 182939 DEBUG nova.compute.provider_tree [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:16:14 compute-0 nova_compute[182935]: 2026-01-22 00:16:14.995 182939 DEBUG nova.scheduler.client.report [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:16:15 compute-0 nova_compute[182935]: 2026-01-22 00:16:15.023 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:15 compute-0 nova_compute[182935]: 2026-01-22 00:16:15.061 182939 INFO nova.scheduler.client.report [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Deleted allocations for instance 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6
Jan 22 00:16:15 compute-0 nova_compute[182935]: 2026-01-22 00:16:15.171 182939 DEBUG oslo_concurrency.lockutils [None req-8aea05d7-6fff-4a16-ac19-d9d4079635ad 7b3e1ca620ce40dea19faa9efb3a1476 5f10d4e59914438b995b44069d2d0127 - - default default] Lock "0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.371s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:16 compute-0 nova_compute[182935]: 2026-01-22 00:16:16.540 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:17 compute-0 nova_compute[182935]: 2026-01-22 00:16:17.602 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:17 compute-0 podman[233289]: 2026-01-22 00:16:17.713164798 +0000 UTC m=+0.083449880 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 00:16:21 compute-0 nova_compute[182935]: 2026-01-22 00:16:21.542 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:22 compute-0 nova_compute[182935]: 2026-01-22 00:16:22.647 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:16:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:16:24 compute-0 podman[233307]: 2026-01-22 00:16:24.678265655 +0000 UTC m=+0.053005309 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:16:24 compute-0 podman[233306]: 2026-01-22 00:16:24.678373048 +0000 UTC m=+0.055359264 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Jan 22 00:16:26 compute-0 nova_compute[182935]: 2026-01-22 00:16:26.543 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:27 compute-0 nova_compute[182935]: 2026-01-22 00:16:27.649 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:28 compute-0 nova_compute[182935]: 2026-01-22 00:16:28.929 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769040973.927627, 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:16:28 compute-0 nova_compute[182935]: 2026-01-22 00:16:28.929 182939 INFO nova.compute.manager [-] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] VM Stopped (Lifecycle Event)
Jan 22 00:16:29 compute-0 nova_compute[182935]: 2026-01-22 00:16:29.229 182939 DEBUG nova.compute.manager [None req-dacb020f-9a41-487c-836f-b40c5bf37425 - - - - - -] [instance: 0a2a4d2d-f794-4e68-a6a9-67e564a2d8d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:16:31 compute-0 nova_compute[182935]: 2026-01-22 00:16:31.585 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:32 compute-0 nova_compute[182935]: 2026-01-22 00:16:32.677 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:36 compute-0 nova_compute[182935]: 2026-01-22 00:16:36.586 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:37 compute-0 nova_compute[182935]: 2026-01-22 00:16:37.679 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:38 compute-0 podman[233347]: 2026-01-22 00:16:38.764688566 +0000 UTC m=+0.123448163 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:16:38 compute-0 podman[233346]: 2026-01-22 00:16:38.764673885 +0000 UTC m=+0.137991641 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:16:41 compute-0 sshd-session[233397]: Invalid user docker from 188.166.69.60 port 48448
Jan 22 00:16:41 compute-0 sshd-session[233397]: Connection closed by invalid user docker 188.166.69.60 port 48448 [preauth]
Jan 22 00:16:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:16:41.387 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:16:41 compute-0 nova_compute[182935]: 2026-01-22 00:16:41.388 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:16:41.389 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:16:41 compute-0 nova_compute[182935]: 2026-01-22 00:16:41.588 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:41 compute-0 nova_compute[182935]: 2026-01-22 00:16:41.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:41 compute-0 nova_compute[182935]: 2026-01-22 00:16:41.831 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:41 compute-0 nova_compute[182935]: 2026-01-22 00:16:41.832 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:41 compute-0 nova_compute[182935]: 2026-01-22 00:16:41.832 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:41 compute-0 nova_compute[182935]: 2026-01-22 00:16:41.832 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:16:41 compute-0 podman[233400]: 2026-01-22 00:16:41.951087426 +0000 UTC m=+0.072382712 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:16:42 compute-0 nova_compute[182935]: 2026-01-22 00:16:42.018 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:16:42 compute-0 nova_compute[182935]: 2026-01-22 00:16:42.019 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5743MB free_disk=73.12732315063477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:16:42 compute-0 nova_compute[182935]: 2026-01-22 00:16:42.020 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:16:42 compute-0 nova_compute[182935]: 2026-01-22 00:16:42.020 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:16:42 compute-0 nova_compute[182935]: 2026-01-22 00:16:42.122 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:16:42 compute-0 nova_compute[182935]: 2026-01-22 00:16:42.123 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:16:42 compute-0 nova_compute[182935]: 2026-01-22 00:16:42.165 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:16:42 compute-0 nova_compute[182935]: 2026-01-22 00:16:42.188 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:16:42 compute-0 nova_compute[182935]: 2026-01-22 00:16:42.213 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:16:42 compute-0 nova_compute[182935]: 2026-01-22 00:16:42.213 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:16:42 compute-0 nova_compute[182935]: 2026-01-22 00:16:42.681 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:44 compute-0 nova_compute[182935]: 2026-01-22 00:16:44.213 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:44 compute-0 nova_compute[182935]: 2026-01-22 00:16:44.214 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:16:44 compute-0 nova_compute[182935]: 2026-01-22 00:16:44.214 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:16:44 compute-0 nova_compute[182935]: 2026-01-22 00:16:44.339 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:16:44 compute-0 nova_compute[182935]: 2026-01-22 00:16:44.340 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:45 compute-0 nova_compute[182935]: 2026-01-22 00:16:45.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:45 compute-0 nova_compute[182935]: 2026-01-22 00:16:45.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:16:46 compute-0 nova_compute[182935]: 2026-01-22 00:16:46.590 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:46 compute-0 nova_compute[182935]: 2026-01-22 00:16:46.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:47 compute-0 nova_compute[182935]: 2026-01-22 00:16:47.683 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:48 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:16:48.391 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:16:48 compute-0 podman[233424]: 2026-01-22 00:16:48.70274594 +0000 UTC m=+0.071438139 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:16:48 compute-0 nova_compute[182935]: 2026-01-22 00:16:48.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:49 compute-0 nova_compute[182935]: 2026-01-22 00:16:49.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:51 compute-0 nova_compute[182935]: 2026-01-22 00:16:51.591 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:52 compute-0 nova_compute[182935]: 2026-01-22 00:16:52.734 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:53 compute-0 nova_compute[182935]: 2026-01-22 00:16:53.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:16:55 compute-0 podman[233442]: 2026-01-22 00:16:55.693745922 +0000 UTC m=+0.063515794 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Jan 22 00:16:55 compute-0 podman[233443]: 2026-01-22 00:16:55.700605572 +0000 UTC m=+0.067751223 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute)
Jan 22 00:16:56 compute-0 nova_compute[182935]: 2026-01-22 00:16:56.593 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:16:57 compute-0 nova_compute[182935]: 2026-01-22 00:16:57.736 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:01 compute-0 nova_compute[182935]: 2026-01-22 00:17:01.595 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:01 compute-0 nova_compute[182935]: 2026-01-22 00:17:01.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:02 compute-0 nova_compute[182935]: 2026-01-22 00:17:02.738 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:03.214 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:03.215 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:03.215 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:06 compute-0 nova_compute[182935]: 2026-01-22 00:17:06.597 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:07 compute-0 nova_compute[182935]: 2026-01-22 00:17:07.742 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:09 compute-0 podman[233485]: 2026-01-22 00:17:09.697568194 +0000 UTC m=+0.065100201 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:17:09 compute-0 podman[233484]: 2026-01-22 00:17:09.73933497 +0000 UTC m=+0.109747764 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, 
tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.607 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.794 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.794 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.795 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.795 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.795 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.795 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.838 182939 DEBUG nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.838 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.839 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.839 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.839 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.839 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.839 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.840 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Removable base files: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6 /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.840 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.840 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.840 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.841 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.841 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.841 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.841 182939 DEBUG nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.841 182939 DEBUG nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 22 00:17:11 compute-0 nova_compute[182935]: 2026-01-22 00:17:11.841 182939 DEBUG nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.013 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.013 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.043 182939 DEBUG nova.compute.manager [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.200 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.201 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.207 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.208 182939 INFO nova.compute.claims [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.379 182939 DEBUG nova.compute.provider_tree [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.400 182939 DEBUG nova.scheduler.client.report [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.440 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.440 182939 DEBUG nova.compute.manager [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.511 182939 DEBUG nova.compute.manager [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.512 182939 DEBUG nova.network.neutron [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.546 182939 INFO nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.571 182939 DEBUG nova.compute.manager [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.711 182939 DEBUG nova.compute.manager [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.712 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.712 182939 INFO nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Creating image(s)
Jan 22 00:17:12 compute-0 podman[233534]: 2026-01-22 00:17:12.712033449 +0000 UTC m=+0.076450156 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.713 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.713 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.714 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.727 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.744 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.780 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.781 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.782 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.792 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.879 182939 DEBUG nova.policy [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.885 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.886 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.921 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.923 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.924 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.985 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.987 182939 DEBUG nova.virt.disk.api [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:17:12 compute-0 nova_compute[182935]: 2026-01-22 00:17:12.988 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:13 compute-0 nova_compute[182935]: 2026-01-22 00:17:13.065 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:13 compute-0 nova_compute[182935]: 2026-01-22 00:17:13.068 182939 DEBUG nova.virt.disk.api [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:17:13 compute-0 nova_compute[182935]: 2026-01-22 00:17:13.068 182939 DEBUG nova.objects.instance [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:17:13 compute-0 nova_compute[182935]: 2026-01-22 00:17:13.096 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:17:13 compute-0 nova_compute[182935]: 2026-01-22 00:17:13.096 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Ensure instance console log exists: /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:17:13 compute-0 nova_compute[182935]: 2026-01-22 00:17:13.097 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:13 compute-0 nova_compute[182935]: 2026-01-22 00:17:13.097 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:13 compute-0 nova_compute[182935]: 2026-01-22 00:17:13.098 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:14 compute-0 nova_compute[182935]: 2026-01-22 00:17:14.491 182939 DEBUG nova.network.neutron [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Successfully created port: 7bc267e3-f762-4a18-a3a2-42a7161a231e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:17:16 compute-0 nova_compute[182935]: 2026-01-22 00:17:16.612 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:17 compute-0 nova_compute[182935]: 2026-01-22 00:17:17.283 182939 DEBUG nova.network.neutron [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Successfully updated port: 7bc267e3-f762-4a18-a3a2-42a7161a231e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:17:17 compute-0 nova_compute[182935]: 2026-01-22 00:17:17.317 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:17 compute-0 nova_compute[182935]: 2026-01-22 00:17:17.317 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:17 compute-0 nova_compute[182935]: 2026-01-22 00:17:17.318 182939 DEBUG nova.network.neutron [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:17:17 compute-0 nova_compute[182935]: 2026-01-22 00:17:17.536 182939 DEBUG nova.compute.manager [req-11a2c099-1bea-4691-9aa5-91c516ba7059 req-10d41948-f5fb-4451-af29-1adf24f910f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-changed-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:17 compute-0 nova_compute[182935]: 2026-01-22 00:17:17.536 182939 DEBUG nova.compute.manager [req-11a2c099-1bea-4691-9aa5-91c516ba7059 req-10d41948-f5fb-4451-af29-1adf24f910f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Refreshing instance network info cache due to event network-changed-7bc267e3-f762-4a18-a3a2-42a7161a231e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:17:17 compute-0 nova_compute[182935]: 2026-01-22 00:17:17.536 182939 DEBUG oslo_concurrency.lockutils [req-11a2c099-1bea-4691-9aa5-91c516ba7059 req-10d41948-f5fb-4451-af29-1adf24f910f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:17 compute-0 nova_compute[182935]: 2026-01-22 00:17:17.746 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:17 compute-0 nova_compute[182935]: 2026-01-22 00:17:17.853 182939 DEBUG nova.network.neutron [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.809 182939 DEBUG nova.network.neutron [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.840 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.840 182939 DEBUG nova.compute.manager [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Instance network_info: |[{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.840 182939 DEBUG oslo_concurrency.lockutils [req-11a2c099-1bea-4691-9aa5-91c516ba7059 req-10d41948-f5fb-4451-af29-1adf24f910f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.841 182939 DEBUG nova.network.neutron [req-11a2c099-1bea-4691-9aa5-91c516ba7059 req-10d41948-f5fb-4451-af29-1adf24f910f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Refreshing network info cache for port 7bc267e3-f762-4a18-a3a2-42a7161a231e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.843 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Start _get_guest_xml network_info=[{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.848 182939 WARNING nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.853 182939 DEBUG nova.virt.libvirt.host [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.854 182939 DEBUG nova.virt.libvirt.host [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.856 182939 DEBUG nova.virt.libvirt.host [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.857 182939 DEBUG nova.virt.libvirt.host [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.858 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.858 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.859 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.859 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.859 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.859 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.860 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.860 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.860 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.860 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.860 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.861 182939 DEBUG nova.virt.hardware [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.865 182939 DEBUG nova.virt.libvirt.vif [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1889498750',display_name='tempest-TestNetworkAdvancedServerOps-server-1889498750',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1889498750',id=134,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOTO7d2Dwnkbz9wr9hWsejC9/1+pdYEpWKDQobSKPUmWC0nAs/mdLNrBlKhRnQPpVBXnMQms4q8X3v+9bWXw5gwGNW9NuZlObmqlerpOa7gv/9x3J0wC1Nx+jU/uK6YUg==',key_name='tempest-TestNetworkAdvancedServerOps-1110887450',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-ib8w2x0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:12Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=46feac9e-f412-4027-8cfb-f7280308085e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.865 182939 DEBUG nova.network.os_vif_util [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.866 182939 DEBUG nova.network.os_vif_util [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.867 182939 DEBUG nova.objects.instance [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.885 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:17:18 compute-0 nova_compute[182935]:   <uuid>46feac9e-f412-4027-8cfb-f7280308085e</uuid>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   <name>instance-00000086</name>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1889498750</nova:name>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:17:18</nova:creationTime>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:17:18 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:17:18 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:17:18 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:17:18 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:17:18 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:17:18 compute-0 nova_compute[182935]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:17:18 compute-0 nova_compute[182935]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:17:18 compute-0 nova_compute[182935]:         <nova:port uuid="7bc267e3-f762-4a18-a3a2-42a7161a231e">
Jan 22 00:17:18 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <system>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <entry name="serial">46feac9e-f412-4027-8cfb-f7280308085e</entry>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <entry name="uuid">46feac9e-f412-4027-8cfb-f7280308085e</entry>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     </system>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   <os>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   </os>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   <features>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   </features>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.config"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:38:78:10"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <target dev="tap7bc267e3-f7"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/console.log" append="off"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <video>
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     </video>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:17:18 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:17:18 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:17:18 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:17:18 compute-0 nova_compute[182935]: </domain>
Jan 22 00:17:18 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.886 182939 DEBUG nova.compute.manager [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Preparing to wait for external event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.887 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.887 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.887 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.888 182939 DEBUG nova.virt.libvirt.vif [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1889498750',display_name='tempest-TestNetworkAdvancedServerOps-server-1889498750',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1889498750',id=134,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOTO7d2Dwnkbz9wr9hWsejC9/1+pdYEpWKDQobSKPUmWC0nAs/mdLNrBlKhRnQPpVBXnMQms4q8X3v+9bWXw5gwGNW9NuZlObmqlerpOa7gv/9x3J0wC1Nx+jU/uK6YUg==',key_name='tempest-TestNetworkAdvancedServerOps-1110887450',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-ib8w2x0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:12Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=46feac9e-f412-4027-8cfb-f7280308085e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.888 182939 DEBUG nova.network.os_vif_util [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.889 182939 DEBUG nova.network.os_vif_util [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.890 182939 DEBUG os_vif [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.890 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.891 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.891 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.898 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.898 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7bc267e3-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.899 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7bc267e3-f7, col_values=(('external_ids', {'iface-id': '7bc267e3-f762-4a18-a3a2-42a7161a231e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:78:10', 'vm-uuid': '46feac9e-f412-4027-8cfb-f7280308085e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.900 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:18 compute-0 NetworkManager[55139]: <info>  [1769041038.9015] manager: (tap7bc267e3-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.902 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.909 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.910 182939 INFO os_vif [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7')
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.972 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.973 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.973 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No VIF found with MAC fa:16:3e:38:78:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:17:18 compute-0 nova_compute[182935]: 2026-01-22 00:17:18.973 182939 INFO nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Using config drive
Jan 22 00:17:19 compute-0 nova_compute[182935]: 2026-01-22 00:17:19.472 182939 INFO nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Creating config drive at /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.config
Jan 22 00:17:19 compute-0 nova_compute[182935]: 2026-01-22 00:17:19.476 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyq0ftx63 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:19 compute-0 nova_compute[182935]: 2026-01-22 00:17:19.602 182939 DEBUG oslo_concurrency.processutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyq0ftx63" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:19 compute-0 kernel: tap7bc267e3-f7: entered promiscuous mode
Jan 22 00:17:19 compute-0 NetworkManager[55139]: <info>  [1769041039.6575] manager: (tap7bc267e3-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/236)
Jan 22 00:17:19 compute-0 ovn_controller[95047]: 2026-01-22T00:17:19Z|00499|binding|INFO|Claiming lport 7bc267e3-f762-4a18-a3a2-42a7161a231e for this chassis.
Jan 22 00:17:19 compute-0 ovn_controller[95047]: 2026-01-22T00:17:19Z|00500|binding|INFO|7bc267e3-f762-4a18-a3a2-42a7161a231e: Claiming fa:16:3e:38:78:10 10.100.0.12
Jan 22 00:17:19 compute-0 nova_compute[182935]: 2026-01-22 00:17:19.659 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:19 compute-0 nova_compute[182935]: 2026-01-22 00:17:19.666 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.678 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:78:10 10.100.0.12'], port_security=['fa:16:3e:38:78:10 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-184c07f2-f316-4056-b962-173c9a73cccb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9e59c3b5-e637-42fe-b28f-811656431607', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac0dd3c8-754f-43f7-8c8a-c2e10a6719dc, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=7bc267e3-f762-4a18-a3a2-42a7161a231e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.680 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 7bc267e3-f762-4a18-a3a2-42a7161a231e in datapath 184c07f2-f316-4056-b962-173c9a73cccb bound to our chassis
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.682 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 184c07f2-f316-4056-b962-173c9a73cccb
Jan 22 00:17:19 compute-0 systemd-udevd[233608]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.702 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fd65dbe1-8df9-452a-a7f6-2907b117713f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.704 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap184c07f2-f1 in ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.708 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap184c07f2-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.708 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[676fb0a9-42c7-4180-8ff1-b37122393a3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.709 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0511990d-de2a-447e-95f7-da5d25eb89bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 systemd-machined[154182]: New machine qemu-68-instance-00000086.
Jan 22 00:17:19 compute-0 NetworkManager[55139]: <info>  [1769041039.7154] device (tap7bc267e3-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:17:19 compute-0 NetworkManager[55139]: <info>  [1769041039.7162] device (tap7bc267e3-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:17:19 compute-0 nova_compute[182935]: 2026-01-22 00:17:19.721 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:19 compute-0 ovn_controller[95047]: 2026-01-22T00:17:19Z|00501|binding|INFO|Setting lport 7bc267e3-f762-4a18-a3a2-42a7161a231e ovn-installed in OVS
Jan 22 00:17:19 compute-0 ovn_controller[95047]: 2026-01-22T00:17:19Z|00502|binding|INFO|Setting lport 7bc267e3-f762-4a18-a3a2-42a7161a231e up in Southbound
Jan 22 00:17:19 compute-0 nova_compute[182935]: 2026-01-22 00:17:19.724 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.724 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[be8b651e-8f1c-418c-bf4f-1a39133451f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 podman[233578]: 2026-01-22 00:17:19.727599465 +0000 UTC m=+0.093740950 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 00:17:19 compute-0 systemd[1]: Started Virtual Machine qemu-68-instance-00000086.
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.749 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[71fe8f22-6370-469f-8ac0-d2ad25658df2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.779 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8c8a7d52-2934-4b42-8801-9312a38c6462]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.786 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[edd54b7e-dd87-40ff-bd6a-8f7d4bb3c740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 systemd-udevd[233612]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:17:19 compute-0 NetworkManager[55139]: <info>  [1769041039.7877] manager: (tap184c07f2-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/237)
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.819 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0447e7-1022-422b-b4fb-57691d1f6a48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.825 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[66c06b63-db4c-436b-95f0-e2aa68a2c3b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 NetworkManager[55139]: <info>  [1769041039.8512] device (tap184c07f2-f0): carrier: link connected
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.858 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[dc62d9ad-9449-47e5-bf9f-c4f860995d65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.877 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6700119c-66e2-4cf6-85f8-bdef2b8ee966]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap184c07f2-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:28:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554258, 'reachable_time': 30491, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233641, 'error': None, 'target': 'ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.898 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4675a02a-a69b-49a9-8805-94eb19e2bb1a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe30:2880'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554258, 'tstamp': 554258}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233642, 'error': None, 'target': 'ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.917 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[530f3535-55f7-4464-98a1-78455f83068b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap184c07f2-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:30:28:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554258, 'reachable_time': 30491, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233643, 'error': None, 'target': 'ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:19.962 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3b6d50-ce49-4de2-ae40-c320fd2f8a7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:20.048 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[79d99734-e545-4bcd-93f1-34ac03277e89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:20.051 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap184c07f2-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:20.052 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:20.053 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap184c07f2-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:20 compute-0 NetworkManager[55139]: <info>  [1769041040.0569] manager: (tap184c07f2-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/238)
Jan 22 00:17:20 compute-0 kernel: tap184c07f2-f0: entered promiscuous mode
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.058 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:20.061 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap184c07f2-f0, col_values=(('external_ids', {'iface-id': 'fb0deda5-be9d-4b30-99e6-73fb36bd8567'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:20 compute-0 ovn_controller[95047]: 2026-01-22T00:17:20Z|00503|binding|INFO|Releasing lport fb0deda5-be9d-4b30-99e6-73fb36bd8567 from this chassis (sb_readonly=0)
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.063 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:20.066 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/184c07f2-f316-4056-b962-173c9a73cccb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/184c07f2-f316-4056-b962-173c9a73cccb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:20.068 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9769d2-e93f-4ddc-93e8-c66b7651ad01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:20.069 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-184c07f2-f316-4056-b962-173c9a73cccb
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/184c07f2-f316-4056-b962-173c9a73cccb.pid.haproxy
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 184c07f2-f316-4056-b962-173c9a73cccb
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:17:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:20.070 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb', 'env', 'PROCESS_TAG=haproxy-184c07f2-f316-4056-b962-173c9a73cccb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/184c07f2-f316-4056-b962-173c9a73cccb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.076 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:20 compute-0 podman[233681]: 2026-01-22 00:17:20.465924164 +0000 UTC m=+0.048536525 container create a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.484 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041040.4833586, 46feac9e-f412-4027-8cfb-f7280308085e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.485 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] VM Started (Lifecycle Event)
Jan 22 00:17:20 compute-0 systemd[1]: Started libpod-conmon-a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5.scope.
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.515 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.521 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041040.484917, 46feac9e-f412-4027-8cfb-f7280308085e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.521 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] VM Paused (Lifecycle Event)
Jan 22 00:17:20 compute-0 podman[233681]: 2026-01-22 00:17:20.440514311 +0000 UTC m=+0.023126692 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:17:20 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:17:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b917df69e2d6b5d3abc00cf7dca05f5bdf21de929e29618298684fa59a300f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.566 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:20 compute-0 podman[233681]: 2026-01-22 00:17:20.567981877 +0000 UTC m=+0.150594248 container init a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.571 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:17:20 compute-0 podman[233681]: 2026-01-22 00:17:20.574379876 +0000 UTC m=+0.156992237 container start a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.595 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:17:20 compute-0 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233697]: [NOTICE]   (233701) : New worker (233703) forked
Jan 22 00:17:20 compute-0 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233697]: [NOTICE]   (233701) : Loading success.
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.973 182939 DEBUG nova.network.neutron [req-11a2c099-1bea-4691-9aa5-91c516ba7059 req-10d41948-f5fb-4451-af29-1adf24f910f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updated VIF entry in instance network info cache for port 7bc267e3-f762-4a18-a3a2-42a7161a231e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:17:20 compute-0 nova_compute[182935]: 2026-01-22 00:17:20.973 182939 DEBUG nova.network.neutron [req-11a2c099-1bea-4691-9aa5-91c516ba7059 req-10d41948-f5fb-4451-af29-1adf24f910f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.010 182939 DEBUG oslo_concurrency.lockutils [req-11a2c099-1bea-4691-9aa5-91c516ba7059 req-10d41948-f5fb-4451-af29-1adf24f910f7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.820 182939 DEBUG nova.compute.manager [req-767c74ed-06b4-4db4-86e3-3d2e9510d650 req-2287e36b-2262-477b-b81b-039cc4dac1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.820 182939 DEBUG oslo_concurrency.lockutils [req-767c74ed-06b4-4db4-86e3-3d2e9510d650 req-2287e36b-2262-477b-b81b-039cc4dac1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.821 182939 DEBUG oslo_concurrency.lockutils [req-767c74ed-06b4-4db4-86e3-3d2e9510d650 req-2287e36b-2262-477b-b81b-039cc4dac1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.821 182939 DEBUG oslo_concurrency.lockutils [req-767c74ed-06b4-4db4-86e3-3d2e9510d650 req-2287e36b-2262-477b-b81b-039cc4dac1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.821 182939 DEBUG nova.compute.manager [req-767c74ed-06b4-4db4-86e3-3d2e9510d650 req-2287e36b-2262-477b-b81b-039cc4dac1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Processing event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.821 182939 DEBUG nova.compute.manager [req-767c74ed-06b4-4db4-86e3-3d2e9510d650 req-2287e36b-2262-477b-b81b-039cc4dac1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.822 182939 DEBUG oslo_concurrency.lockutils [req-767c74ed-06b4-4db4-86e3-3d2e9510d650 req-2287e36b-2262-477b-b81b-039cc4dac1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.822 182939 DEBUG oslo_concurrency.lockutils [req-767c74ed-06b4-4db4-86e3-3d2e9510d650 req-2287e36b-2262-477b-b81b-039cc4dac1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.822 182939 DEBUG oslo_concurrency.lockutils [req-767c74ed-06b4-4db4-86e3-3d2e9510d650 req-2287e36b-2262-477b-b81b-039cc4dac1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.822 182939 DEBUG nova.compute.manager [req-767c74ed-06b4-4db4-86e3-3d2e9510d650 req-2287e36b-2262-477b-b81b-039cc4dac1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] No waiting events found dispatching network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.822 182939 WARNING nova.compute.manager [req-767c74ed-06b4-4db4-86e3-3d2e9510d650 req-2287e36b-2262-477b-b81b-039cc4dac1d8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received unexpected event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e for instance with vm_state building and task_state spawning.
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.823 182939 DEBUG nova.compute.manager [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.828 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041041.82757, 46feac9e-f412-4027-8cfb-f7280308085e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.829 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] VM Resumed (Lifecycle Event)
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.834 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.840 182939 INFO nova.virt.libvirt.driver [-] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Instance spawned successfully.
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.840 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.869 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.875 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.881 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.881 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.882 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.882 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.883 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.884 182939 DEBUG nova.virt.libvirt.driver [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.921 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.989 182939 INFO nova.compute.manager [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Took 9.28 seconds to spawn the instance on the hypervisor.
Jan 22 00:17:21 compute-0 nova_compute[182935]: 2026-01-22 00:17:21.989 182939 DEBUG nova.compute.manager [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:22 compute-0 nova_compute[182935]: 2026-01-22 00:17:22.090 182939 INFO nova.compute.manager [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Took 9.95 seconds to build instance.
Jan 22 00:17:22 compute-0 nova_compute[182935]: 2026-01-22 00:17:22.149 182939 DEBUG oslo_concurrency.lockutils [None req-7a0b40df-1a40-400d-877c-b51e5c44c2c0 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:22 compute-0 nova_compute[182935]: 2026-01-22 00:17:22.748 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:23 compute-0 nova_compute[182935]: 2026-01-22 00:17:23.901 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:25.628 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:17:25 compute-0 nova_compute[182935]: 2026-01-22 00:17:25.629 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:25.631 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:17:25 compute-0 sshd-session[233712]: Invalid user redis from 188.166.69.60 port 38446
Jan 22 00:17:26 compute-0 sshd-session[233712]: Connection closed by invalid user redis 188.166.69.60 port 38446 [preauth]
Jan 22 00:17:26 compute-0 podman[233715]: 2026-01-22 00:17:26.065677511 +0000 UTC m=+0.087040923 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:17:26 compute-0 podman[233714]: 2026-01-22 00:17:26.068695033 +0000 UTC m=+0.089501131 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-type=git)
Jan 22 00:17:27 compute-0 nova_compute[182935]: 2026-01-22 00:17:27.140 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:27 compute-0 NetworkManager[55139]: <info>  [1769041047.1443] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Jan 22 00:17:27 compute-0 NetworkManager[55139]: <info>  [1769041047.1452] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Jan 22 00:17:27 compute-0 nova_compute[182935]: 2026-01-22 00:17:27.233 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:27 compute-0 ovn_controller[95047]: 2026-01-22T00:17:27Z|00504|binding|INFO|Releasing lport fb0deda5-be9d-4b30-99e6-73fb36bd8567 from this chassis (sb_readonly=0)
Jan 22 00:17:27 compute-0 nova_compute[182935]: 2026-01-22 00:17:27.247 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:27 compute-0 nova_compute[182935]: 2026-01-22 00:17:27.712 182939 DEBUG nova.compute.manager [req-6947306f-214a-4105-8859-3ef5970003bc req-d2ef23b4-705e-407b-853d-342a11ed57ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-changed-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:27 compute-0 nova_compute[182935]: 2026-01-22 00:17:27.712 182939 DEBUG nova.compute.manager [req-6947306f-214a-4105-8859-3ef5970003bc req-d2ef23b4-705e-407b-853d-342a11ed57ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Refreshing instance network info cache due to event network-changed-7bc267e3-f762-4a18-a3a2-42a7161a231e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:17:27 compute-0 nova_compute[182935]: 2026-01-22 00:17:27.713 182939 DEBUG oslo_concurrency.lockutils [req-6947306f-214a-4105-8859-3ef5970003bc req-d2ef23b4-705e-407b-853d-342a11ed57ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:27 compute-0 nova_compute[182935]: 2026-01-22 00:17:27.713 182939 DEBUG oslo_concurrency.lockutils [req-6947306f-214a-4105-8859-3ef5970003bc req-d2ef23b4-705e-407b-853d-342a11ed57ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:27 compute-0 nova_compute[182935]: 2026-01-22 00:17:27.713 182939 DEBUG nova.network.neutron [req-6947306f-214a-4105-8859-3ef5970003bc req-d2ef23b4-705e-407b-853d-342a11ed57ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Refreshing network info cache for port 7bc267e3-f762-4a18-a3a2-42a7161a231e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:17:27 compute-0 nova_compute[182935]: 2026-01-22 00:17:27.750 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:28 compute-0 nova_compute[182935]: 2026-01-22 00:17:28.904 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:29 compute-0 nova_compute[182935]: 2026-01-22 00:17:29.925 182939 DEBUG nova.network.neutron [req-6947306f-214a-4105-8859-3ef5970003bc req-d2ef23b4-705e-407b-853d-342a11ed57ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updated VIF entry in instance network info cache for port 7bc267e3-f762-4a18-a3a2-42a7161a231e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:17:29 compute-0 nova_compute[182935]: 2026-01-22 00:17:29.926 182939 DEBUG nova.network.neutron [req-6947306f-214a-4105-8859-3ef5970003bc req-d2ef23b4-705e-407b-853d-342a11ed57ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:29 compute-0 nova_compute[182935]: 2026-01-22 00:17:29.960 182939 DEBUG oslo_concurrency.lockutils [req-6947306f-214a-4105-8859-3ef5970003bc req-d2ef23b4-705e-407b-853d-342a11ed57ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:30.635 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:32 compute-0 nova_compute[182935]: 2026-01-22 00:17:32.503 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:32 compute-0 nova_compute[182935]: 2026-01-22 00:17:32.752 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:33 compute-0 nova_compute[182935]: 2026-01-22 00:17:33.943 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:33 compute-0 ovn_controller[95047]: 2026-01-22T00:17:33Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:78:10 10.100.0.12
Jan 22 00:17:33 compute-0 ovn_controller[95047]: 2026-01-22T00:17:33Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:78:10 10.100.0.12
Jan 22 00:17:36 compute-0 nova_compute[182935]: 2026-01-22 00:17:36.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:37 compute-0 nova_compute[182935]: 2026-01-22 00:17:37.754 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:38 compute-0 nova_compute[182935]: 2026-01-22 00:17:38.775 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:38 compute-0 nova_compute[182935]: 2026-01-22 00:17:38.946 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:39 compute-0 nova_compute[182935]: 2026-01-22 00:17:39.628 182939 INFO nova.compute.manager [None req-b503a603-34d8-4abe-99a3-558ef90585fc 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Get console output
Jan 22 00:17:39 compute-0 nova_compute[182935]: 2026-01-22 00:17:39.634 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:17:40 compute-0 podman[233769]: 2026-01-22 00:17:40.687600578 +0000 UTC m=+0.055058637 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:17:40 compute-0 podman[233768]: 2026-01-22 00:17:40.713678086 +0000 UTC m=+0.085967768 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:17:42 compute-0 nova_compute[182935]: 2026-01-22 00:17:42.756 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:43 compute-0 podman[233817]: 2026-01-22 00:17:43.67373669 +0000 UTC m=+0.049685741 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:17:43 compute-0 nova_compute[182935]: 2026-01-22 00:17:43.809 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:43 compute-0 nova_compute[182935]: 2026-01-22 00:17:43.809 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:17:43 compute-0 nova_compute[182935]: 2026-01-22 00:17:43.809 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:17:43 compute-0 nova_compute[182935]: 2026-01-22 00:17:43.948 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:44 compute-0 nova_compute[182935]: 2026-01-22 00:17:44.075 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:44 compute-0 nova_compute[182935]: 2026-01-22 00:17:44.075 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:44 compute-0 nova_compute[182935]: 2026-01-22 00:17:44.075 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:17:44 compute-0 nova_compute[182935]: 2026-01-22 00:17:44.076 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:17:44 compute-0 nova_compute[182935]: 2026-01-22 00:17:44.471 182939 INFO nova.compute.manager [None req-a48c8ea6-5c85-41d3-8eba-80fa0364e5a9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Get console output
Jan 22 00:17:44 compute-0 nova_compute[182935]: 2026-01-22 00:17:44.475 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:17:45 compute-0 nova_compute[182935]: 2026-01-22 00:17:45.981 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:45 compute-0 nova_compute[182935]: 2026-01-22 00:17:45.997 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:45 compute-0 nova_compute[182935]: 2026-01-22 00:17:45.997 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:17:45 compute-0 nova_compute[182935]: 2026-01-22 00:17:45.997 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:45 compute-0 nova_compute[182935]: 2026-01-22 00:17:45.998 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:45 compute-0 nova_compute[182935]: 2026-01-22 00:17:45.998 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:17:45 compute-0 nova_compute[182935]: 2026-01-22 00:17:45.998 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.024 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.024 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.025 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.025 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.126 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.204 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.205 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.260 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.413 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.415 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5543MB free_disk=73.09864807128906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.415 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.415 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.488 182939 INFO nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating resource usage from migration bd86f60c-99de-4bae-8548-969bcc2d8d50
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.538 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 46feac9e-f412-4027-8cfb-f7280308085e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.539 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.539 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.608 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.627 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.661 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:17:46 compute-0 nova_compute[182935]: 2026-01-22 00:17:46.662 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:47 compute-0 nova_compute[182935]: 2026-01-22 00:17:47.757 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:48 compute-0 nova_compute[182935]: 2026-01-22 00:17:48.752 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "de316698-4994-41d0-af54-de924c3b99c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:48 compute-0 nova_compute[182935]: 2026-01-22 00:17:48.752 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:48 compute-0 nova_compute[182935]: 2026-01-22 00:17:48.787 182939 DEBUG nova.compute.manager [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:17:48 compute-0 nova_compute[182935]: 2026-01-22 00:17:48.881 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:48 compute-0 nova_compute[182935]: 2026-01-22 00:17:48.882 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:48 compute-0 nova_compute[182935]: 2026-01-22 00:17:48.889 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:17:48 compute-0 nova_compute[182935]: 2026-01-22 00:17:48.889 182939 INFO nova.compute.claims [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:17:48 compute-0 nova_compute[182935]: 2026-01-22 00:17:48.956 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.025 182939 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.026 182939 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquired lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.026 182939 DEBUG nova.network.neutron [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.146 182939 DEBUG nova.compute.provider_tree [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.163 182939 DEBUG nova.scheduler.client.report [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.200 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.201 182939 DEBUG nova.compute.manager [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.348 182939 DEBUG nova.compute.manager [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.348 182939 DEBUG nova.network.neutron [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.383 182939 INFO nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.399 182939 DEBUG nova.compute.manager [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.459 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.565 182939 DEBUG nova.compute.manager [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.566 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.567 182939 INFO nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Creating image(s)
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.567 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "/var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.568 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.568 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.580 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.600 182939 DEBUG nova.policy [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.637 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.638 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.639 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.649 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.705 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.706 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.773 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk 1073741824" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.775 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.775 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.834 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.835 182939 DEBUG nova.virt.disk.api [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Checking if we can resize image /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.836 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.893 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.894 182939 DEBUG nova.virt.disk.api [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Cannot resize image /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.894 182939 DEBUG nova.objects.instance [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'migration_context' on Instance uuid de316698-4994-41d0-af54-de924c3b99c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.912 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.912 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Ensure instance console log exists: /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.913 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.913 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:49 compute-0 nova_compute[182935]: 2026-01-22 00:17:49.913 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:50 compute-0 podman[233863]: 2026-01-22 00:17:50.678395211 +0000 UTC m=+0.048604256 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:17:50 compute-0 nova_compute[182935]: 2026-01-22 00:17:50.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:50 compute-0 nova_compute[182935]: 2026-01-22 00:17:50.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:51 compute-0 nova_compute[182935]: 2026-01-22 00:17:51.061 182939 DEBUG nova.network.neutron [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Successfully created port: c6086fce-4544-4d96-8b80-4e4854599fda _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:17:51 compute-0 nova_compute[182935]: 2026-01-22 00:17:51.578 182939 DEBUG nova.network.neutron [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:51 compute-0 nova_compute[182935]: 2026-01-22 00:17:51.668 182939 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Releasing lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:51 compute-0 nova_compute[182935]: 2026-01-22 00:17:51.905 182939 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 00:17:51 compute-0 nova_compute[182935]: 2026-01-22 00:17:51.906 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Creating file /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/0cc8e4fb45d4465487fe4b0a05f124e9.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 22 00:17:51 compute-0 nova_compute[182935]: 2026-01-22 00:17:51.906 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/0cc8e4fb45d4465487fe4b0a05f124e9.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.475 182939 DEBUG nova.network.neutron [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Successfully updated port: c6086fce-4544-4d96-8b80-4e4854599fda _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.498 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/0cc8e4fb45d4465487fe4b0a05f124e9.tmp" returned: 1 in 0.592s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.499 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/0cc8e4fb45d4465487fe4b0a05f124e9.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.499 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Creating directory /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.500 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.550 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "refresh_cache-de316698-4994-41d0-af54-de924c3b99c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.551 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquired lock "refresh_cache-de316698-4994-41d0-af54-de924c3b99c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.551 182939 DEBUG nova.network.neutron [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.608 182939 DEBUG nova.compute.manager [req-913c85ad-99cb-4d98-a164-6d384073804c req-5bd5f94d-0fb1-4afb-b808-b72e64cb6da8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Received event network-changed-c6086fce-4544-4d96-8b80-4e4854599fda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.608 182939 DEBUG nova.compute.manager [req-913c85ad-99cb-4d98-a164-6d384073804c req-5bd5f94d-0fb1-4afb-b808-b72e64cb6da8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Refreshing instance network info cache due to event network-changed-c6086fce-4544-4d96-8b80-4e4854599fda. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.609 182939 DEBUG oslo_concurrency.lockutils [req-913c85ad-99cb-4d98-a164-6d384073804c req-5bd5f94d-0fb1-4afb-b808-b72e64cb6da8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-de316698-4994-41d0-af54-de924c3b99c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.710 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.716 182939 DEBUG nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.731 182939 DEBUG nova.network.neutron [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:17:52 compute-0 nova_compute[182935]: 2026-01-22 00:17:52.759 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:53 compute-0 nova_compute[182935]: 2026-01-22 00:17:53.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:53 compute-0 nova_compute[182935]: 2026-01-22 00:17:53.959 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.073 182939 DEBUG nova.network.neutron [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Updating instance_info_cache with network_info: [{"id": "c6086fce-4544-4d96-8b80-4e4854599fda", "address": "fa:16:3e:e9:73:60", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6086fce-45", "ovs_interfaceid": "c6086fce-4544-4d96-8b80-4e4854599fda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.098 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Releasing lock "refresh_cache-de316698-4994-41d0-af54-de924c3b99c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.098 182939 DEBUG nova.compute.manager [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Instance network_info: |[{"id": "c6086fce-4544-4d96-8b80-4e4854599fda", "address": "fa:16:3e:e9:73:60", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6086fce-45", "ovs_interfaceid": "c6086fce-4544-4d96-8b80-4e4854599fda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.099 182939 DEBUG oslo_concurrency.lockutils [req-913c85ad-99cb-4d98-a164-6d384073804c req-5bd5f94d-0fb1-4afb-b808-b72e64cb6da8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-de316698-4994-41d0-af54-de924c3b99c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.099 182939 DEBUG nova.network.neutron [req-913c85ad-99cb-4d98-a164-6d384073804c req-5bd5f94d-0fb1-4afb-b808-b72e64cb6da8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Refreshing network info cache for port c6086fce-4544-4d96-8b80-4e4854599fda _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.101 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Start _get_guest_xml network_info=[{"id": "c6086fce-4544-4d96-8b80-4e4854599fda", "address": "fa:16:3e:e9:73:60", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6086fce-45", "ovs_interfaceid": "c6086fce-4544-4d96-8b80-4e4854599fda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.105 182939 WARNING nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.109 182939 DEBUG nova.virt.libvirt.host [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.109 182939 DEBUG nova.virt.libvirt.host [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.113 182939 DEBUG nova.virt.libvirt.host [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.113 182939 DEBUG nova.virt.libvirt.host [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.114 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.114 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.115 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.115 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.115 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.115 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.115 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.116 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.116 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.116 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.116 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.117 182939 DEBUG nova.virt.hardware [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.120 182939 DEBUG nova.virt.libvirt.vif [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-783971544',display_name='tempest-ServersTestJSON-server-783971544',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-783971544',id=139,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-1vpgdm6o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:49Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=de316698-4994-41d0-af54-de924c3b99c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6086fce-4544-4d96-8b80-4e4854599fda", "address": "fa:16:3e:e9:73:60", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6086fce-45", "ovs_interfaceid": "c6086fce-4544-4d96-8b80-4e4854599fda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.120 182939 DEBUG nova.network.os_vif_util [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "c6086fce-4544-4d96-8b80-4e4854599fda", "address": "fa:16:3e:e9:73:60", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6086fce-45", "ovs_interfaceid": "c6086fce-4544-4d96-8b80-4e4854599fda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.121 182939 DEBUG nova.network.os_vif_util [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:73:60,bridge_name='br-int',has_traffic_filtering=True,id=c6086fce-4544-4d96-8b80-4e4854599fda,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6086fce-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.122 182939 DEBUG nova.objects.instance [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'pci_devices' on Instance uuid de316698-4994-41d0-af54-de924c3b99c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.141 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:17:54 compute-0 nova_compute[182935]:   <uuid>de316698-4994-41d0-af54-de924c3b99c2</uuid>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   <name>instance-0000008b</name>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersTestJSON-server-783971544</nova:name>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:17:54</nova:creationTime>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:17:54 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:17:54 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:17:54 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:17:54 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:17:54 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:17:54 compute-0 nova_compute[182935]:         <nova:user uuid="5eb4e81f0cef4003ae49faa67b3f17c3">tempest-ServersTestJSON-374007797-project-member</nova:user>
Jan 22 00:17:54 compute-0 nova_compute[182935]:         <nova:project uuid="3e408650207b498c8d115fd0c4f776dc">tempest-ServersTestJSON-374007797</nova:project>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:17:54 compute-0 nova_compute[182935]:         <nova:port uuid="c6086fce-4544-4d96-8b80-4e4854599fda">
Jan 22 00:17:54 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <system>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <entry name="serial">de316698-4994-41d0-af54-de924c3b99c2</entry>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <entry name="uuid">de316698-4994-41d0-af54-de924c3b99c2</entry>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     </system>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   <os>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   </os>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   <features>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   </features>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk.config"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:e9:73:60"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <target dev="tapc6086fce-45"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/console.log" append="off"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <video>
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     </video>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:17:54 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:17:54 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:17:54 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:17:54 compute-0 nova_compute[182935]: </domain>
Jan 22 00:17:54 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.143 182939 DEBUG nova.compute.manager [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Preparing to wait for external event network-vif-plugged-c6086fce-4544-4d96-8b80-4e4854599fda prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.144 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "de316698-4994-41d0-af54-de924c3b99c2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.144 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.144 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.145 182939 DEBUG nova.virt.libvirt.vif [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:17:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-783971544',display_name='tempest-ServersTestJSON-server-783971544',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-783971544',id=139,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-1vpgdm6o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:17:49Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=de316698-4994-41d0-af54-de924c3b99c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6086fce-4544-4d96-8b80-4e4854599fda", "address": "fa:16:3e:e9:73:60", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6086fce-45", "ovs_interfaceid": "c6086fce-4544-4d96-8b80-4e4854599fda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.146 182939 DEBUG nova.network.os_vif_util [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "c6086fce-4544-4d96-8b80-4e4854599fda", "address": "fa:16:3e:e9:73:60", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6086fce-45", "ovs_interfaceid": "c6086fce-4544-4d96-8b80-4e4854599fda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.147 182939 DEBUG nova.network.os_vif_util [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:73:60,bridge_name='br-int',has_traffic_filtering=True,id=c6086fce-4544-4d96-8b80-4e4854599fda,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6086fce-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.147 182939 DEBUG os_vif [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:73:60,bridge_name='br-int',has_traffic_filtering=True,id=c6086fce-4544-4d96-8b80-4e4854599fda,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6086fce-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.148 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.148 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.149 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.152 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.153 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6086fce-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.153 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc6086fce-45, col_values=(('external_ids', {'iface-id': 'c6086fce-4544-4d96-8b80-4e4854599fda', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:73:60', 'vm-uuid': 'de316698-4994-41d0-af54-de924c3b99c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.154 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:54 compute-0 NetworkManager[55139]: <info>  [1769041074.1554] manager: (tapc6086fce-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/241)
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.158 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.161 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.162 182939 INFO os_vif [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:73:60,bridge_name='br-int',has_traffic_filtering=True,id=c6086fce-4544-4d96-8b80-4e4854599fda,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6086fce-45')
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.216 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.217 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.217 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No VIF found with MAC fa:16:3e:e9:73:60, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.218 182939 INFO nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Using config drive
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.793 182939 INFO nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Creating config drive at /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk.config
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.798 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0mu5wy0c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.926 182939 DEBUG oslo_concurrency.processutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0mu5wy0c" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:54 compute-0 kernel: tap7bc267e3-f7 (unregistering): left promiscuous mode
Jan 22 00:17:54 compute-0 NetworkManager[55139]: <info>  [1769041074.9721] device (tap7bc267e3-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.974 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.986 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:54 compute-0 ovn_controller[95047]: 2026-01-22T00:17:54Z|00505|binding|INFO|Releasing lport 7bc267e3-f762-4a18-a3a2-42a7161a231e from this chassis (sb_readonly=0)
Jan 22 00:17:54 compute-0 ovn_controller[95047]: 2026-01-22T00:17:54Z|00506|binding|INFO|Setting lport 7bc267e3-f762-4a18-a3a2-42a7161a231e down in Southbound
Jan 22 00:17:54 compute-0 ovn_controller[95047]: 2026-01-22T00:17:54Z|00507|binding|INFO|Removing iface tap7bc267e3-f7 ovn-installed in OVS
Jan 22 00:17:54 compute-0 nova_compute[182935]: 2026-01-22 00:17:54.989 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:54 compute-0 NetworkManager[55139]: <info>  [1769041074.9932] manager: (tapc6086fce-45): new Tun device (/org/freedesktop/NetworkManager/Devices/242)
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.002 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.000 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:78:10 10.100.0.12'], port_security=['fa:16:3e:38:78:10 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '46feac9e-f412-4027-8cfb-f7280308085e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-184c07f2-f316-4056-b962-173c9a73cccb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9e59c3b5-e637-42fe-b28f-811656431607', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.203'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac0dd3c8-754f-43f7-8c8a-c2e10a6719dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=7bc267e3-f762-4a18-a3a2-42a7161a231e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.001 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 7bc267e3-f762-4a18-a3a2-42a7161a231e in datapath 184c07f2-f316-4056-b962-173c9a73cccb unbound from our chassis
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.002 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 184c07f2-f316-4056-b962-173c9a73cccb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:17:55 compute-0 kernel: tapc6086fce-45: entered promiscuous mode
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.004 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[920fad65-94e5-4953-8d5b-e630cdcfdd78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.004 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb namespace which is not needed anymore
Jan 22 00:17:55 compute-0 systemd-udevd[233906]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.008 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 ovn_controller[95047]: 2026-01-22T00:17:55Z|00508|binding|INFO|Claiming lport c6086fce-4544-4d96-8b80-4e4854599fda for this chassis.
Jan 22 00:17:55 compute-0 ovn_controller[95047]: 2026-01-22T00:17:55Z|00509|binding|INFO|c6086fce-4544-4d96-8b80-4e4854599fda: Claiming fa:16:3e:e9:73:60 10.100.0.12
Jan 22 00:17:55 compute-0 NetworkManager[55139]: <info>  [1769041075.0202] device (tapc6086fce-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:17:55 compute-0 NetworkManager[55139]: <info>  [1769041075.0207] device (tapc6086fce-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:17:55 compute-0 ovn_controller[95047]: 2026-01-22T00:17:55Z|00510|binding|INFO|Setting lport c6086fce-4544-4d96-8b80-4e4854599fda ovn-installed in OVS
Jan 22 00:17:55 compute-0 ovn_controller[95047]: 2026-01-22T00:17:55Z|00511|binding|INFO|Setting lport c6086fce-4544-4d96-8b80-4e4854599fda up in Southbound
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.022 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:73:60 10.100.0.12'], port_security=['fa:16:3e:e9:73:60 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'de316698-4994-41d0-af54-de924c3b99c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c6086fce-4544-4d96-8b80-4e4854599fda) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:17:55 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000086.scope: Deactivated successfully.
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.024 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000086.scope: Consumed 14.035s CPU time.
Jan 22 00:17:55 compute-0 systemd-machined[154182]: Machine qemu-68-instance-00000086 terminated.
Jan 22 00:17:55 compute-0 systemd-machined[154182]: New machine qemu-69-instance-0000008b.
Jan 22 00:17:55 compute-0 systemd[1]: Started Virtual Machine qemu-69-instance-0000008b.
Jan 22 00:17:55 compute-0 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233697]: [NOTICE]   (233701) : haproxy version is 2.8.14-c23fe91
Jan 22 00:17:55 compute-0 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233697]: [NOTICE]   (233701) : path to executable is /usr/sbin/haproxy
Jan 22 00:17:55 compute-0 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233697]: [WARNING]  (233701) : Exiting Master process...
Jan 22 00:17:55 compute-0 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233697]: [ALERT]    (233701) : Current worker (233703) exited with code 143 (Terminated)
Jan 22 00:17:55 compute-0 neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb[233697]: [WARNING]  (233701) : All workers exited. Exiting... (0)
Jan 22 00:17:55 compute-0 systemd[1]: libpod-a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5.scope: Deactivated successfully.
Jan 22 00:17:55 compute-0 podman[233937]: 2026-01-22 00:17:55.152501107 +0000 UTC m=+0.048350460 container died a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:17:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5-userdata-shm.mount: Deactivated successfully.
Jan 22 00:17:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-9b917df69e2d6b5d3abc00cf7dca05f5bdf21de929e29618298684fa59a300f5-merged.mount: Deactivated successfully.
Jan 22 00:17:55 compute-0 podman[233937]: 2026-01-22 00:17:55.187676788 +0000 UTC m=+0.083526141 container cleanup a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 00:17:55 compute-0 systemd[1]: libpod-conmon-a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5.scope: Deactivated successfully.
Jan 22 00:17:55 compute-0 podman[233974]: 2026-01-22 00:17:55.265022814 +0000 UTC m=+0.051143684 container remove a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.270 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6347e29e-87fe-4aba-8727-ceee9c5bfacf]: (4, ('Thu Jan 22 12:17:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb (a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5)\na5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5\nThu Jan 22 12:17:55 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb (a5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5)\na5b5026b4afd09b25519c4ef9dc26004b91b100c709a4210590309dff6f797e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.273 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b15849d0-6a4d-4355-ab2a-ce23be217f1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.274 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap184c07f2-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.277 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 kernel: tap184c07f2-f0: left promiscuous mode
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.292 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.295 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[caf725b9-dc2c-419d-83f3-466ab7c835f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.304 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041075.3032606, de316698-4994-41d0-af54-de924c3b99c2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.304 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] VM Started (Lifecycle Event)
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.313 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c45d54d4-b414-4f74-8393-4335321d8cf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.314 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ac5347-01af-4498-934f-6a0fb7a00dbf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.331 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b5f709aa-0e34-4a48-a4c7-0d8d9d04b0fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554251, 'reachable_time': 20747, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234012, 'error': None, 'target': 'ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 systemd[1]: run-netns-ovnmeta\x2d184c07f2\x2df316\x2d4056\x2db962\x2d173c9a73cccb.mount: Deactivated successfully.
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.335 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-184c07f2-f316-4056-b962-173c9a73cccb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.335 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1dff1f-f0ea-4822-8375-082bdd7c7661]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.336 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c6086fce-4544-4d96-8b80-4e4854599fda in datapath aabf11c6-ef94-408a-8148-6c6400566606 unbound from our chassis
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.337 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aabf11c6-ef94-408a-8148-6c6400566606
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.348 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2677eae3-4f05-4a90-bf07-032cc0895d47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.349 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaabf11c6-e1 in ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.351 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaabf11c6-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.351 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[164bbf4e-d383-4207-bb77-8c80e4fb4551]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.352 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9082fdf3-fed0-4007-a9e8-8e8c2634b490]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.361 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.365 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041075.3033729, de316698-4994-41d0-af54-de924c3b99c2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.365 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] VM Paused (Lifecycle Event)
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.365 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[34ece3c1-4f9e-4d25-8b60-b74bf12dfc28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.379 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dfb06310-a430-434a-b1ff-32dcb51dfcfa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.390 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.394 182939 DEBUG nova.compute.manager [req-ee7826a9-ea23-45c0-b753-489c20bd1267 req-8702eaab-23fc-49c7-9bf2-47693bcce176 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Received event network-vif-plugged-c6086fce-4544-4d96-8b80-4e4854599fda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.395 182939 DEBUG oslo_concurrency.lockutils [req-ee7826a9-ea23-45c0-b753-489c20bd1267 req-8702eaab-23fc-49c7-9bf2-47693bcce176 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "de316698-4994-41d0-af54-de924c3b99c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.395 182939 DEBUG oslo_concurrency.lockutils [req-ee7826a9-ea23-45c0-b753-489c20bd1267 req-8702eaab-23fc-49c7-9bf2-47693bcce176 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.395 182939 DEBUG oslo_concurrency.lockutils [req-ee7826a9-ea23-45c0-b753-489c20bd1267 req-8702eaab-23fc-49c7-9bf2-47693bcce176 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.396 182939 DEBUG nova.compute.manager [req-ee7826a9-ea23-45c0-b753-489c20bd1267 req-8702eaab-23fc-49c7-9bf2-47693bcce176 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Processing event network-vif-plugged-c6086fce-4544-4d96-8b80-4e4854599fda _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.396 182939 DEBUG nova.compute.manager [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.400 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.402 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.406 182939 INFO nova.virt.libvirt.driver [-] [instance: de316698-4994-41d0-af54-de924c3b99c2] Instance spawned successfully.
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.407 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.412 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c95ab50c-a05a-44c7-bf15-c6d1f33f309e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.418 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[727335b2-33c4-4855-8c1b-23e1aff4d105]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 systemd-udevd[233904]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:17:55 compute-0 NetworkManager[55139]: <info>  [1769041075.4195] manager: (tapaabf11c6-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/243)
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.432 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.432 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041075.399703, de316698-4994-41d0-af54-de924c3b99c2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.432 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] VM Resumed (Lifecycle Event)
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.441 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.441 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.442 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.442 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.443 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.443 182939 DEBUG nova.virt.libvirt.driver [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.450 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8be69586-0f0d-4be7-9595-a52f6324a4ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.456 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9c45c83f-40d4-4f14-8c5e-6932d480ddf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.478 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:55 compute-0 NetworkManager[55139]: <info>  [1769041075.4797] device (tapaabf11c6-e0): carrier: link connected
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.481 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.483 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[2f194557-79cb-4467-ab3e-b9abe3676893]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.501 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5d33f60d-c8d2-4560-ab54-bfb3c9694aa0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 163], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557821, 'reachable_time': 17033, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234039, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.512 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.515 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[93ed767f-d9be-4f85-ad8d-9fbb609de299]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:1b62'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 557821, 'tstamp': 557821}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234040, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.532 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c43925-8f4f-4a72-9696-18ae38150848]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 163], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557821, 'reachable_time': 17033, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234041, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.553 182939 INFO nova.compute.manager [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Took 5.99 seconds to spawn the instance on the hypervisor.
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.553 182939 DEBUG nova.compute.manager [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.563 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[918b1e61-77c2-41be-8000-0ccba46fd99b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.632 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa2510c-8e5c-499c-882e-203d7b4015b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.634 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.634 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.635 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaabf11c6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.636 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 NetworkManager[55139]: <info>  [1769041075.6373] manager: (tapaabf11c6-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Jan 22 00:17:55 compute-0 kernel: tapaabf11c6-e0: entered promiscuous mode
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.640 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.641 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaabf11c6-e0, col_values=(('external_ids', {'iface-id': '1ae0dbff-a7cd-4db8-afc3-1d102fdd130f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.642 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 ovn_controller[95047]: 2026-01-22T00:17:55Z|00512|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.658 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.661 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.662 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.663 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9e0c2b36-bda1-427e-b3d4-34f65d10e77c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.664 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-aabf11c6-ef94-408a-8148-6c6400566606
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID aabf11c6-ef94-408a-8148-6c6400566606
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:17:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:55.666 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'env', 'PROCESS_TAG=haproxy-aabf11c6-ef94-408a-8148-6c6400566606', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aabf11c6-ef94-408a-8148-6c6400566606.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.675 182939 INFO nova.compute.manager [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Took 6.82 seconds to build instance.
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.722 182939 DEBUG oslo_concurrency.lockutils [None req-f0ae9b60-7d15-4f04-b34a-e919f9d1eee9 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.970s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.732 182939 INFO nova.virt.libvirt.driver [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Instance shutdown successfully after 3 seconds.
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.738 182939 INFO nova.virt.libvirt.driver [-] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Instance destroyed successfully.
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.739 182939 DEBUG nova.virt.libvirt.vif [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:17:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1889498750',display_name='tempest-TestNetworkAdvancedServerOps-server-1889498750',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1889498750',id=134,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOTO7d2Dwnkbz9wr9hWsejC9/1+pdYEpWKDQobSKPUmWC0nAs/mdLNrBlKhRnQPpVBXnMQms4q8X3v+9bWXw5gwGNW9NuZlObmqlerpOa7gv/9x3J0wC1Nx+jU/uK6YUg==',key_name='tempest-TestNetworkAdvancedServerOps-1110887450',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:17:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-ib8w2x0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:17:48Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=46feac9e-f412-4027-8cfb-f7280308085e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1947088510", "vif_mac": "fa:16:3e:38:78:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.739 182939 DEBUG nova.network.os_vif_util [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converting VIF {"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1947088510", "vif_mac": "fa:16:3e:38:78:10"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.740 182939 DEBUG nova.network.os_vif_util [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.740 182939 DEBUG os_vif [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.742 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.743 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7bc267e3-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.745 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.746 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.748 182939 INFO os_vif [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7')
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.751 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.809 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.810 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.830 182939 DEBUG nova.network.neutron [req-913c85ad-99cb-4d98-a164-6d384073804c req-5bd5f94d-0fb1-4afb-b808-b72e64cb6da8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Updated VIF entry in instance network info cache for port c6086fce-4544-4d96-8b80-4e4854599fda. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.831 182939 DEBUG nova.network.neutron [req-913c85ad-99cb-4d98-a164-6d384073804c req-5bd5f94d-0fb1-4afb-b808-b72e64cb6da8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Updating instance_info_cache with network_info: [{"id": "c6086fce-4544-4d96-8b80-4e4854599fda", "address": "fa:16:3e:e9:73:60", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6086fce-45", "ovs_interfaceid": "c6086fce-4544-4d96-8b80-4e4854599fda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.848 182939 DEBUG oslo_concurrency.lockutils [req-913c85ad-99cb-4d98-a164-6d384073804c req-5bd5f94d-0fb1-4afb-b808-b72e64cb6da8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-de316698-4994-41d0-af54-de924c3b99c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.867 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.869 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Copying file /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e_resize/disk to 192.168.122.102:/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:17:55 compute-0 nova_compute[182935]: 2026-01-22 00:17:55.869 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e_resize/disk 192.168.122.102:/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:56 compute-0 podman[234084]: 2026-01-22 00:17:56.028525101 +0000 UTC m=+0.050455109 container create cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:17:56 compute-0 systemd[1]: Started libpod-conmon-cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b.scope.
Jan 22 00:17:56 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:17:56 compute-0 podman[234084]: 2026-01-22 00:17:56.001604233 +0000 UTC m=+0.023534281 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:17:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8bf52c71c3341b314a02f0a2068fb1a25395e41d5c0f5290b8e9fc0810dc7365/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:17:56 compute-0 podman[234084]: 2026-01-22 00:17:56.118134583 +0000 UTC m=+0.140064621 container init cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:17:56 compute-0 podman[234084]: 2026-01-22 00:17:56.124235666 +0000 UTC m=+0.146165684 container start cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:17:56 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234099]: [NOTICE]   (234116) : New worker (234123) forked
Jan 22 00:17:56 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234099]: [NOTICE]   (234116) : Loading success.
Jan 22 00:17:56 compute-0 podman[234103]: 2026-01-22 00:17:56.214437212 +0000 UTC m=+0.096323100 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:17:56 compute-0 podman[234102]: 2026-01-22 00:17:56.216260554 +0000 UTC m=+0.103802634 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 00:17:56 compute-0 nova_compute[182935]: 2026-01-22 00:17:56.477 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "scp -r /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e_resize/disk 192.168.122.102:/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk" returned: 0 in 0.607s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:56 compute-0 nova_compute[182935]: 2026-01-22 00:17:56.478 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Copying file /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:17:56 compute-0 nova_compute[182935]: 2026-01-22 00:17:56.478 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e_resize/disk.config 192.168.122.102:/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:56 compute-0 nova_compute[182935]: 2026-01-22 00:17:56.738 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "scp -C -r /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e_resize/disk.config 192.168.122.102:/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.config" returned: 0 in 0.259s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:56 compute-0 nova_compute[182935]: 2026-01-22 00:17:56.739 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Copying file /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:17:56 compute-0 nova_compute[182935]: 2026-01-22 00:17:56.740 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e_resize/disk.info 192.168.122.102:/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:17:56 compute-0 nova_compute[182935]: 2026-01-22 00:17:56.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:56 compute-0 nova_compute[182935]: 2026-01-22 00:17:56.989 182939 DEBUG oslo_concurrency.processutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "scp -C -r /var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e_resize/disk.info 192.168.122.102:/var/lib/nova/instances/46feac9e-f412-4027-8cfb-f7280308085e/disk.info" returned: 0 in 0.249s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.113 182939 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-unplugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.113 182939 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.114 182939 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.114 182939 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.114 182939 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] No waiting events found dispatching network-vif-unplugged-7bc267e3-f762-4a18-a3a2-42a7161a231e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.115 182939 WARNING nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received unexpected event network-vif-unplugged-7bc267e3-f762-4a18-a3a2-42a7161a231e for instance with vm_state active and task_state resize_migrating.
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.115 182939 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.115 182939 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.115 182939 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.115 182939 DEBUG oslo_concurrency.lockutils [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.116 182939 DEBUG nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] No waiting events found dispatching network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.116 182939 WARNING nova.compute.manager [req-0be5c44c-6e07-4152-8d76-76e3a0d4a8d1 req-486ba317-ee9c-42c6-b3f7-68f7d0381531 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received unexpected event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e for instance with vm_state active and task_state resize_migrating.
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.123 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.123 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.209 182939 DEBUG neutronclient.v2_0.client [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 7bc267e3-f762-4a18-a3a2-42a7161a231e for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.298 182939 DEBUG oslo_concurrency.lockutils [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "de316698-4994-41d0-af54-de924c3b99c2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.299 182939 DEBUG oslo_concurrency.lockutils [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.300 182939 DEBUG oslo_concurrency.lockutils [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "de316698-4994-41d0-af54-de924c3b99c2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.300 182939 DEBUG oslo_concurrency.lockutils [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.300 182939 DEBUG oslo_concurrency.lockutils [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.313 182939 INFO nova.compute.manager [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Terminating instance
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.326 182939 DEBUG nova.compute.manager [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:17:57 compute-0 kernel: tapc6086fce-45 (unregistering): left promiscuous mode
Jan 22 00:17:57 compute-0 NetworkManager[55139]: <info>  [1769041077.3479] device (tapc6086fce-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:17:57 compute-0 ovn_controller[95047]: 2026-01-22T00:17:57Z|00513|binding|INFO|Releasing lport c6086fce-4544-4d96-8b80-4e4854599fda from this chassis (sb_readonly=0)
Jan 22 00:17:57 compute-0 ovn_controller[95047]: 2026-01-22T00:17:57Z|00514|binding|INFO|Setting lport c6086fce-4544-4d96-8b80-4e4854599fda down in Southbound
Jan 22 00:17:57 compute-0 ovn_controller[95047]: 2026-01-22T00:17:57Z|00515|binding|INFO|Removing iface tapc6086fce-45 ovn-installed in OVS
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.363 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:73:60 10.100.0.12'], port_security=['fa:16:3e:e9:73:60 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'de316698-4994-41d0-af54-de924c3b99c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=c6086fce-4544-4d96-8b80-4e4854599fda) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.365 104408 INFO neutron.agent.ovn.metadata.agent [-] Port c6086fce-4544-4d96-8b80-4e4854599fda in datapath aabf11c6-ef94-408a-8148-6c6400566606 unbound from our chassis
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.367 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aabf11c6-ef94-408a-8148-6c6400566606, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.368 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf97b43-2e15-4dab-ae30-ec3ffb486d13]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.368 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace which is not needed anymore
Jan 22 00:17:57 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Jan 22 00:17:57 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000008b.scope: Consumed 2.173s CPU time.
Jan 22 00:17:57 compute-0 systemd-machined[154182]: Machine qemu-69-instance-0000008b terminated.
Jan 22 00:17:57 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234099]: [NOTICE]   (234116) : haproxy version is 2.8.14-c23fe91
Jan 22 00:17:57 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234099]: [NOTICE]   (234116) : path to executable is /usr/sbin/haproxy
Jan 22 00:17:57 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234099]: [WARNING]  (234116) : Exiting Master process...
Jan 22 00:17:57 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234099]: [WARNING]  (234116) : Exiting Master process...
Jan 22 00:17:57 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234099]: [ALERT]    (234116) : Current worker (234123) exited with code 143 (Terminated)
Jan 22 00:17:57 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[234099]: [WARNING]  (234116) : All workers exited. Exiting... (0)
Jan 22 00:17:57 compute-0 systemd[1]: libpod-cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b.scope: Deactivated successfully.
Jan 22 00:17:57 compute-0 podman[234179]: 2026-01-22 00:17:57.509659164 +0000 UTC m=+0.045521944 container died cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:17:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b-userdata-shm.mount: Deactivated successfully.
Jan 22 00:17:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-8bf52c71c3341b314a02f0a2068fb1a25395e41d5c0f5290b8e9fc0810dc7365-merged.mount: Deactivated successfully.
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.540 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.546 182939 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:57 compute-0 podman[234179]: 2026-01-22 00:17:57.546936034 +0000 UTC m=+0.082798794 container cleanup cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.547 182939 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.547 182939 DEBUG oslo_concurrency.lockutils [None req-6a54018f-e373-40db-bb2b-666c3eb4f8c4 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:57 compute-0 NetworkManager[55139]: <info>  [1769041077.5487] manager: (tapc6086fce-45): new Tun device (/org/freedesktop/NetworkManager/Devices/245)
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.552 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.556 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:57 compute-0 systemd[1]: libpod-conmon-cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b.scope: Deactivated successfully.
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.586 182939 DEBUG nova.compute.manager [req-d5f0794d-2013-41b7-af45-2e51fb6d086e req-284e78f7-4428-4ffe-b26f-6e42885c4e61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Received event network-vif-plugged-c6086fce-4544-4d96-8b80-4e4854599fda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.586 182939 DEBUG oslo_concurrency.lockutils [req-d5f0794d-2013-41b7-af45-2e51fb6d086e req-284e78f7-4428-4ffe-b26f-6e42885c4e61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "de316698-4994-41d0-af54-de924c3b99c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.587 182939 DEBUG oslo_concurrency.lockutils [req-d5f0794d-2013-41b7-af45-2e51fb6d086e req-284e78f7-4428-4ffe-b26f-6e42885c4e61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.587 182939 DEBUG oslo_concurrency.lockutils [req-d5f0794d-2013-41b7-af45-2e51fb6d086e req-284e78f7-4428-4ffe-b26f-6e42885c4e61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.587 182939 DEBUG nova.compute.manager [req-d5f0794d-2013-41b7-af45-2e51fb6d086e req-284e78f7-4428-4ffe-b26f-6e42885c4e61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] No waiting events found dispatching network-vif-plugged-c6086fce-4544-4d96-8b80-4e4854599fda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.587 182939 WARNING nova.compute.manager [req-d5f0794d-2013-41b7-af45-2e51fb6d086e req-284e78f7-4428-4ffe-b26f-6e42885c4e61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Received unexpected event network-vif-plugged-c6086fce-4544-4d96-8b80-4e4854599fda for instance with vm_state active and task_state deleting.
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.597 182939 INFO nova.virt.libvirt.driver [-] [instance: de316698-4994-41d0-af54-de924c3b99c2] Instance destroyed successfully.
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.598 182939 DEBUG nova.objects.instance [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'resources' on Instance uuid de316698-4994-41d0-af54-de924c3b99c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.613 182939 DEBUG nova.virt.libvirt.vif [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:17:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-783971544',display_name='tempest-ServersTestJSON-server-783971544',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-783971544',id=139,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:17:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-1vpgdm6o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',im
age_min_ram='0',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:17:55Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=de316698-4994-41d0-af54-de924c3b99c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6086fce-4544-4d96-8b80-4e4854599fda", "address": "fa:16:3e:e9:73:60", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6086fce-45", "ovs_interfaceid": "c6086fce-4544-4d96-8b80-4e4854599fda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.614 182939 DEBUG nova.network.os_vif_util [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "c6086fce-4544-4d96-8b80-4e4854599fda", "address": "fa:16:3e:e9:73:60", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6086fce-45", "ovs_interfaceid": "c6086fce-4544-4d96-8b80-4e4854599fda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.615 182939 DEBUG nova.network.os_vif_util [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:73:60,bridge_name='br-int',has_traffic_filtering=True,id=c6086fce-4544-4d96-8b80-4e4854599fda,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6086fce-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.615 182939 DEBUG os_vif [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:73:60,bridge_name='br-int',has_traffic_filtering=True,id=c6086fce-4544-4d96-8b80-4e4854599fda,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6086fce-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.617 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.617 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6086fce-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:57 compute-0 podman[234214]: 2026-01-22 00:17:57.618854793 +0000 UTC m=+0.049298922 container remove cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.621 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.624 182939 INFO os_vif [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:73:60,bridge_name='br-int',has_traffic_filtering=True,id=c6086fce-4544-4d96-8b80-4e4854599fda,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6086fce-45')
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.625 182939 INFO nova.virt.libvirt.driver [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Deleting instance files /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2_del
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.626 182939 INFO nova.virt.libvirt.driver [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Deletion of /var/lib/nova/instances/de316698-4994-41d0-af54-de924c3b99c2_del complete
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.626 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[033537d6-5b14-4c61-9b52-25c58caa7f65]: (4, ('Thu Jan 22 12:17:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b)\ncb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b\nThu Jan 22 12:17:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (cb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b)\ncb30c39f7b6049f45ab92c0e2a1e4f2b975974853252c9e5b8b1eb7a616bb97b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.627 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3329eb40-357d-4a9a-bad4-d74b3beb3efc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.628 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.631 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:57 compute-0 kernel: tapaabf11c6-e0: left promiscuous mode
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.643 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.646 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[20ef5818-9b6b-4dc1-9b3e-fc9e87b05d5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.660 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[06ce3436-61d7-4338-84df-607fd1199a6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.662 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e00253ee-6482-4daf-a852-5929382e2d23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.679 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2eca54ba-a1b5-4356-aa14-b4ceab6ad7ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557814, 'reachable_time': 32230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234240, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.682 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:17:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:17:57.682 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[437d34ea-fd8f-4633-968a-660e116ac138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:17:57 compute-0 systemd[1]: run-netns-ovnmeta\x2daabf11c6\x2def94\x2d408a\x2d8148\x2d6c6400566606.mount: Deactivated successfully.
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.737 182939 INFO nova.compute.manager [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Took 0.41 seconds to destroy the instance on the hypervisor.
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.738 182939 DEBUG oslo.service.loopingcall [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.738 182939 DEBUG nova.compute.manager [-] [instance: de316698-4994-41d0-af54-de924c3b99c2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.738 182939 DEBUG nova.network.neutron [-] [instance: de316698-4994-41d0-af54-de924c3b99c2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:17:57 compute-0 nova_compute[182935]: 2026-01-22 00:17:57.761 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.093 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.150 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Triggering sync for uuid de316698-4994-41d0-af54-de924c3b99c2 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.151 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "de316698-4994-41d0-af54-de924c3b99c2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.564 182939 DEBUG nova.network.neutron [-] [instance: de316698-4994-41d0-af54-de924c3b99c2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.591 182939 INFO nova.compute.manager [-] [instance: de316698-4994-41d0-af54-de924c3b99c2] Took 0.85 seconds to deallocate network for instance.
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.667 182939 DEBUG nova.compute.manager [req-b56510b0-353b-4950-a066-0636a4772477 req-f1e1d814-4f16-431a-8c78-9e12df4a086b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Received event network-vif-deleted-c6086fce-4544-4d96-8b80-4e4854599fda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.729 182939 DEBUG oslo_concurrency.lockutils [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.729 182939 DEBUG oslo_concurrency.lockutils [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.808 182939 DEBUG nova.compute.provider_tree [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.821 182939 DEBUG nova.scheduler.client.report [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.840 182939 DEBUG oslo_concurrency.lockutils [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.888 182939 DEBUG nova.compute.manager [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-changed-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.888 182939 DEBUG nova.compute.manager [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Refreshing instance network info cache due to event network-changed-7bc267e3-f762-4a18-a3a2-42a7161a231e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.889 182939 DEBUG oslo_concurrency.lockutils [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.889 182939 DEBUG oslo_concurrency.lockutils [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.889 182939 DEBUG nova.network.neutron [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Refreshing network info cache for port 7bc267e3-f762-4a18-a3a2-42a7161a231e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.891 182939 INFO nova.scheduler.client.report [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Deleted allocations for instance de316698-4994-41d0-af54-de924c3b99c2
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.993 182939 DEBUG oslo_concurrency.lockutils [None req-2412c850-68c2-40a9-ac3c-4d01820dc53b 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.994 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "de316698-4994-41d0-af54-de924c3b99c2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.843s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.994 182939 INFO nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] During sync_power_state the instance has a pending task (deleting). Skip.
Jan 22 00:17:58 compute-0 nova_compute[182935]: 2026-01-22 00:17:58.995 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "de316698-4994-41d0-af54-de924c3b99c2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.689 182939 DEBUG nova.compute.manager [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Received event network-vif-unplugged-c6086fce-4544-4d96-8b80-4e4854599fda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.689 182939 DEBUG oslo_concurrency.lockutils [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "de316698-4994-41d0-af54-de924c3b99c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.690 182939 DEBUG oslo_concurrency.lockutils [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.690 182939 DEBUG oslo_concurrency.lockutils [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.690 182939 DEBUG nova.compute.manager [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] No waiting events found dispatching network-vif-unplugged-c6086fce-4544-4d96-8b80-4e4854599fda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.691 182939 WARNING nova.compute.manager [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Received unexpected event network-vif-unplugged-c6086fce-4544-4d96-8b80-4e4854599fda for instance with vm_state deleted and task_state None.
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.691 182939 DEBUG nova.compute.manager [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Received event network-vif-plugged-c6086fce-4544-4d96-8b80-4e4854599fda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.691 182939 DEBUG oslo_concurrency.lockutils [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "de316698-4994-41d0-af54-de924c3b99c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.691 182939 DEBUG oslo_concurrency.lockutils [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.692 182939 DEBUG oslo_concurrency.lockutils [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "de316698-4994-41d0-af54-de924c3b99c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.692 182939 DEBUG nova.compute.manager [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] No waiting events found dispatching network-vif-plugged-c6086fce-4544-4d96-8b80-4e4854599fda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:17:59 compute-0 nova_compute[182935]: 2026-01-22 00:17:59.692 182939 WARNING nova.compute.manager [req-27950756-8692-407a-8f7f-7b08bab3884d req-9bd6d5a0-0aa4-4a8f-a2e8-07d291b3d569 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: de316698-4994-41d0-af54-de924c3b99c2] Received unexpected event network-vif-plugged-c6086fce-4544-4d96-8b80-4e4854599fda for instance with vm_state deleted and task_state None.
Jan 22 00:18:00 compute-0 nova_compute[182935]: 2026-01-22 00:18:00.284 182939 DEBUG nova.network.neutron [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updated VIF entry in instance network info cache for port 7bc267e3-f762-4a18-a3a2-42a7161a231e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:18:00 compute-0 nova_compute[182935]: 2026-01-22 00:18:00.285 182939 DEBUG nova.network.neutron [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:18:00 compute-0 nova_compute[182935]: 2026-01-22 00:18:00.559 182939 DEBUG oslo_concurrency.lockutils [req-164072ac-9fa4-48f2-9c7f-d6a22bd849ed req-e01b6ba4-a895-4b3f-a40d-fb78ff6428b7 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:18:02 compute-0 nova_compute[182935]: 2026-01-22 00:18:02.560 182939 DEBUG nova.compute.manager [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:18:02 compute-0 nova_compute[182935]: 2026-01-22 00:18:02.560 182939 DEBUG oslo_concurrency.lockutils [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:02 compute-0 nova_compute[182935]: 2026-01-22 00:18:02.560 182939 DEBUG oslo_concurrency.lockutils [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:02 compute-0 nova_compute[182935]: 2026-01-22 00:18:02.561 182939 DEBUG oslo_concurrency.lockutils [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:02 compute-0 nova_compute[182935]: 2026-01-22 00:18:02.561 182939 DEBUG nova.compute.manager [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] No waiting events found dispatching network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:18:02 compute-0 nova_compute[182935]: 2026-01-22 00:18:02.561 182939 WARNING nova.compute.manager [req-5b34d5ec-b838-49eb-8631-76f620e17fb0 req-e66cbc1c-78c4-4386-913b-de59c3130dce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received unexpected event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e for instance with vm_state active and task_state resize_finish.
Jan 22 00:18:02 compute-0 nova_compute[182935]: 2026-01-22 00:18:02.620 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:02 compute-0 nova_compute[182935]: 2026-01-22 00:18:02.763 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:02 compute-0 nova_compute[182935]: 2026-01-22 00:18:02.852 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:03.215 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:03.216 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:03.216 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:04 compute-0 nova_compute[182935]: 2026-01-22 00:18:04.079 182939 DEBUG oslo_concurrency.lockutils [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:04 compute-0 nova_compute[182935]: 2026-01-22 00:18:04.080 182939 DEBUG oslo_concurrency.lockutils [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:04 compute-0 nova_compute[182935]: 2026-01-22 00:18:04.080 182939 DEBUG nova.compute.manager [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Going to confirm migration 17 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 22 00:18:04 compute-0 nova_compute[182935]: 2026-01-22 00:18:04.255 182939 DEBUG nova.objects.instance [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'info_cache' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:18:05 compute-0 nova_compute[182935]: 2026-01-22 00:18:05.035 182939 DEBUG neutronclient.v2_0.client [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 7bc267e3-f762-4a18-a3a2-42a7161a231e for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 00:18:05 compute-0 nova_compute[182935]: 2026-01-22 00:18:05.036 182939 DEBUG oslo_concurrency.lockutils [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:18:05 compute-0 nova_compute[182935]: 2026-01-22 00:18:05.037 182939 DEBUG oslo_concurrency.lockutils [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:18:05 compute-0 nova_compute[182935]: 2026-01-22 00:18:05.037 182939 DEBUG nova.network.neutron [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:18:05 compute-0 nova_compute[182935]: 2026-01-22 00:18:05.057 182939 DEBUG nova.compute.manager [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:18:05 compute-0 nova_compute[182935]: 2026-01-22 00:18:05.058 182939 DEBUG oslo_concurrency.lockutils [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "46feac9e-f412-4027-8cfb-f7280308085e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:05 compute-0 nova_compute[182935]: 2026-01-22 00:18:05.058 182939 DEBUG oslo_concurrency.lockutils [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:05 compute-0 nova_compute[182935]: 2026-01-22 00:18:05.058 182939 DEBUG oslo_concurrency.lockutils [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:05 compute-0 nova_compute[182935]: 2026-01-22 00:18:05.059 182939 DEBUG nova.compute.manager [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] No waiting events found dispatching network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:18:05 compute-0 nova_compute[182935]: 2026-01-22 00:18:05.059 182939 WARNING nova.compute.manager [req-da858a30-795e-4208-9885-940989f6cfd8 req-77314b1e-3c03-47c6-a224-507c33bbe3f0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Received unexpected event network-vif-plugged-7bc267e3-f762-4a18-a3a2-42a7161a231e for instance with vm_state resized and task_state None.
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.096 182939 DEBUG nova.network.neutron [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Updating instance_info_cache with network_info: [{"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.220 182939 DEBUG oslo_concurrency.lockutils [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-46feac9e-f412-4027-8cfb-f7280308085e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.220 182939 DEBUG nova.objects.instance [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid 46feac9e-f412-4027-8cfb-f7280308085e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.256 182939 DEBUG nova.virt.libvirt.vif [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:17:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1889498750',display_name='tempest-TestNetworkAdvancedServerOps-server-1889498750',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1889498750',id=134,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAOTO7d2Dwnkbz9wr9hWsejC9/1+pdYEpWKDQobSKPUmWC0nAs/mdLNrBlKhRnQPpVBXnMQms4q8X3v+9bWXw5gwGNW9NuZlObmqlerpOa7gv/9x3J0wC1Nx+jU/uK6YUg==',key_name='tempest-TestNetworkAdvancedServerOps-1110887450',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:18:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-ib8w2x0y',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:18:03Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=46feac9e-f412-4027-8cfb-f7280308085e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.256 182939 DEBUG nova.network.os_vif_util [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "address": "fa:16:3e:38:78:10", "network": {"id": "184c07f2-f316-4056-b962-173c9a73cccb", "bridge": "br-int", "label": "tempest-network-smoke--1947088510", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7bc267e3-f7", "ovs_interfaceid": "7bc267e3-f762-4a18-a3a2-42a7161a231e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.257 182939 DEBUG nova.network.os_vif_util [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.257 182939 DEBUG os_vif [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.259 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.259 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7bc267e3-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.259 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.261 182939 INFO os_vif [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:78:10,bridge_name='br-int',has_traffic_filtering=True,id=7bc267e3-f762-4a18-a3a2-42a7161a231e,network=Network(184c07f2-f316-4056-b962-173c9a73cccb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7bc267e3-f7')
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.261 182939 DEBUG oslo_concurrency.lockutils [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.262 182939 DEBUG oslo_concurrency.lockutils [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.328 182939 DEBUG nova.compute.provider_tree [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.342 182939 DEBUG nova.scheduler.client.report [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.395 182939 DEBUG oslo_concurrency.lockutils [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.433 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.533 182939 INFO nova.scheduler.client.report [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Deleted allocation for migration bd86f60c-99de-4bae-8548-969bcc2d8d50
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.617 182939 DEBUG oslo_concurrency.lockutils [None req-4825e68b-4b39-44df-bf00-9d80ccd310a7 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "46feac9e-f412-4027-8cfb-f7280308085e" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.619 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.621 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.764 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:07.955 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:18:07 compute-0 nova_compute[182935]: 2026-01-22 00:18:07.956 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:07.957 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:18:09 compute-0 nova_compute[182935]: 2026-01-22 00:18:09.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:09 compute-0 nova_compute[182935]: 2026-01-22 00:18:09.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:18:09 compute-0 nova_compute[182935]: 2026-01-22 00:18:09.926 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:18:10 compute-0 nova_compute[182935]: 2026-01-22 00:18:10.244 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041075.2424936, 46feac9e-f412-4027-8cfb-f7280308085e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:18:10 compute-0 nova_compute[182935]: 2026-01-22 00:18:10.245 182939 INFO nova.compute.manager [-] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] VM Stopped (Lifecycle Event)
Jan 22 00:18:10 compute-0 nova_compute[182935]: 2026-01-22 00:18:10.268 182939 DEBUG nova.compute.manager [None req-940e9d3a-1f99-4155-af0a-616960bc645e - - - - - -] [instance: 46feac9e-f412-4027-8cfb-f7280308085e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:18:10 compute-0 sshd-session[234242]: Invalid user redis from 188.166.69.60 port 49636
Jan 22 00:18:10 compute-0 sshd-session[234242]: Connection closed by invalid user redis 188.166.69.60 port 49636 [preauth]
Jan 22 00:18:11 compute-0 podman[234245]: 2026-01-22 00:18:11.678101872 +0000 UTC m=+0.052504647 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:18:11 compute-0 podman[234244]: 2026-01-22 00:18:11.706859994 +0000 UTC m=+0.081185677 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 00:18:12 compute-0 nova_compute[182935]: 2026-01-22 00:18:12.592 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041077.5914128, de316698-4994-41d0-af54-de924c3b99c2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:18:12 compute-0 nova_compute[182935]: 2026-01-22 00:18:12.592 182939 INFO nova.compute.manager [-] [instance: de316698-4994-41d0-af54-de924c3b99c2] VM Stopped (Lifecycle Event)
Jan 22 00:18:12 compute-0 nova_compute[182935]: 2026-01-22 00:18:12.611 182939 DEBUG nova.compute.manager [None req-9117d375-3e5e-44d7-83d9-f7f320b5f932 - - - - - -] [instance: de316698-4994-41d0-af54-de924c3b99c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:18:12 compute-0 nova_compute[182935]: 2026-01-22 00:18:12.623 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:12 compute-0 nova_compute[182935]: 2026-01-22 00:18:12.765 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:14 compute-0 podman[234294]: 2026-01-22 00:18:14.701693501 +0000 UTC m=+0.077298175 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:18:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:16.959 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:17 compute-0 nova_compute[182935]: 2026-01-22 00:18:17.625 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:17 compute-0 nova_compute[182935]: 2026-01-22 00:18:17.797 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:21 compute-0 podman[234318]: 2026-01-22 00:18:21.69773872 +0000 UTC m=+0.075852593 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:18:22 compute-0 nova_compute[182935]: 2026-01-22 00:18:22.629 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:22 compute-0 nova_compute[182935]: 2026-01-22 00:18:22.798 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:18:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:18:26 compute-0 podman[234338]: 2026-01-22 00:18:26.733400947 +0000 UTC m=+0.091694522 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:18:26 compute-0 podman[234337]: 2026-01-22 00:18:26.731974173 +0000 UTC m=+0.094480237 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Jan 22 00:18:27 compute-0 nova_compute[182935]: 2026-01-22 00:18:27.632 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:27 compute-0 nova_compute[182935]: 2026-01-22 00:18:27.800 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:27 compute-0 nova_compute[182935]: 2026-01-22 00:18:27.982 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "00f1dbef-5c82-4287-9df6-2fca347c2852" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:27 compute-0 nova_compute[182935]: 2026-01-22 00:18:27.982 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:28 compute-0 nova_compute[182935]: 2026-01-22 00:18:28.308 182939 DEBUG nova.compute.manager [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:18:29 compute-0 nova_compute[182935]: 2026-01-22 00:18:29.933 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:29 compute-0 nova_compute[182935]: 2026-01-22 00:18:29.933 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:29 compute-0 nova_compute[182935]: 2026-01-22 00:18:29.940 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:18:29 compute-0 nova_compute[182935]: 2026-01-22 00:18:29.940 182939 INFO nova.compute.claims [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:18:30 compute-0 nova_compute[182935]: 2026-01-22 00:18:30.909 182939 DEBUG nova.compute.provider_tree [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:18:31 compute-0 nova_compute[182935]: 2026-01-22 00:18:31.125 182939 DEBUG nova.scheduler.client.report [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:18:31 compute-0 nova_compute[182935]: 2026-01-22 00:18:31.503 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:31 compute-0 nova_compute[182935]: 2026-01-22 00:18:31.504 182939 DEBUG nova.compute.manager [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:18:32 compute-0 nova_compute[182935]: 2026-01-22 00:18:32.181 182939 DEBUG nova.compute.manager [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:18:32 compute-0 nova_compute[182935]: 2026-01-22 00:18:32.182 182939 DEBUG nova.network.neutron [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:18:32 compute-0 nova_compute[182935]: 2026-01-22 00:18:32.418 182939 INFO nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:18:32 compute-0 nova_compute[182935]: 2026-01-22 00:18:32.478 182939 DEBUG nova.policy [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '833f1e9dce90456ea55a443da6704907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:18:32 compute-0 nova_compute[182935]: 2026-01-22 00:18:32.634 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:32 compute-0 nova_compute[182935]: 2026-01-22 00:18:32.807 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.008 182939 DEBUG nova.compute.manager [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.446 182939 DEBUG nova.compute.manager [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.448 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.448 182939 INFO nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Creating image(s)
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.449 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "/var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.449 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.450 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "/var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.464 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.528 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.529 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.529 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.540 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.613 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.614 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.654 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.656 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.656 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.712 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.713 182939 DEBUG nova.virt.disk.api [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Checking if we can resize image /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.714 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.773 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.774 182939 DEBUG nova.virt.disk.api [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Cannot resize image /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.775 182939 DEBUG nova.objects.instance [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'migration_context' on Instance uuid 00f1dbef-5c82-4287-9df6-2fca347c2852 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.796 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.796 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Ensure instance console log exists: /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.797 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.797 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.797 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:33 compute-0 nova_compute[182935]: 2026-01-22 00:18:33.905 182939 DEBUG nova.network.neutron [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Successfully created port: 92a7e957-ed27-4696-8db4-46047f344d0a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:18:36 compute-0 nova_compute[182935]: 2026-01-22 00:18:36.162 182939 DEBUG nova.network.neutron [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Successfully updated port: 92a7e957-ed27-4696-8db4-46047f344d0a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:18:36 compute-0 nova_compute[182935]: 2026-01-22 00:18:36.306 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:18:36 compute-0 nova_compute[182935]: 2026-01-22 00:18:36.307 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquired lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:18:36 compute-0 nova_compute[182935]: 2026-01-22 00:18:36.307 182939 DEBUG nova.network.neutron [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:18:36 compute-0 nova_compute[182935]: 2026-01-22 00:18:36.492 182939 DEBUG nova.compute.manager [req-75db49be-8f55-410c-a4ac-a84dc70a5230 req-ce59d094-79b0-4cc2-bed7-99509a0c5b46 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Received event network-changed-92a7e957-ed27-4696-8db4-46047f344d0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:18:36 compute-0 nova_compute[182935]: 2026-01-22 00:18:36.492 182939 DEBUG nova.compute.manager [req-75db49be-8f55-410c-a4ac-a84dc70a5230 req-ce59d094-79b0-4cc2-bed7-99509a0c5b46 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Refreshing instance network info cache due to event network-changed-92a7e957-ed27-4696-8db4-46047f344d0a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:18:36 compute-0 nova_compute[182935]: 2026-01-22 00:18:36.493 182939 DEBUG oslo_concurrency.lockutils [req-75db49be-8f55-410c-a4ac-a84dc70a5230 req-ce59d094-79b0-4cc2-bed7-99509a0c5b46 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:18:36 compute-0 nova_compute[182935]: 2026-01-22 00:18:36.652 182939 DEBUG nova.network.neutron [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:18:37 compute-0 nova_compute[182935]: 2026-01-22 00:18:37.637 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:37 compute-0 nova_compute[182935]: 2026-01-22 00:18:37.809 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.152 182939 DEBUG nova.network.neutron [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Updating instance_info_cache with network_info: [{"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.577 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Releasing lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.577 182939 DEBUG nova.compute.manager [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Instance network_info: |[{"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.578 182939 DEBUG oslo_concurrency.lockutils [req-75db49be-8f55-410c-a4ac-a84dc70a5230 req-ce59d094-79b0-4cc2-bed7-99509a0c5b46 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.578 182939 DEBUG nova.network.neutron [req-75db49be-8f55-410c-a4ac-a84dc70a5230 req-ce59d094-79b0-4cc2-bed7-99509a0c5b46 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Refreshing network info cache for port 92a7e957-ed27-4696-8db4-46047f344d0a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.581 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Start _get_guest_xml network_info=[{"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.586 182939 WARNING nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.591 182939 DEBUG nova.virt.libvirt.host [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.591 182939 DEBUG nova.virt.libvirt.host [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.599 182939 DEBUG nova.virt.libvirt.host [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.600 182939 DEBUG nova.virt.libvirt.host [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.601 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.601 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.601 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.602 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.602 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.602 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.602 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.603 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.603 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.603 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.604 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.604 182939 DEBUG nova.virt.hardware [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.607 182939 DEBUG nova.virt.libvirt.vif [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-542111961',display_name='tempest-TestNetworkBasicOps-server-542111961',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-542111961',id=142,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLsQVoSMf+l9f1DJAJNJXGg3MYgdMg2NKPQouBZTvv4c7FDkjYqI22zEEEZQ6T13spe7Ib9awbPJaYWdt9L2qTnbAntWMo/p6k7z4TK8fYhUKilllGMLWNmFmwaUqp3N7w==',key_name='tempest-TestNetworkBasicOps-1288527963',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-lxm1nqql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:33Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=00f1dbef-5c82-4287-9df6-2fca347c2852,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.607 182939 DEBUG nova.network.os_vif_util [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.608 182939 DEBUG nova.network.os_vif_util [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=92a7e957-ed27-4696-8db4-46047f344d0a,network=Network(0635e581-a43c-4a3a-8490-6d5b22361e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a7e957-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.609 182939 DEBUG nova.objects.instance [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 00f1dbef-5c82-4287-9df6-2fca347c2852 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.626 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:18:38 compute-0 nova_compute[182935]:   <uuid>00f1dbef-5c82-4287-9df6-2fca347c2852</uuid>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   <name>instance-0000008e</name>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkBasicOps-server-542111961</nova:name>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:18:38</nova:creationTime>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:18:38 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:18:38 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:18:38 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:18:38 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:18:38 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:18:38 compute-0 nova_compute[182935]:         <nova:user uuid="833f1e9dce90456ea55a443da6704907">tempest-TestNetworkBasicOps-822850957-project-member</nova:user>
Jan 22 00:18:38 compute-0 nova_compute[182935]:         <nova:project uuid="34b96b4037d24a0ea19383ca2477b2fd">tempest-TestNetworkBasicOps-822850957</nova:project>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:18:38 compute-0 nova_compute[182935]:         <nova:port uuid="92a7e957-ed27-4696-8db4-46047f344d0a">
Jan 22 00:18:38 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <system>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <entry name="serial">00f1dbef-5c82-4287-9df6-2fca347c2852</entry>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <entry name="uuid">00f1dbef-5c82-4287-9df6-2fca347c2852</entry>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     </system>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   <os>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   </os>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   <features>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   </features>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk.config"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:f2:b3:f6"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <target dev="tap92a7e957-ed"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/console.log" append="off"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <video>
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     </video>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:18:38 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:18:38 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:18:38 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:18:38 compute-0 nova_compute[182935]: </domain>
Jan 22 00:18:38 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.627 182939 DEBUG nova.compute.manager [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Preparing to wait for external event network-vif-plugged-92a7e957-ed27-4696-8db4-46047f344d0a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.628 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.628 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.628 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.629 182939 DEBUG nova.virt.libvirt.vif [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-542111961',display_name='tempest-TestNetworkBasicOps-server-542111961',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-542111961',id=142,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLsQVoSMf+l9f1DJAJNJXGg3MYgdMg2NKPQouBZTvv4c7FDkjYqI22zEEEZQ6T13spe7Ib9awbPJaYWdt9L2qTnbAntWMo/p6k7z4TK8fYhUKilllGMLWNmFmwaUqp3N7w==',key_name='tempest-TestNetworkBasicOps-1288527963',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-lxm1nqql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:33Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=00f1dbef-5c82-4287-9df6-2fca347c2852,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.629 182939 DEBUG nova.network.os_vif_util [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.630 182939 DEBUG nova.network.os_vif_util [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=92a7e957-ed27-4696-8db4-46047f344d0a,network=Network(0635e581-a43c-4a3a-8490-6d5b22361e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a7e957-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.630 182939 DEBUG os_vif [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=92a7e957-ed27-4696-8db4-46047f344d0a,network=Network(0635e581-a43c-4a3a-8490-6d5b22361e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a7e957-ed') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.631 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.632 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.632 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.636 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.636 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92a7e957-ed, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.637 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap92a7e957-ed, col_values=(('external_ids', {'iface-id': '92a7e957-ed27-4696-8db4-46047f344d0a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:b3:f6', 'vm-uuid': '00f1dbef-5c82-4287-9df6-2fca347c2852'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.684 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:38 compute-0 NetworkManager[55139]: <info>  [1769041118.6854] manager: (tap92a7e957-ed): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.687 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.691 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:38 compute-0 nova_compute[182935]: 2026-01-22 00:18:38.691 182939 INFO os_vif [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=92a7e957-ed27-4696-8db4-46047f344d0a,network=Network(0635e581-a43c-4a3a-8490-6d5b22361e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a7e957-ed')
Jan 22 00:18:39 compute-0 nova_compute[182935]: 2026-01-22 00:18:39.698 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:18:39 compute-0 nova_compute[182935]: 2026-01-22 00:18:39.699 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:18:39 compute-0 nova_compute[182935]: 2026-01-22 00:18:39.699 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] No VIF found with MAC fa:16:3e:f2:b3:f6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:18:39 compute-0 nova_compute[182935]: 2026-01-22 00:18:39.699 182939 INFO nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Using config drive
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.419 182939 INFO nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Creating config drive at /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk.config
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.424 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpen27rkqw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.554 182939 DEBUG oslo_concurrency.processutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpen27rkqw" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.592 182939 DEBUG nova.network.neutron [req-75db49be-8f55-410c-a4ac-a84dc70a5230 req-ce59d094-79b0-4cc2-bed7-99509a0c5b46 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Updated VIF entry in instance network info cache for port 92a7e957-ed27-4696-8db4-46047f344d0a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.593 182939 DEBUG nova.network.neutron [req-75db49be-8f55-410c-a4ac-a84dc70a5230 req-ce59d094-79b0-4cc2-bed7-99509a0c5b46 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Updating instance_info_cache with network_info: [{"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.612 182939 DEBUG oslo_concurrency.lockutils [req-75db49be-8f55-410c-a4ac-a84dc70a5230 req-ce59d094-79b0-4cc2-bed7-99509a0c5b46 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:18:40 compute-0 kernel: tap92a7e957-ed: entered promiscuous mode
Jan 22 00:18:40 compute-0 NetworkManager[55139]: <info>  [1769041120.6167] manager: (tap92a7e957-ed): new Tun device (/org/freedesktop/NetworkManager/Devices/247)
Jan 22 00:18:40 compute-0 ovn_controller[95047]: 2026-01-22T00:18:40Z|00516|binding|INFO|Claiming lport 92a7e957-ed27-4696-8db4-46047f344d0a for this chassis.
Jan 22 00:18:40 compute-0 ovn_controller[95047]: 2026-01-22T00:18:40Z|00517|binding|INFO|92a7e957-ed27-4696-8db4-46047f344d0a: Claiming fa:16:3e:f2:b3:f6 10.100.0.5
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.617 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.620 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.624 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.630 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:b3:f6 10.100.0.5'], port_security=['fa:16:3e:f2:b3:f6 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '00f1dbef-5c82-4287-9df6-2fca347c2852', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0635e581-a43c-4a3a-8490-6d5b22361e4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3675cdb2-29f9-47b7-a8e2-4f48470eb4f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a317f3f1-02a8-4299-a536-8c346dbb0861, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=92a7e957-ed27-4696-8db4-46047f344d0a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.632 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 92a7e957-ed27-4696-8db4-46047f344d0a in datapath 0635e581-a43c-4a3a-8490-6d5b22361e4e bound to our chassis
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.633 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0635e581-a43c-4a3a-8490-6d5b22361e4e
Jan 22 00:18:40 compute-0 systemd-udevd[234412]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.645 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7c3f179e-a978-49cb-80aa-be827e11f98e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.646 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0635e581-a1 in ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.648 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0635e581-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.648 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a93512-5e99-49bf-b044-1f680f0b7043]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.648 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[24301afa-9880-4fb1-aa27-aca0877037b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 systemd-machined[154182]: New machine qemu-70-instance-0000008e.
Jan 22 00:18:40 compute-0 NetworkManager[55139]: <info>  [1769041120.6615] device (tap92a7e957-ed): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.660 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5a7eff-f9fd-4e88-9c16-1de7c20049e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 NetworkManager[55139]: <info>  [1769041120.6625] device (tap92a7e957-ed): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.681 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[06b476ef-2182-439c-a014-70f8e8ae4b15]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 systemd[1]: Started Virtual Machine qemu-70-instance-0000008e.
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.685 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:40 compute-0 ovn_controller[95047]: 2026-01-22T00:18:40Z|00518|binding|INFO|Setting lport 92a7e957-ed27-4696-8db4-46047f344d0a ovn-installed in OVS
Jan 22 00:18:40 compute-0 ovn_controller[95047]: 2026-01-22T00:18:40Z|00519|binding|INFO|Setting lport 92a7e957-ed27-4696-8db4-46047f344d0a up in Southbound
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.686 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.711 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[82382be6-b830-44d8-b238-6a8259ec0bad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.716 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[05a4f30c-4353-4b0f-b258-277043c7bbc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 NetworkManager[55139]: <info>  [1769041120.7174] manager: (tap0635e581-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/248)
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.748 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[09c7e111-ec72-4ebf-99d4-c76d4c689c91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.752 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9231a9ac-cc5c-410b-8d54-6dc81f81e925]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 NetworkManager[55139]: <info>  [1769041120.7728] device (tap0635e581-a0): carrier: link connected
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.779 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b0554024-3620-420d-b467-ab1f1f7d9b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.796 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[79b93e52-af60-4278-8d26-e03060eecaba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0635e581-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8a:64:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562351, 'reachable_time': 25879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234445, 'error': None, 'target': 'ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.813 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[752a0056-18ea-4eae-92cf-dc22ab77d27e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8a:641f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 562351, 'tstamp': 562351}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234446, 'error': None, 'target': 'ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.830 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f4fde50f-1d16-4bf1-97bb-cff92a5bc5aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0635e581-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8a:64:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562351, 'reachable_time': 25879, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234447, 'error': None, 'target': 'ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.861 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7aedaab0-bc7a-4601-aa46-7bdc343f7438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.922 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3160b9ae-795f-4d1d-ad0d-33b60812d8b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.924 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0635e581-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.924 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.924 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0635e581-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:40 compute-0 NetworkManager[55139]: <info>  [1769041120.9657] manager: (tap0635e581-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Jan 22 00:18:40 compute-0 kernel: tap0635e581-a0: entered promiscuous mode
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.965 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.967 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.969 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0635e581-a0, col_values=(('external_ids', {'iface-id': '6eeb0e01-2d5a-4731-9621-c0b25b2276f3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:40 compute-0 ovn_controller[95047]: 2026-01-22T00:18:40Z|00520|binding|INFO|Releasing lport 6eeb0e01-2d5a-4731-9621-c0b25b2276f3 from this chassis (sb_readonly=0)
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.971 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.972 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.973 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0635e581-a43c-4a3a-8490-6d5b22361e4e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0635e581-a43c-4a3a-8490-6d5b22361e4e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.974 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[eac0353e-db83-4a89-bd3d-91e2b1cd6ec4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.974 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-0635e581-a43c-4a3a-8490-6d5b22361e4e
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/0635e581-a43c-4a3a-8490-6d5b22361e4e.pid.haproxy
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 0635e581-a43c-4a3a-8490-6d5b22361e4e
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:18:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:40.975 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e', 'env', 'PROCESS_TAG=haproxy-0635e581-a43c-4a3a-8490-6d5b22361e4e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0635e581-a43c-4a3a-8490-6d5b22361e4e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:18:40 compute-0 nova_compute[182935]: 2026-01-22 00:18:40.983 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:41 compute-0 podman[234479]: 2026-01-22 00:18:41.345559394 +0000 UTC m=+0.048882221 container create 0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.362 182939 DEBUG nova.compute.manager [req-2826d4a0-1d55-40e5-8aa6-deed8e3e14fc req-a6fc68d6-aac5-4a78-9067-a94fe12a6065 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Received event network-vif-plugged-92a7e957-ed27-4696-8db4-46047f344d0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.363 182939 DEBUG oslo_concurrency.lockutils [req-2826d4a0-1d55-40e5-8aa6-deed8e3e14fc req-a6fc68d6-aac5-4a78-9067-a94fe12a6065 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.363 182939 DEBUG oslo_concurrency.lockutils [req-2826d4a0-1d55-40e5-8aa6-deed8e3e14fc req-a6fc68d6-aac5-4a78-9067-a94fe12a6065 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.363 182939 DEBUG oslo_concurrency.lockutils [req-2826d4a0-1d55-40e5-8aa6-deed8e3e14fc req-a6fc68d6-aac5-4a78-9067-a94fe12a6065 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.364 182939 DEBUG nova.compute.manager [req-2826d4a0-1d55-40e5-8aa6-deed8e3e14fc req-a6fc68d6-aac5-4a78-9067-a94fe12a6065 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Processing event network-vif-plugged-92a7e957-ed27-4696-8db4-46047f344d0a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:18:41 compute-0 systemd[1]: Started libpod-conmon-0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4.scope.
Jan 22 00:18:41 compute-0 podman[234479]: 2026-01-22 00:18:41.318946573 +0000 UTC m=+0.022269420 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:18:41 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:18:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad7653be67caa0d848eb68b662e1b080714c162f2a0c08e80e93121176450568/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:18:41 compute-0 podman[234479]: 2026-01-22 00:18:41.432200547 +0000 UTC m=+0.135523384 container init 0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:18:41 compute-0 podman[234479]: 2026-01-22 00:18:41.437864049 +0000 UTC m=+0.141186876 container start 0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 22 00:18:41 compute-0 neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e[234495]: [NOTICE]   (234499) : New worker (234501) forked
Jan 22 00:18:41 compute-0 neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e[234495]: [NOTICE]   (234499) : Loading success.
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.672 182939 DEBUG nova.compute.manager [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.674 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041121.6712067, 00f1dbef-5c82-4287-9df6-2fca347c2852 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.676 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] VM Started (Lifecycle Event)
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.678 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.682 182939 INFO nova.virt.libvirt.driver [-] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Instance spawned successfully.
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.682 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.702 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.707 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.719 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.720 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.721 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.721 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.721 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.722 182939 DEBUG nova.virt.libvirt.driver [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.752 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.753 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041121.67141, 00f1dbef-5c82-4287-9df6-2fca347c2852 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.753 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] VM Paused (Lifecycle Event)
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.783 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.787 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041121.677717, 00f1dbef-5c82-4287-9df6-2fca347c2852 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.787 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] VM Resumed (Lifecycle Event)
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.812 182939 INFO nova.compute.manager [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Took 8.37 seconds to spawn the instance on the hypervisor.
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.813 182939 DEBUG nova.compute.manager [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.815 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.823 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.864 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.966 182939 INFO nova.compute.manager [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Took 12.07 seconds to build instance.
Jan 22 00:18:41 compute-0 nova_compute[182935]: 2026-01-22 00:18:41.989 182939 DEBUG oslo_concurrency.lockutils [None req-c01701d2-8823-4cbe-9171-d70b59e733a0 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:42 compute-0 podman[234519]: 2026-01-22 00:18:42.682575932 +0000 UTC m=+0.051404180 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:18:42 compute-0 podman[234518]: 2026-01-22 00:18:42.713788242 +0000 UTC m=+0.085828986 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:18:42 compute-0 nova_compute[182935]: 2026-01-22 00:18:42.811 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.652 182939 DEBUG nova.compute.manager [req-2fa1e029-86ad-40da-ac4c-c4748323c1c7 req-2362847e-bcca-47e1-a96a-4df2f8ce0cf2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Received event network-vif-plugged-92a7e957-ed27-4696-8db4-46047f344d0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.652 182939 DEBUG oslo_concurrency.lockutils [req-2fa1e029-86ad-40da-ac4c-c4748323c1c7 req-2362847e-bcca-47e1-a96a-4df2f8ce0cf2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.653 182939 DEBUG oslo_concurrency.lockutils [req-2fa1e029-86ad-40da-ac4c-c4748323c1c7 req-2362847e-bcca-47e1-a96a-4df2f8ce0cf2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.653 182939 DEBUG oslo_concurrency.lockutils [req-2fa1e029-86ad-40da-ac4c-c4748323c1c7 req-2362847e-bcca-47e1-a96a-4df2f8ce0cf2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.653 182939 DEBUG nova.compute.manager [req-2fa1e029-86ad-40da-ac4c-c4748323c1c7 req-2362847e-bcca-47e1-a96a-4df2f8ce0cf2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] No waiting events found dispatching network-vif-plugged-92a7e957-ed27-4696-8db4-46047f344d0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.654 182939 WARNING nova.compute.manager [req-2fa1e029-86ad-40da-ac4c-c4748323c1c7 req-2362847e-bcca-47e1-a96a-4df2f8ce0cf2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Received unexpected event network-vif-plugged-92a7e957-ed27-4696-8db4-46047f344d0a for instance with vm_state active and task_state None.
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.686 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.927 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.968 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.969 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.969 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:43 compute-0 nova_compute[182935]: 2026-01-22 00:18:43.969 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.065 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.137 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.138 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.207 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.346 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.347 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5541MB free_disk=73.12640380859375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.347 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.348 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.473 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 00f1dbef-5c82-4287-9df6-2fca347c2852 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.474 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.474 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.535 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.553 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.594 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:18:44 compute-0 nova_compute[182935]: 2026-01-22 00:18:44.595 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:45 compute-0 NetworkManager[55139]: <info>  [1769041125.2137] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/250)
Jan 22 00:18:45 compute-0 NetworkManager[55139]: <info>  [1769041125.2154] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Jan 22 00:18:45 compute-0 nova_compute[182935]: 2026-01-22 00:18:45.216 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:45 compute-0 nova_compute[182935]: 2026-01-22 00:18:45.343 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:45 compute-0 ovn_controller[95047]: 2026-01-22T00:18:45Z|00521|binding|INFO|Releasing lport 6eeb0e01-2d5a-4731-9621-c0b25b2276f3 from this chassis (sb_readonly=0)
Jan 22 00:18:45 compute-0 nova_compute[182935]: 2026-01-22 00:18:45.365 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:45 compute-0 nova_compute[182935]: 2026-01-22 00:18:45.607 182939 DEBUG nova.compute.manager [req-8d96fe95-264e-4e5a-9111-3168abbe3126 req-40529953-8f48-47fa-a8b3-d6b1e7510218 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Received event network-changed-92a7e957-ed27-4696-8db4-46047f344d0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:18:45 compute-0 nova_compute[182935]: 2026-01-22 00:18:45.607 182939 DEBUG nova.compute.manager [req-8d96fe95-264e-4e5a-9111-3168abbe3126 req-40529953-8f48-47fa-a8b3-d6b1e7510218 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Refreshing instance network info cache due to event network-changed-92a7e957-ed27-4696-8db4-46047f344d0a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:18:45 compute-0 nova_compute[182935]: 2026-01-22 00:18:45.607 182939 DEBUG oslo_concurrency.lockutils [req-8d96fe95-264e-4e5a-9111-3168abbe3126 req-40529953-8f48-47fa-a8b3-d6b1e7510218 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:18:45 compute-0 nova_compute[182935]: 2026-01-22 00:18:45.608 182939 DEBUG oslo_concurrency.lockutils [req-8d96fe95-264e-4e5a-9111-3168abbe3126 req-40529953-8f48-47fa-a8b3-d6b1e7510218 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:18:45 compute-0 nova_compute[182935]: 2026-01-22 00:18:45.608 182939 DEBUG nova.network.neutron [req-8d96fe95-264e-4e5a-9111-3168abbe3126 req-40529953-8f48-47fa-a8b3-d6b1e7510218 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Refreshing network info cache for port 92a7e957-ed27-4696-8db4-46047f344d0a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:18:45 compute-0 podman[234577]: 2026-01-22 00:18:45.683821878 +0000 UTC m=+0.053441078 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:18:45 compute-0 nova_compute[182935]: 2026-01-22 00:18:45.871 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Acquiring lock "9fc1259e-0d28-405b-b42e-35f73659ff76" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:45 compute-0 nova_compute[182935]: 2026-01-22 00:18:45.871 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:45 compute-0 nova_compute[182935]: 2026-01-22 00:18:45.897 182939 DEBUG nova.compute.manager [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.034 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.035 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.049 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.049 182939 INFO nova.compute.claims [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.216 182939 DEBUG nova.compute.provider_tree [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.237 182939 DEBUG nova.scheduler.client.report [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.271 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.272 182939 DEBUG nova.compute.manager [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.356 182939 DEBUG nova.compute.manager [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.356 182939 DEBUG nova.network.neutron [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.380 182939 INFO nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.400 182939 DEBUG nova.compute.manager [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.461 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.461 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.461 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.509 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.559 182939 DEBUG nova.compute.manager [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.561 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.562 182939 INFO nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Creating image(s)
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.562 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Acquiring lock "/var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.563 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "/var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.564 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "/var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.583 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.647 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.648 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.649 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.660 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.683 182939 DEBUG nova.policy [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '793e503e125c457d8f4082ecd2e4a391', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '828d60daff104fcdab0f25aef8cdb46b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.726 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.727 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.770 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.800 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk 1073741824" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.801 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.801 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.861 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.862 182939 DEBUG nova.virt.disk.api [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Checking if we can resize image /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.862 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.921 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.922 182939 DEBUG nova.virt.disk.api [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Cannot resize image /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.922 182939 DEBUG nova.objects.instance [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lazy-loading 'migration_context' on Instance uuid 9fc1259e-0d28-405b-b42e-35f73659ff76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.943 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.943 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Ensure instance console log exists: /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.944 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.944 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:46 compute-0 nova_compute[182935]: 2026-01-22 00:18:46.944 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:47 compute-0 nova_compute[182935]: 2026-01-22 00:18:47.016 182939 DEBUG nova.network.neutron [req-8d96fe95-264e-4e5a-9111-3168abbe3126 req-40529953-8f48-47fa-a8b3-d6b1e7510218 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Updated VIF entry in instance network info cache for port 92a7e957-ed27-4696-8db4-46047f344d0a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:18:47 compute-0 nova_compute[182935]: 2026-01-22 00:18:47.017 182939 DEBUG nova.network.neutron [req-8d96fe95-264e-4e5a-9111-3168abbe3126 req-40529953-8f48-47fa-a8b3-d6b1e7510218 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Updating instance_info_cache with network_info: [{"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:18:47 compute-0 nova_compute[182935]: 2026-01-22 00:18:47.049 182939 DEBUG oslo_concurrency.lockutils [req-8d96fe95-264e-4e5a-9111-3168abbe3126 req-40529953-8f48-47fa-a8b3-d6b1e7510218 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:18:47 compute-0 nova_compute[182935]: 2026-01-22 00:18:47.049 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:18:47 compute-0 nova_compute[182935]: 2026-01-22 00:18:47.050 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:18:47 compute-0 nova_compute[182935]: 2026-01-22 00:18:47.050 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 00f1dbef-5c82-4287-9df6-2fca347c2852 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:18:47 compute-0 nova_compute[182935]: 2026-01-22 00:18:47.658 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:47 compute-0 nova_compute[182935]: 2026-01-22 00:18:47.814 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:48 compute-0 nova_compute[182935]: 2026-01-22 00:18:48.034 182939 DEBUG nova.network.neutron [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Successfully created port: 74941ef0-6243-4016-8f33-3e7e739ec086 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:18:48 compute-0 nova_compute[182935]: 2026-01-22 00:18:48.560 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Updating instance_info_cache with network_info: [{"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:18:48 compute-0 nova_compute[182935]: 2026-01-22 00:18:48.587 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:18:48 compute-0 nova_compute[182935]: 2026-01-22 00:18:48.588 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:18:48 compute-0 nova_compute[182935]: 2026-01-22 00:18:48.588 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:48 compute-0 nova_compute[182935]: 2026-01-22 00:18:48.588 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:48 compute-0 nova_compute[182935]: 2026-01-22 00:18:48.589 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:18:48 compute-0 nova_compute[182935]: 2026-01-22 00:18:48.726 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:48 compute-0 nova_compute[182935]: 2026-01-22 00:18:48.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:49 compute-0 nova_compute[182935]: 2026-01-22 00:18:49.325 182939 DEBUG nova.network.neutron [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Successfully updated port: 74941ef0-6243-4016-8f33-3e7e739ec086 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:18:49 compute-0 nova_compute[182935]: 2026-01-22 00:18:49.341 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Acquiring lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:18:49 compute-0 nova_compute[182935]: 2026-01-22 00:18:49.341 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Acquired lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:18:49 compute-0 nova_compute[182935]: 2026-01-22 00:18:49.342 182939 DEBUG nova.network.neutron [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:18:49 compute-0 nova_compute[182935]: 2026-01-22 00:18:49.540 182939 DEBUG nova.network.neutron [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:18:50 compute-0 nova_compute[182935]: 2026-01-22 00:18:50.105 182939 DEBUG nova.compute.manager [req-25667d25-cdca-48f7-803e-66d98b256b6b req-3e6798a9-6e34-41fb-8ccd-7ac787983a3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Received event network-changed-74941ef0-6243-4016-8f33-3e7e739ec086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:18:50 compute-0 nova_compute[182935]: 2026-01-22 00:18:50.106 182939 DEBUG nova.compute.manager [req-25667d25-cdca-48f7-803e-66d98b256b6b req-3e6798a9-6e34-41fb-8ccd-7ac787983a3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Refreshing instance network info cache due to event network-changed-74941ef0-6243-4016-8f33-3e7e739ec086. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:18:50 compute-0 nova_compute[182935]: 2026-01-22 00:18:50.107 182939 DEBUG oslo_concurrency.lockutils [req-25667d25-cdca-48f7-803e-66d98b256b6b req-3e6798a9-6e34-41fb-8ccd-7ac787983a3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.127 182939 DEBUG nova.network.neutron [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Updating instance_info_cache with network_info: [{"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.168 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Releasing lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.169 182939 DEBUG nova.compute.manager [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Instance network_info: |[{"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.170 182939 DEBUG oslo_concurrency.lockutils [req-25667d25-cdca-48f7-803e-66d98b256b6b req-3e6798a9-6e34-41fb-8ccd-7ac787983a3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.170 182939 DEBUG nova.network.neutron [req-25667d25-cdca-48f7-803e-66d98b256b6b req-3e6798a9-6e34-41fb-8ccd-7ac787983a3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Refreshing network info cache for port 74941ef0-6243-4016-8f33-3e7e739ec086 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.173 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Start _get_guest_xml network_info=[{"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.179 182939 WARNING nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.184 182939 DEBUG nova.virt.libvirt.host [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.186 182939 DEBUG nova.virt.libvirt.host [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.190 182939 DEBUG nova.virt.libvirt.host [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.191 182939 DEBUG nova.virt.libvirt.host [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.193 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.193 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.194 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.194 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.195 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.195 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.196 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.196 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.197 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.197 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.198 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.198 182939 DEBUG nova.virt.hardware [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.204 182939 DEBUG nova.virt.libvirt.vif [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-852901634-access_point-38128636',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-852901634-access_point-38128636',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-852901634-acc',id=144,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKro6oqpFAId7ik5Vn6WyyZp3vrK9KUyorcKTmJ6BBxHaUYJDX2zHomAAcWTG9p33Yti1tDUxA4PhApZ9mTehr2h9TZzHbjMWw71YFlUqJWo4dGRNgyn1ZQz6co9PPZaA==',key_name='tempest-TestSecurityGroupsBasicOps-1844760784',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='828d60daff104fcdab0f25aef8cdb46b',ramdisk_id='',reservation_id='r-qwgk6q48',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-852901634',owner_user_name='tempest-TestSecurityGroupsBasicOps-852901634-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:46Z,user_data=None,user_id='793e503e125c457d8f4082ecd2e4a391',uuid=9fc1259e-0d28-405b-b42e-35f73659ff76,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.205 182939 DEBUG nova.network.os_vif_util [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Converting VIF {"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.206 182939 DEBUG nova.network.os_vif_util [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:e5:17,bridge_name='br-int',has_traffic_filtering=True,id=74941ef0-6243-4016-8f33-3e7e739ec086,network=Network(0177ab1d-de86-4d4b-a23f-905845c27092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74941ef0-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.207 182939 DEBUG nova.objects.instance [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lazy-loading 'pci_devices' on Instance uuid 9fc1259e-0d28-405b-b42e-35f73659ff76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.226 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:18:51 compute-0 nova_compute[182935]:   <uuid>9fc1259e-0d28-405b-b42e-35f73659ff76</uuid>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   <name>instance-00000090</name>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-852901634-access_point-38128636</nova:name>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:18:51</nova:creationTime>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:18:51 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:18:51 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:18:51 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:18:51 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:18:51 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:18:51 compute-0 nova_compute[182935]:         <nova:user uuid="793e503e125c457d8f4082ecd2e4a391">tempest-TestSecurityGroupsBasicOps-852901634-project-member</nova:user>
Jan 22 00:18:51 compute-0 nova_compute[182935]:         <nova:project uuid="828d60daff104fcdab0f25aef8cdb46b">tempest-TestSecurityGroupsBasicOps-852901634</nova:project>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:18:51 compute-0 nova_compute[182935]:         <nova:port uuid="74941ef0-6243-4016-8f33-3e7e739ec086">
Jan 22 00:18:51 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <system>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <entry name="serial">9fc1259e-0d28-405b-b42e-35f73659ff76</entry>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <entry name="uuid">9fc1259e-0d28-405b-b42e-35f73659ff76</entry>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     </system>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   <os>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   </os>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   <features>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   </features>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk.config"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:62:e5:17"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <target dev="tap74941ef0-62"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/console.log" append="off"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <video>
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     </video>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:18:51 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:18:51 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:18:51 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:18:51 compute-0 nova_compute[182935]: </domain>
Jan 22 00:18:51 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.228 182939 DEBUG nova.compute.manager [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Preparing to wait for external event network-vif-plugged-74941ef0-6243-4016-8f33-3e7e739ec086 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.229 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Acquiring lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.229 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.229 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.230 182939 DEBUG nova.virt.libvirt.vif [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:18:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-852901634-access_point-38128636',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-852901634-access_point-38128636',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-852901634-acc',id=144,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKro6oqpFAId7ik5Vn6WyyZp3vrK9KUyorcKTmJ6BBxHaUYJDX2zHomAAcWTG9p33Yti1tDUxA4PhApZ9mTehr2h9TZzHbjMWw71YFlUqJWo4dGRNgyn1ZQz6co9PPZaA==',key_name='tempest-TestSecurityGroupsBasicOps-1844760784',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='828d60daff104fcdab0f25aef8cdb46b',ramdisk_id='',reservation_id='r-qwgk6q48',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-852901634',owner_user_name='tempest-TestSecurityGroupsBasicOps-852901634-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:18:46Z,user_data=None,user_id='793e503e125c457d8f4082ecd2e4a391',uuid=9fc1259e-0d28-405b-b42e-35f73659ff76,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.231 182939 DEBUG nova.network.os_vif_util [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Converting VIF {"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.231 182939 DEBUG nova.network.os_vif_util [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:e5:17,bridge_name='br-int',has_traffic_filtering=True,id=74941ef0-6243-4016-8f33-3e7e739ec086,network=Network(0177ab1d-de86-4d4b-a23f-905845c27092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74941ef0-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.232 182939 DEBUG os_vif [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:e5:17,bridge_name='br-int',has_traffic_filtering=True,id=74941ef0-6243-4016-8f33-3e7e739ec086,network=Network(0177ab1d-de86-4d4b-a23f-905845c27092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74941ef0-62') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.232 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.233 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.234 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.238 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.239 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74941ef0-62, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.240 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap74941ef0-62, col_values=(('external_ids', {'iface-id': '74941ef0-6243-4016-8f33-3e7e739ec086', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:e5:17', 'vm-uuid': '9fc1259e-0d28-405b-b42e-35f73659ff76'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.242 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:51 compute-0 NetworkManager[55139]: <info>  [1769041131.2440] manager: (tap74941ef0-62): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.246 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.252 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.253 182939 INFO os_vif [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:e5:17,bridge_name='br-int',has_traffic_filtering=True,id=74941ef0-6243-4016-8f33-3e7e739ec086,network=Network(0177ab1d-de86-4d4b-a23f-905845c27092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74941ef0-62')
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.320 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.321 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.321 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] No VIF found with MAC fa:16:3e:62:e5:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.322 182939 INFO nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Using config drive
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.734 182939 INFO nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Creating config drive at /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk.config
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.739 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeg5iino5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.867 182939 DEBUG oslo_concurrency.processutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeg5iino5" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:18:51 compute-0 kernel: tap74941ef0-62: entered promiscuous mode
Jan 22 00:18:51 compute-0 NetworkManager[55139]: <info>  [1769041131.9554] manager: (tap74941ef0-62): new Tun device (/org/freedesktop/NetworkManager/Devices/253)
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.956 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:51 compute-0 ovn_controller[95047]: 2026-01-22T00:18:51Z|00522|binding|INFO|Claiming lport 74941ef0-6243-4016-8f33-3e7e739ec086 for this chassis.
Jan 22 00:18:51 compute-0 ovn_controller[95047]: 2026-01-22T00:18:51Z|00523|binding|INFO|74941ef0-6243-4016-8f33-3e7e739ec086: Claiming fa:16:3e:62:e5:17 10.100.0.7
Jan 22 00:18:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:51.967 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:e5:17 10.100.0.7'], port_security=['fa:16:3e:62:e5:17 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9fc1259e-0d28-405b-b42e-35f73659ff76', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0177ab1d-de86-4d4b-a23f-905845c27092', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '828d60daff104fcdab0f25aef8cdb46b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6ec2ef71-8956-4ce0-ac19-3334ac3e5f49 d332bd2c-735d-4b3b-97e1-e788fe54f920', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c7fb41-1c96-4f6d-a404-63df887f5b54, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=74941ef0-6243-4016-8f33-3e7e739ec086) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:18:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:51.968 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 74941ef0-6243-4016-8f33-3e7e739ec086 in datapath 0177ab1d-de86-4d4b-a23f-905845c27092 bound to our chassis
Jan 22 00:18:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:51.970 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0177ab1d-de86-4d4b-a23f-905845c27092
Jan 22 00:18:51 compute-0 ovn_controller[95047]: 2026-01-22T00:18:51Z|00524|binding|INFO|Setting lport 74941ef0-6243-4016-8f33-3e7e739ec086 up in Southbound
Jan 22 00:18:51 compute-0 ovn_controller[95047]: 2026-01-22T00:18:51Z|00525|binding|INFO|Setting lport 74941ef0-6243-4016-8f33-3e7e739ec086 ovn-installed in OVS
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.976 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:51 compute-0 nova_compute[182935]: 2026-01-22 00:18:51.980 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:51.985 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[adf8ba24-5f76-406e-95d1-ab32efbd0a1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:51.986 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0177ab1d-d1 in ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:18:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:51.989 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0177ab1d-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:18:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:51.989 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[82569ead-457d-46b5-b3f6-d5015fa7a927]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:51.991 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a748f5d7-a0e4-4db9-911e-dae97d4683b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:51 compute-0 systemd-udevd[234643]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.004 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[ac37b9c9-0e70-4936-879c-d4875f11f7fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 systemd-machined[154182]: New machine qemu-71-instance-00000090.
Jan 22 00:18:52 compute-0 NetworkManager[55139]: <info>  [1769041132.0170] device (tap74941ef0-62): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:18:52 compute-0 NetworkManager[55139]: <info>  [1769041132.0183] device (tap74941ef0-62): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.022 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e26496f3-27af-4c63-823d-2858ec409ca7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 systemd[1]: Started Virtual Machine qemu-71-instance-00000090.
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.059 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[5300bdff-9385-4747-adcb-6a3874bf8e56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 systemd-udevd[234655]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:18:52 compute-0 NetworkManager[55139]: <info>  [1769041132.0674] manager: (tap0177ab1d-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/254)
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.066 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ce6aeaad-1852-43b1-9e48-0834dbdba7bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 podman[234627]: 2026-01-22 00:18:52.069260586 +0000 UTC m=+0.123088842 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.099 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f36eb60e-b166-4d13-ba48-8218a423b804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.103 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[56941c5c-8300-47e1-937f-a806217249d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 NetworkManager[55139]: <info>  [1769041132.1272] device (tap0177ab1d-d0): carrier: link connected
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.133 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[cbe4401a-0ac6-4d5f-8bd9-a6bc0dce5786]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.152 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[aab3a47f-ad6d-4595-81f3-8686ae55cbd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0177ab1d-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:b6:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563486, 'reachable_time': 44302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234684, 'error': None, 'target': 'ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.175 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6462484c-3517-4c6d-96b4-06769ab8b6d1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:b61d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 563486, 'tstamp': 563486}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234685, 'error': None, 'target': 'ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.193 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb2c0b2-8cf8-4e55-99dd-de82f67dcf01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0177ab1d-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:b6:1d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 168], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563486, 'reachable_time': 44302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234686, 'error': None, 'target': 'ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.226 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee87bce-9626-47f0-b4c0-01f73cda5dd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.301 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dfd7915f-8b89-457c-9931-da0ad8de38ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.303 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0177ab1d-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.304 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.305 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0177ab1d-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:52 compute-0 NetworkManager[55139]: <info>  [1769041132.3096] manager: (tap0177ab1d-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Jan 22 00:18:52 compute-0 kernel: tap0177ab1d-d0: entered promiscuous mode
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.308 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.312 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0177ab1d-d0, col_values=(('external_ids', {'iface-id': '7a8a9d8d-ee83-413b-aa49-79d9d4a16ceb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:52 compute-0 ovn_controller[95047]: 2026-01-22T00:18:52Z|00526|binding|INFO|Releasing lport 7a8a9d8d-ee83-413b-aa49-79d9d4a16ceb from this chassis (sb_readonly=0)
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.313 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.326 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.327 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0177ab1d-de86-4d4b-a23f-905845c27092.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0177ab1d-de86-4d4b-a23f-905845c27092.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.328 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[966c7bdb-f730-429a-ba2a-29b8b56d02d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.329 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-0177ab1d-de86-4d4b-a23f-905845c27092
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/0177ab1d-de86-4d4b-a23f-905845c27092.pid.haproxy
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 0177ab1d-de86-4d4b-a23f-905845c27092
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.331 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092', 'env', 'PROCESS_TAG=haproxy-0177ab1d-de86-4d4b-a23f-905845c27092', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0177ab1d-de86-4d4b-a23f-905845c27092.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.453 182939 DEBUG nova.compute.manager [req-11eedcd2-f5fd-4190-a2c5-089f2dc295a9 req-ba6e3d3e-0bed-4ed5-9b4d-3ae90a82ced2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Received event network-vif-plugged-74941ef0-6243-4016-8f33-3e7e739ec086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.454 182939 DEBUG oslo_concurrency.lockutils [req-11eedcd2-f5fd-4190-a2c5-089f2dc295a9 req-ba6e3d3e-0bed-4ed5-9b4d-3ae90a82ced2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.454 182939 DEBUG oslo_concurrency.lockutils [req-11eedcd2-f5fd-4190-a2c5-089f2dc295a9 req-ba6e3d3e-0bed-4ed5-9b4d-3ae90a82ced2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.454 182939 DEBUG oslo_concurrency.lockutils [req-11eedcd2-f5fd-4190-a2c5-089f2dc295a9 req-ba6e3d3e-0bed-4ed5-9b4d-3ae90a82ced2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.454 182939 DEBUG nova.compute.manager [req-11eedcd2-f5fd-4190-a2c5-089f2dc295a9 req-ba6e3d3e-0bed-4ed5-9b4d-3ae90a82ced2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Processing event network-vif-plugged-74941ef0-6243-4016-8f33-3e7e739ec086 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.547 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.547 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.699 182939 DEBUG nova.network.neutron [req-25667d25-cdca-48f7-803e-66d98b256b6b req-3e6798a9-6e34-41fb-8ccd-7ac787983a3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Updated VIF entry in instance network info cache for port 74941ef0-6243-4016-8f33-3e7e739ec086. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.700 182939 DEBUG nova.network.neutron [req-25667d25-cdca-48f7-803e-66d98b256b6b req-3e6798a9-6e34-41fb-8ccd-7ac787983a3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Updating instance_info_cache with network_info: [{"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.741 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041132.741005, 9fc1259e-0d28-405b-b42e-35f73659ff76 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.742 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] VM Started (Lifecycle Event)
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.744 182939 DEBUG nova.compute.manager [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.748 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:18:52 compute-0 podman[234724]: 2026-01-22 00:18:52.748938123 +0000 UTC m=+0.066336831 container create cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.762 182939 INFO nova.virt.libvirt.driver [-] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Instance spawned successfully.
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.763 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:52 compute-0 systemd[1]: Started libpod-conmon-cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb.scope.
Jan 22 00:18:52 compute-0 podman[234724]: 2026-01-22 00:18:52.713227012 +0000 UTC m=+0.030625740 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.817 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:52 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:18:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8cc850cb8069588c7e66b5ac7d1dd2fee352a951a90dbb1242c725e06c49905/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.846 182939 DEBUG oslo_concurrency.lockutils [req-25667d25-cdca-48f7-803e-66d98b256b6b req-3e6798a9-6e34-41fb-8ccd-7ac787983a3f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:18:52 compute-0 podman[234724]: 2026-01-22 00:18:52.852739074 +0000 UTC m=+0.170137802 container init cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:18:52 compute-0 podman[234724]: 2026-01-22 00:18:52.859329082 +0000 UTC m=+0.176727790 container start cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.871 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.879 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.883 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.885 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.885 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.886 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.886 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:52 compute-0 neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092[234741]: [NOTICE]   (234745) : New worker (234747) forked
Jan 22 00:18:52 compute-0 neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092[234741]: [NOTICE]   (234745) : Loading success.
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.889 182939 DEBUG nova.virt.libvirt.driver [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.918 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.919 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041132.742264, 9fc1259e-0d28-405b-b42e-35f73659ff76 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.920 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] VM Paused (Lifecycle Event)
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.938 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:18:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:18:52.939 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.956 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.961 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041132.7469976, 9fc1259e-0d28-405b-b42e-35f73659ff76 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.961 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] VM Resumed (Lifecycle Event)
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.993 182939 INFO nova.compute.manager [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Took 6.43 seconds to spawn the instance on the hypervisor.
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.994 182939 DEBUG nova.compute.manager [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:18:52 compute-0 nova_compute[182935]: 2026-01-22 00:18:52.994 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:18:53 compute-0 nova_compute[182935]: 2026-01-22 00:18:53.001 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:18:53 compute-0 nova_compute[182935]: 2026-01-22 00:18:53.031 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:18:53 compute-0 nova_compute[182935]: 2026-01-22 00:18:53.114 182939 INFO nova.compute.manager [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Took 7.12 seconds to build instance.
Jan 22 00:18:53 compute-0 nova_compute[182935]: 2026-01-22 00:18:53.136 182939 DEBUG oslo_concurrency.lockutils [None req-4233ce10-89c4-476f-a558-75e894db560b 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:53 compute-0 nova_compute[182935]: 2026-01-22 00:18:53.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:18:54 compute-0 nova_compute[182935]: 2026-01-22 00:18:54.020 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:54 compute-0 sshd-session[234774]: Invalid user redis from 188.166.69.60 port 40542
Jan 22 00:18:54 compute-0 nova_compute[182935]: 2026-01-22 00:18:54.585 182939 DEBUG nova.compute.manager [req-0aa690fe-d521-4b3f-99dc-355c9a81e19d req-2663c73d-508c-4814-bb8d-e2447c53f764 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Received event network-vif-plugged-74941ef0-6243-4016-8f33-3e7e739ec086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:18:54 compute-0 nova_compute[182935]: 2026-01-22 00:18:54.585 182939 DEBUG oslo_concurrency.lockutils [req-0aa690fe-d521-4b3f-99dc-355c9a81e19d req-2663c73d-508c-4814-bb8d-e2447c53f764 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:18:54 compute-0 nova_compute[182935]: 2026-01-22 00:18:54.586 182939 DEBUG oslo_concurrency.lockutils [req-0aa690fe-d521-4b3f-99dc-355c9a81e19d req-2663c73d-508c-4814-bb8d-e2447c53f764 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:18:54 compute-0 nova_compute[182935]: 2026-01-22 00:18:54.586 182939 DEBUG oslo_concurrency.lockutils [req-0aa690fe-d521-4b3f-99dc-355c9a81e19d req-2663c73d-508c-4814-bb8d-e2447c53f764 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:18:54 compute-0 nova_compute[182935]: 2026-01-22 00:18:54.587 182939 DEBUG nova.compute.manager [req-0aa690fe-d521-4b3f-99dc-355c9a81e19d req-2663c73d-508c-4814-bb8d-e2447c53f764 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] No waiting events found dispatching network-vif-plugged-74941ef0-6243-4016-8f33-3e7e739ec086 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:18:54 compute-0 nova_compute[182935]: 2026-01-22 00:18:54.587 182939 WARNING nova.compute.manager [req-0aa690fe-d521-4b3f-99dc-355c9a81e19d req-2663c73d-508c-4814-bb8d-e2447c53f764 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Received unexpected event network-vif-plugged-74941ef0-6243-4016-8f33-3e7e739ec086 for instance with vm_state active and task_state None.
Jan 22 00:18:54 compute-0 sshd-session[234774]: Connection closed by invalid user redis 188.166.69.60 port 40542 [preauth]
Jan 22 00:18:54 compute-0 ovn_controller[95047]: 2026-01-22T00:18:54Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f2:b3:f6 10.100.0.5
Jan 22 00:18:54 compute-0 ovn_controller[95047]: 2026-01-22T00:18:54Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:b3:f6 10.100.0.5
Jan 22 00:18:56 compute-0 nova_compute[182935]: 2026-01-22 00:18:56.244 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:57 compute-0 podman[234776]: 2026-01-22 00:18:57.688510968 +0000 UTC m=+0.064041147 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public)
Jan 22 00:18:57 compute-0 podman[234777]: 2026-01-22 00:18:57.695908953 +0000 UTC m=+0.068186585 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:18:57 compute-0 nova_compute[182935]: 2026-01-22 00:18:57.819 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:18:59 compute-0 nova_compute[182935]: 2026-01-22 00:18:59.008 182939 DEBUG nova.compute.manager [req-85baeed0-c63d-4d64-9f9b-9d6ab1fa590d req-39757ff2-a400-4884-8205-c8fbaa6322a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Received event network-changed-74941ef0-6243-4016-8f33-3e7e739ec086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:18:59 compute-0 nova_compute[182935]: 2026-01-22 00:18:59.009 182939 DEBUG nova.compute.manager [req-85baeed0-c63d-4d64-9f9b-9d6ab1fa590d req-39757ff2-a400-4884-8205-c8fbaa6322a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Refreshing instance network info cache due to event network-changed-74941ef0-6243-4016-8f33-3e7e739ec086. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:18:59 compute-0 nova_compute[182935]: 2026-01-22 00:18:59.010 182939 DEBUG oslo_concurrency.lockutils [req-85baeed0-c63d-4d64-9f9b-9d6ab1fa590d req-39757ff2-a400-4884-8205-c8fbaa6322a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:18:59 compute-0 nova_compute[182935]: 2026-01-22 00:18:59.010 182939 DEBUG oslo_concurrency.lockutils [req-85baeed0-c63d-4d64-9f9b-9d6ab1fa590d req-39757ff2-a400-4884-8205-c8fbaa6322a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:18:59 compute-0 nova_compute[182935]: 2026-01-22 00:18:59.010 182939 DEBUG nova.network.neutron [req-85baeed0-c63d-4d64-9f9b-9d6ab1fa590d req-39757ff2-a400-4884-8205-c8fbaa6322a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Refreshing network info cache for port 74941ef0-6243-4016-8f33-3e7e739ec086 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:19:00 compute-0 nova_compute[182935]: 2026-01-22 00:19:00.327 182939 DEBUG nova.network.neutron [req-85baeed0-c63d-4d64-9f9b-9d6ab1fa590d req-39757ff2-a400-4884-8205-c8fbaa6322a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Updated VIF entry in instance network info cache for port 74941ef0-6243-4016-8f33-3e7e739ec086. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:19:00 compute-0 nova_compute[182935]: 2026-01-22 00:19:00.328 182939 DEBUG nova.network.neutron [req-85baeed0-c63d-4d64-9f9b-9d6ab1fa590d req-39757ff2-a400-4884-8205-c8fbaa6322a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Updating instance_info_cache with network_info: [{"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:00 compute-0 nova_compute[182935]: 2026-01-22 00:19:00.353 182939 DEBUG oslo_concurrency.lockutils [req-85baeed0-c63d-4d64-9f9b-9d6ab1fa590d req-39757ff2-a400-4884-8205-c8fbaa6322a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:01 compute-0 nova_compute[182935]: 2026-01-22 00:19:01.248 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:02 compute-0 nova_compute[182935]: 2026-01-22 00:19:02.561 182939 INFO nova.compute.manager [None req-c5c9ad92-3b27-46d8-b1e2-a38c3120b4fe 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Get console output
Jan 22 00:19:02 compute-0 nova_compute[182935]: 2026-01-22 00:19:02.568 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:19:02 compute-0 nova_compute[182935]: 2026-01-22 00:19:02.821 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:03.216 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:03.216 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:03.217 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:03 compute-0 ovn_controller[95047]: 2026-01-22T00:19:03Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:b3:f6 10.100.0.5
Jan 22 00:19:03 compute-0 nova_compute[182935]: 2026-01-22 00:19:03.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:04 compute-0 ovn_controller[95047]: 2026-01-22T00:19:04Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:b3:f6 10.100.0.5
Jan 22 00:19:05 compute-0 ovn_controller[95047]: 2026-01-22T00:19:05Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:e5:17 10.100.0.7
Jan 22 00:19:05 compute-0 ovn_controller[95047]: 2026-01-22T00:19:05Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:e5:17 10.100.0.7
Jan 22 00:19:06 compute-0 nova_compute[182935]: 2026-01-22 00:19:06.289 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:07 compute-0 nova_compute[182935]: 2026-01-22 00:19:07.823 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:08 compute-0 ovn_controller[95047]: 2026-01-22T00:19:08Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:b3:f6 10.100.0.5
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.369 182939 DEBUG nova.compute.manager [req-4edd0606-9c57-4fd7-9ba4-07ad5d5ebb52 req-84a6baeb-f945-449c-87fe-77e1baa46a0b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Received event network-changed-92a7e957-ed27-4696-8db4-46047f344d0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.369 182939 DEBUG nova.compute.manager [req-4edd0606-9c57-4fd7-9ba4-07ad5d5ebb52 req-84a6baeb-f945-449c-87fe-77e1baa46a0b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Refreshing instance network info cache due to event network-changed-92a7e957-ed27-4696-8db4-46047f344d0a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.370 182939 DEBUG oslo_concurrency.lockutils [req-4edd0606-9c57-4fd7-9ba4-07ad5d5ebb52 req-84a6baeb-f945-449c-87fe-77e1baa46a0b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.370 182939 DEBUG oslo_concurrency.lockutils [req-4edd0606-9c57-4fd7-9ba4-07ad5d5ebb52 req-84a6baeb-f945-449c-87fe-77e1baa46a0b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.370 182939 DEBUG nova.network.neutron [req-4edd0606-9c57-4fd7-9ba4-07ad5d5ebb52 req-84a6baeb-f945-449c-87fe-77e1baa46a0b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Refreshing network info cache for port 92a7e957-ed27-4696-8db4-46047f344d0a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.535 182939 DEBUG oslo_concurrency.lockutils [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "00f1dbef-5c82-4287-9df6-2fca347c2852" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.536 182939 DEBUG oslo_concurrency.lockutils [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.537 182939 DEBUG oslo_concurrency.lockutils [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.537 182939 DEBUG oslo_concurrency.lockutils [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.537 182939 DEBUG oslo_concurrency.lockutils [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.549 182939 INFO nova.compute.manager [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Terminating instance
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.562 182939 DEBUG nova.compute.manager [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:19:08 compute-0 kernel: tap92a7e957-ed (unregistering): left promiscuous mode
Jan 22 00:19:08 compute-0 NetworkManager[55139]: <info>  [1769041148.5934] device (tap92a7e957-ed): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.632 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:08 compute-0 ovn_controller[95047]: 2026-01-22T00:19:08Z|00527|binding|INFO|Releasing lport 92a7e957-ed27-4696-8db4-46047f344d0a from this chassis (sb_readonly=0)
Jan 22 00:19:08 compute-0 ovn_controller[95047]: 2026-01-22T00:19:08Z|00528|binding|INFO|Setting lport 92a7e957-ed27-4696-8db4-46047f344d0a down in Southbound
Jan 22 00:19:08 compute-0 ovn_controller[95047]: 2026-01-22T00:19:08Z|00529|binding|INFO|Removing iface tap92a7e957-ed ovn-installed in OVS
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.634 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:08.642 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:b3:f6 10.100.0.5'], port_security=['fa:16:3e:f2:b3:f6 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '00f1dbef-5c82-4287-9df6-2fca347c2852', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0635e581-a43c-4a3a-8490-6d5b22361e4e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34b96b4037d24a0ea19383ca2477b2fd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3675cdb2-29f9-47b7-a8e2-4f48470eb4f9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a317f3f1-02a8-4299-a536-8c346dbb0861, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=92a7e957-ed27-4696-8db4-46047f344d0a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:19:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:08.645 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 92a7e957-ed27-4696-8db4-46047f344d0a in datapath 0635e581-a43c-4a3a-8490-6d5b22361e4e unbound from our chassis
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.646 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:08.647 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0635e581-a43c-4a3a-8490-6d5b22361e4e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:19:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:08.648 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[191228f7-1891-4784-bd2b-dcafa4262cb3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:08.649 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e namespace which is not needed anymore
Jan 22 00:19:08 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d0000008e.scope: Deactivated successfully.
Jan 22 00:19:08 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d0000008e.scope: Consumed 14.641s CPU time.
Jan 22 00:19:08 compute-0 systemd-machined[154182]: Machine qemu-70-instance-0000008e terminated.
Jan 22 00:19:08 compute-0 neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e[234495]: [NOTICE]   (234499) : haproxy version is 2.8.14-c23fe91
Jan 22 00:19:08 compute-0 neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e[234495]: [NOTICE]   (234499) : path to executable is /usr/sbin/haproxy
Jan 22 00:19:08 compute-0 neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e[234495]: [WARNING]  (234499) : Exiting Master process...
Jan 22 00:19:08 compute-0 neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e[234495]: [ALERT]    (234499) : Current worker (234501) exited with code 143 (Terminated)
Jan 22 00:19:08 compute-0 neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e[234495]: [WARNING]  (234499) : All workers exited. Exiting... (0)
Jan 22 00:19:08 compute-0 systemd[1]: libpod-0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4.scope: Deactivated successfully.
Jan 22 00:19:08 compute-0 podman[234855]: 2026-01-22 00:19:08.800208549 +0000 UTC m=+0.057425549 container died 0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.838 182939 INFO nova.virt.libvirt.driver [-] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Instance destroyed successfully.
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.839 182939 DEBUG nova.objects.instance [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lazy-loading 'resources' on Instance uuid 00f1dbef-5c82-4287-9df6-2fca347c2852 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.855 182939 DEBUG nova.virt.libvirt.vif [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:18:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-542111961',display_name='tempest-TestNetworkBasicOps-server-542111961',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-542111961',id=142,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLsQVoSMf+l9f1DJAJNJXGg3MYgdMg2NKPQouBZTvv4c7FDkjYqI22zEEEZQ6T13spe7Ib9awbPJaYWdt9L2qTnbAntWMo/p6k7z4TK8fYhUKilllGMLWNmFmwaUqp3N7w==',key_name='tempest-TestNetworkBasicOps-1288527963',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:18:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34b96b4037d24a0ea19383ca2477b2fd',ramdisk_id='',reservation_id='r-lxm1nqql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-822850957',owner_user_name='tempest-TestNetworkBasicOps-822850957-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:18:41Z,user_data=None,user_id='833f1e9dce90456ea55a443da6704907',uuid=00f1dbef-5c82-4287-9df6-2fca347c2852,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.855 182939 DEBUG nova.network.os_vif_util [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converting VIF {"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.856 182939 DEBUG nova.network.os_vif_util [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=92a7e957-ed27-4696-8db4-46047f344d0a,network=Network(0635e581-a43c-4a3a-8490-6d5b22361e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a7e957-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.856 182939 DEBUG os_vif [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=92a7e957-ed27-4696-8db4-46047f344d0a,network=Network(0635e581-a43c-4a3a-8490-6d5b22361e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a7e957-ed') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:19:08 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.859 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:08 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.860 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92a7e957-ed, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.861 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.863 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.868 182939 INFO os_vif [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:b3:f6,bridge_name='br-int',has_traffic_filtering=True,id=92a7e957-ed27-4696-8db4-46047f344d0a,network=Network(0635e581-a43c-4a3a-8490-6d5b22361e4e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92a7e957-ed')
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.869 182939 INFO nova.virt.libvirt.driver [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Deleting instance files /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852_del
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.870 182939 INFO nova.virt.libvirt.driver [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Deletion of /var/lib/nova/instances/00f1dbef-5c82-4287-9df6-2fca347c2852_del complete
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.876 182939 DEBUG nova.compute.manager [req-be464eb7-f872-4d65-a9e5-2ee8c7e98af4 req-fcf373a5-3a3c-4cce-83d4-8556b4826151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Received event network-vif-unplugged-92a7e957-ed27-4696-8db4-46047f344d0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.876 182939 DEBUG oslo_concurrency.lockutils [req-be464eb7-f872-4d65-a9e5-2ee8c7e98af4 req-fcf373a5-3a3c-4cce-83d4-8556b4826151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.877 182939 DEBUG oslo_concurrency.lockutils [req-be464eb7-f872-4d65-a9e5-2ee8c7e98af4 req-fcf373a5-3a3c-4cce-83d4-8556b4826151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.877 182939 DEBUG oslo_concurrency.lockutils [req-be464eb7-f872-4d65-a9e5-2ee8c7e98af4 req-fcf373a5-3a3c-4cce-83d4-8556b4826151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.877 182939 DEBUG nova.compute.manager [req-be464eb7-f872-4d65-a9e5-2ee8c7e98af4 req-fcf373a5-3a3c-4cce-83d4-8556b4826151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] No waiting events found dispatching network-vif-unplugged-92a7e957-ed27-4696-8db4-46047f344d0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:08 compute-0 nova_compute[182935]: 2026-01-22 00:19:08.877 182939 DEBUG nova.compute.manager [req-be464eb7-f872-4d65-a9e5-2ee8c7e98af4 req-fcf373a5-3a3c-4cce-83d4-8556b4826151 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Received event network-vif-unplugged-92a7e957-ed27-4696-8db4-46047f344d0a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:19:09 compute-0 nova_compute[182935]: 2026-01-22 00:19:09.118 182939 INFO nova.compute.manager [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Took 0.56 seconds to destroy the instance on the hypervisor.
Jan 22 00:19:09 compute-0 nova_compute[182935]: 2026-01-22 00:19:09.119 182939 DEBUG oslo.service.loopingcall [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:19:09 compute-0 nova_compute[182935]: 2026-01-22 00:19:09.119 182939 DEBUG nova.compute.manager [-] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:19:09 compute-0 nova_compute[182935]: 2026-01-22 00:19:09.119 182939 DEBUG nova.network.neutron [-] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:19:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4-userdata-shm.mount: Deactivated successfully.
Jan 22 00:19:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-ad7653be67caa0d848eb68b662e1b080714c162f2a0c08e80e93121176450568-merged.mount: Deactivated successfully.
Jan 22 00:19:09 compute-0 podman[234855]: 2026-01-22 00:19:09.161752978 +0000 UTC m=+0.418969978 container cleanup 0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:19:09 compute-0 podman[234900]: 2026-01-22 00:19:09.793181776 +0000 UTC m=+0.609610509 container remove 0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:19:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:09.800 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[197d61e8-e0d0-4ae0-9a2f-b8a4a7d43fcf]: (4, ('Thu Jan 22 12:19:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e (0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4)\n0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4\nThu Jan 22 12:19:09 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e (0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4)\n0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:09.803 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0bbced67-5c25-4698-b5c3-cb5d8821f937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:09.804 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0635e581-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:09 compute-0 systemd[1]: libpod-conmon-0d446a958ced5c64b0585665b1b946091b38722f574532efe5bf8d6773b0e6d4.scope: Deactivated successfully.
Jan 22 00:19:09 compute-0 kernel: tap0635e581-a0: left promiscuous mode
Jan 22 00:19:09 compute-0 nova_compute[182935]: 2026-01-22 00:19:09.857 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:09 compute-0 nova_compute[182935]: 2026-01-22 00:19:09.869 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:09.875 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a53683af-c87d-4daa-9217-ae0854fc3105]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:09.887 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[514337b1-2059-46ce-b2ad-19b42845cb44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:09.888 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8a9c56bd-8267-4dd4-bb67-2f5b61c944c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:09.904 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3b7c1a-63a5-4af5-a9e4-68bbd055f9a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562344, 'reachable_time': 18475, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234916, 'error': None, 'target': 'ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d0635e581\x2da43c\x2d4a3a\x2d8490\x2d6d5b22361e4e.mount: Deactivated successfully.
Jan 22 00:19:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:09.906 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0635e581-a43c-4a3a-8490-6d5b22361e4e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:19:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:09.906 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[aa92c208-6e38-44e4-ab1e-a1e0c1f9001d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.102 182939 DEBUG nova.network.neutron [req-4edd0606-9c57-4fd7-9ba4-07ad5d5ebb52 req-84a6baeb-f945-449c-87fe-77e1baa46a0b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Updated VIF entry in instance network info cache for port 92a7e957-ed27-4696-8db4-46047f344d0a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.102 182939 DEBUG nova.network.neutron [req-4edd0606-9c57-4fd7-9ba4-07ad5d5ebb52 req-84a6baeb-f945-449c-87fe-77e1baa46a0b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Updating instance_info_cache with network_info: [{"id": "92a7e957-ed27-4696-8db4-46047f344d0a", "address": "fa:16:3e:f2:b3:f6", "network": {"id": "0635e581-a43c-4a3a-8490-6d5b22361e4e", "bridge": "br-int", "label": "tempest-network-smoke--1502264170", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34b96b4037d24a0ea19383ca2477b2fd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92a7e957-ed", "ovs_interfaceid": "92a7e957-ed27-4696-8db4-46047f344d0a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.126 182939 DEBUG oslo_concurrency.lockutils [req-4edd0606-9c57-4fd7-9ba4-07ad5d5ebb52 req-84a6baeb-f945-449c-87fe-77e1baa46a0b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-00f1dbef-5c82-4287-9df6-2fca347c2852" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.397 182939 DEBUG nova.network.neutron [-] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.621 182939 INFO nova.compute.manager [-] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Took 1.50 seconds to deallocate network for instance.
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.666 182939 DEBUG nova.compute.manager [req-1cc8f8a4-321d-4df1-b5e9-14483bedb652 req-b48d110f-3495-4467-a021-6aa85b8d9751 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Received event network-vif-deleted-92a7e957-ed27-4696-8db4-46047f344d0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.667 182939 INFO nova.compute.manager [req-1cc8f8a4-321d-4df1-b5e9-14483bedb652 req-b48d110f-3495-4467-a021-6aa85b8d9751 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Neutron deleted interface 92a7e957-ed27-4696-8db4-46047f344d0a; detaching it from the instance and deleting it from the info cache
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.667 182939 DEBUG nova.network.neutron [req-1cc8f8a4-321d-4df1-b5e9-14483bedb652 req-b48d110f-3495-4467-a021-6aa85b8d9751 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.787 182939 DEBUG nova.compute.manager [req-1cc8f8a4-321d-4df1-b5e9-14483bedb652 req-b48d110f-3495-4467-a021-6aa85b8d9751 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Detach interface failed, port_id=92a7e957-ed27-4696-8db4-46047f344d0a, reason: Instance 00f1dbef-5c82-4287-9df6-2fca347c2852 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.909 182939 DEBUG oslo_concurrency.lockutils [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.910 182939 DEBUG oslo_concurrency.lockutils [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:10 compute-0 nova_compute[182935]: 2026-01-22 00:19:10.987 182939 DEBUG nova.compute.provider_tree [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:19:11 compute-0 nova_compute[182935]: 2026-01-22 00:19:11.006 182939 DEBUG nova.scheduler.client.report [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:19:11 compute-0 nova_compute[182935]: 2026-01-22 00:19:11.041 182939 DEBUG oslo_concurrency.lockutils [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:11 compute-0 nova_compute[182935]: 2026-01-22 00:19:11.046 182939 DEBUG nova.compute.manager [req-f422bb46-fd4e-4552-abed-0f6d3370f896 req-17c2e23c-5d75-4a88-9e31-76d423ee77d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Received event network-vif-plugged-92a7e957-ed27-4696-8db4-46047f344d0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:11 compute-0 nova_compute[182935]: 2026-01-22 00:19:11.046 182939 DEBUG oslo_concurrency.lockutils [req-f422bb46-fd4e-4552-abed-0f6d3370f896 req-17c2e23c-5d75-4a88-9e31-76d423ee77d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:11 compute-0 nova_compute[182935]: 2026-01-22 00:19:11.046 182939 DEBUG oslo_concurrency.lockutils [req-f422bb46-fd4e-4552-abed-0f6d3370f896 req-17c2e23c-5d75-4a88-9e31-76d423ee77d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:11 compute-0 nova_compute[182935]: 2026-01-22 00:19:11.047 182939 DEBUG oslo_concurrency.lockutils [req-f422bb46-fd4e-4552-abed-0f6d3370f896 req-17c2e23c-5d75-4a88-9e31-76d423ee77d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:11 compute-0 nova_compute[182935]: 2026-01-22 00:19:11.047 182939 DEBUG nova.compute.manager [req-f422bb46-fd4e-4552-abed-0f6d3370f896 req-17c2e23c-5d75-4a88-9e31-76d423ee77d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] No waiting events found dispatching network-vif-plugged-92a7e957-ed27-4696-8db4-46047f344d0a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:11 compute-0 nova_compute[182935]: 2026-01-22 00:19:11.047 182939 WARNING nova.compute.manager [req-f422bb46-fd4e-4552-abed-0f6d3370f896 req-17c2e23c-5d75-4a88-9e31-76d423ee77d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Received unexpected event network-vif-plugged-92a7e957-ed27-4696-8db4-46047f344d0a for instance with vm_state deleted and task_state None.
Jan 22 00:19:11 compute-0 nova_compute[182935]: 2026-01-22 00:19:11.081 182939 INFO nova.scheduler.client.report [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Deleted allocations for instance 00f1dbef-5c82-4287-9df6-2fca347c2852
Jan 22 00:19:11 compute-0 nova_compute[182935]: 2026-01-22 00:19:11.186 182939 DEBUG oslo_concurrency.lockutils [None req-277362fd-c186-4a88-957f-464ff21e24df 833f1e9dce90456ea55a443da6704907 34b96b4037d24a0ea19383ca2477b2fd - - default default] Lock "00f1dbef-5c82-4287-9df6-2fca347c2852" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.650s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:12 compute-0 nova_compute[182935]: 2026-01-22 00:19:12.826 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:13 compute-0 podman[234919]: 2026-01-22 00:19:13.682690924 +0000 UTC m=+0.054238833 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:19:13 compute-0 podman[234918]: 2026-01-22 00:19:13.76397251 +0000 UTC m=+0.135342615 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:19:13 compute-0 nova_compute[182935]: 2026-01-22 00:19:13.863 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:16 compute-0 podman[234967]: 2026-01-22 00:19:16.679892303 +0000 UTC m=+0.056196900 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:19:17 compute-0 ovn_controller[95047]: 2026-01-22T00:19:17Z|00530|binding|INFO|Releasing lport 7a8a9d8d-ee83-413b-aa49-79d9d4a16ceb from this chassis (sb_readonly=0)
Jan 22 00:19:17 compute-0 nova_compute[182935]: 2026-01-22 00:19:17.434 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:17 compute-0 nova_compute[182935]: 2026-01-22 00:19:17.831 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:18 compute-0 nova_compute[182935]: 2026-01-22 00:19:18.865 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:22 compute-0 podman[234993]: 2026-01-22 00:19:22.681615901 +0000 UTC m=+0.055492582 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:19:22 compute-0 nova_compute[182935]: 2026-01-22 00:19:22.834 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:23 compute-0 nova_compute[182935]: 2026-01-22 00:19:23.835 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041148.8334298, 00f1dbef-5c82-4287-9df6-2fca347c2852 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:23 compute-0 nova_compute[182935]: 2026-01-22 00:19:23.835 182939 INFO nova.compute.manager [-] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] VM Stopped (Lifecycle Event)
Jan 22 00:19:23 compute-0 nova_compute[182935]: 2026-01-22 00:19:23.869 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:23 compute-0 nova_compute[182935]: 2026-01-22 00:19:23.871 182939 DEBUG nova.compute.manager [None req-5065031c-34ea-4f79-917e-5d0aaa2a0eeb - - - - - -] [instance: 00f1dbef-5c82-4287-9df6-2fca347c2852] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:23 compute-0 nova_compute[182935]: 2026-01-22 00:19:23.909 182939 DEBUG nova.compute.manager [req-945dbed2-5446-4fa2-a103-904672c4cf30 req-2673e8d1-c340-4223-b8a2-b184224a3958 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Received event network-changed-74941ef0-6243-4016-8f33-3e7e739ec086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:23 compute-0 nova_compute[182935]: 2026-01-22 00:19:23.910 182939 DEBUG nova.compute.manager [req-945dbed2-5446-4fa2-a103-904672c4cf30 req-2673e8d1-c340-4223-b8a2-b184224a3958 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Refreshing instance network info cache due to event network-changed-74941ef0-6243-4016-8f33-3e7e739ec086. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:19:23 compute-0 nova_compute[182935]: 2026-01-22 00:19:23.910 182939 DEBUG oslo_concurrency.lockutils [req-945dbed2-5446-4fa2-a103-904672c4cf30 req-2673e8d1-c340-4223-b8a2-b184224a3958 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:23 compute-0 nova_compute[182935]: 2026-01-22 00:19:23.910 182939 DEBUG oslo_concurrency.lockutils [req-945dbed2-5446-4fa2-a103-904672c4cf30 req-2673e8d1-c340-4223-b8a2-b184224a3958 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:23 compute-0 nova_compute[182935]: 2026-01-22 00:19:23.910 182939 DEBUG nova.network.neutron [req-945dbed2-5446-4fa2-a103-904672c4cf30 req-2673e8d1-c340-4223-b8a2-b184224a3958 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Refreshing network info cache for port 74941ef0-6243-4016-8f33-3e7e739ec086 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.130 182939 DEBUG oslo_concurrency.lockutils [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Acquiring lock "9fc1259e-0d28-405b-b42e-35f73659ff76" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.131 182939 DEBUG oslo_concurrency.lockutils [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.131 182939 DEBUG oslo_concurrency.lockutils [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Acquiring lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.131 182939 DEBUG oslo_concurrency.lockutils [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.132 182939 DEBUG oslo_concurrency.lockutils [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.144 182939 INFO nova.compute.manager [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Terminating instance
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.155 182939 DEBUG nova.compute.manager [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:19:24 compute-0 kernel: tap74941ef0-62 (unregistering): left promiscuous mode
Jan 22 00:19:24 compute-0 NetworkManager[55139]: <info>  [1769041164.1814] device (tap74941ef0-62): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:19:24 compute-0 ovn_controller[95047]: 2026-01-22T00:19:24Z|00531|binding|INFO|Releasing lport 74941ef0-6243-4016-8f33-3e7e739ec086 from this chassis (sb_readonly=0)
Jan 22 00:19:24 compute-0 ovn_controller[95047]: 2026-01-22T00:19:24Z|00532|binding|INFO|Setting lport 74941ef0-6243-4016-8f33-3e7e739ec086 down in Southbound
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.187 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:24 compute-0 ovn_controller[95047]: 2026-01-22T00:19:24Z|00533|binding|INFO|Removing iface tap74941ef0-62 ovn-installed in OVS
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.189 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:24.195 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:e5:17 10.100.0.7'], port_security=['fa:16:3e:62:e5:17 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9fc1259e-0d28-405b-b42e-35f73659ff76', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0177ab1d-de86-4d4b-a23f-905845c27092', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '828d60daff104fcdab0f25aef8cdb46b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6ec2ef71-8956-4ce0-ac19-3334ac3e5f49 d332bd2c-735d-4b3b-97e1-e788fe54f920', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c7fb41-1c96-4f6d-a404-63df887f5b54, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=74941ef0-6243-4016-8f33-3e7e739ec086) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:19:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:24.196 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 74941ef0-6243-4016-8f33-3e7e739ec086 in datapath 0177ab1d-de86-4d4b-a23f-905845c27092 unbound from our chassis
Jan 22 00:19:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:24.198 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0177ab1d-de86-4d4b-a23f-905845c27092, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:19:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:24.199 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2fe1a088-fbd2-43c9-8045-da009509316c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:24.199 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092 namespace which is not needed anymore
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.205 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:24 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000090.scope: Deactivated successfully.
Jan 22 00:19:24 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000090.scope: Consumed 14.354s CPU time.
Jan 22 00:19:24 compute-0 systemd-machined[154182]: Machine qemu-71-instance-00000090 terminated.
Jan 22 00:19:24 compute-0 neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092[234741]: [NOTICE]   (234745) : haproxy version is 2.8.14-c23fe91
Jan 22 00:19:24 compute-0 neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092[234741]: [NOTICE]   (234745) : path to executable is /usr/sbin/haproxy
Jan 22 00:19:24 compute-0 neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092[234741]: [WARNING]  (234745) : Exiting Master process...
Jan 22 00:19:24 compute-0 neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092[234741]: [WARNING]  (234745) : Exiting Master process...
Jan 22 00:19:24 compute-0 neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092[234741]: [ALERT]    (234745) : Current worker (234747) exited with code 143 (Terminated)
Jan 22 00:19:24 compute-0 neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092[234741]: [WARNING]  (234745) : All workers exited. Exiting... (0)
Jan 22 00:19:24 compute-0 systemd[1]: libpod-cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb.scope: Deactivated successfully.
Jan 22 00:19:24 compute-0 conmon[234741]: conmon cc137c22d80306e6fa3b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb.scope/container/memory.events
Jan 22 00:19:24 compute-0 podman[235038]: 2026-01-22 00:19:24.418129216 +0000 UTC m=+0.127582179 container died cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.423 182939 INFO nova.virt.libvirt.driver [-] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Instance destroyed successfully.
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.423 182939 DEBUG nova.objects.instance [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lazy-loading 'resources' on Instance uuid 9fc1259e-0d28-405b-b42e-35f73659ff76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.442 182939 DEBUG nova.virt.libvirt.vif [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:18:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-852901634-access_point-38128636',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-852901634-access_point-38128636',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-852901634-acc',id=144,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDKro6oqpFAId7ik5Vn6WyyZp3vrK9KUyorcKTmJ6BBxHaUYJDX2zHomAAcWTG9p33Yti1tDUxA4PhApZ9mTehr2h9TZzHbjMWw71YFlUqJWo4dGRNgyn1ZQz6co9PPZaA==',key_name='tempest-TestSecurityGroupsBasicOps-1844760784',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:18:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='828d60daff104fcdab0f25aef8cdb46b',ramdisk_id='',reservation_id='r-qwgk6q48',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-852901634',owner_user_name='tempest-TestSecurityGroupsBasicOps-852901634-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:18:53Z,user_data=None,user_id='793e503e125c457d8f4082ecd2e4a391',uuid=9fc1259e-0d28-405b-b42e-35f73659ff76,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.442 182939 DEBUG nova.network.os_vif_util [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Converting VIF {"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.443 182939 DEBUG nova.network.os_vif_util [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:e5:17,bridge_name='br-int',has_traffic_filtering=True,id=74941ef0-6243-4016-8f33-3e7e739ec086,network=Network(0177ab1d-de86-4d4b-a23f-905845c27092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74941ef0-62') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.443 182939 DEBUG os_vif [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:e5:17,bridge_name='br-int',has_traffic_filtering=True,id=74941ef0-6243-4016-8f33-3e7e739ec086,network=Network(0177ab1d-de86-4d4b-a23f-905845c27092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74941ef0-62') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.444 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.444 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74941ef0-62, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.446 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.447 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.449 182939 INFO os_vif [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:e5:17,bridge_name='br-int',has_traffic_filtering=True,id=74941ef0-6243-4016-8f33-3e7e739ec086,network=Network(0177ab1d-de86-4d4b-a23f-905845c27092),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74941ef0-62')
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.450 182939 INFO nova.virt.libvirt.driver [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Deleting instance files /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76_del
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.450 182939 INFO nova.virt.libvirt.driver [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Deletion of /var/lib/nova/instances/9fc1259e-0d28-405b-b42e-35f73659ff76_del complete
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.476 182939 DEBUG nova.compute.manager [req-2e00b698-26ed-432a-9d29-8e38220498b1 req-d39f2e7c-1198-4727-92cc-4568a74a0076 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Received event network-vif-unplugged-74941ef0-6243-4016-8f33-3e7e739ec086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.477 182939 DEBUG oslo_concurrency.lockutils [req-2e00b698-26ed-432a-9d29-8e38220498b1 req-d39f2e7c-1198-4727-92cc-4568a74a0076 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.477 182939 DEBUG oslo_concurrency.lockutils [req-2e00b698-26ed-432a-9d29-8e38220498b1 req-d39f2e7c-1198-4727-92cc-4568a74a0076 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.477 182939 DEBUG oslo_concurrency.lockutils [req-2e00b698-26ed-432a-9d29-8e38220498b1 req-d39f2e7c-1198-4727-92cc-4568a74a0076 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.477 182939 DEBUG nova.compute.manager [req-2e00b698-26ed-432a-9d29-8e38220498b1 req-d39f2e7c-1198-4727-92cc-4568a74a0076 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] No waiting events found dispatching network-vif-unplugged-74941ef0-6243-4016-8f33-3e7e739ec086 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.478 182939 DEBUG nova.compute.manager [req-2e00b698-26ed-432a-9d29-8e38220498b1 req-d39f2e7c-1198-4727-92cc-4568a74a0076 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Received event network-vif-unplugged-74941ef0-6243-4016-8f33-3e7e739ec086 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.610 182939 INFO nova.compute.manager [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.611 182939 DEBUG oslo.service.loopingcall [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.611 182939 DEBUG nova.compute.manager [-] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:19:24 compute-0 nova_compute[182935]: 2026-01-22 00:19:24.611 182939 DEBUG nova.network.neutron [-] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:19:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb-userdata-shm.mount: Deactivated successfully.
Jan 22 00:19:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-c8cc850cb8069588c7e66b5ac7d1dd2fee352a951a90dbb1242c725e06c49905-merged.mount: Deactivated successfully.
Jan 22 00:19:25 compute-0 nova_compute[182935]: 2026-01-22 00:19:25.745 182939 DEBUG nova.network.neutron [req-945dbed2-5446-4fa2-a103-904672c4cf30 req-2673e8d1-c340-4223-b8a2-b184224a3958 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Updated VIF entry in instance network info cache for port 74941ef0-6243-4016-8f33-3e7e739ec086. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:19:25 compute-0 nova_compute[182935]: 2026-01-22 00:19:25.746 182939 DEBUG nova.network.neutron [req-945dbed2-5446-4fa2-a103-904672c4cf30 req-2673e8d1-c340-4223-b8a2-b184224a3958 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Updating instance_info_cache with network_info: [{"id": "74941ef0-6243-4016-8f33-3e7e739ec086", "address": "fa:16:3e:62:e5:17", "network": {"id": "0177ab1d-de86-4d4b-a23f-905845c27092", "bridge": "br-int", "label": "tempest-network-smoke--1371204854", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "828d60daff104fcdab0f25aef8cdb46b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74941ef0-62", "ovs_interfaceid": "74941ef0-6243-4016-8f33-3e7e739ec086", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:25 compute-0 nova_compute[182935]: 2026-01-22 00:19:25.839 182939 DEBUG oslo_concurrency.lockutils [req-945dbed2-5446-4fa2-a103-904672c4cf30 req-2673e8d1-c340-4223-b8a2-b184224a3958 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-9fc1259e-0d28-405b-b42e-35f73659ff76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:26 compute-0 podman[235038]: 2026-01-22 00:19:26.142123993 +0000 UTC m=+1.851576956 container cleanup cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.147 182939 DEBUG nova.network.neutron [-] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:26 compute-0 systemd[1]: libpod-conmon-cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb.scope: Deactivated successfully.
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.165 182939 INFO nova.compute.manager [-] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Took 1.55 seconds to deallocate network for instance.
Jan 22 00:19:26 compute-0 podman[235085]: 2026-01-22 00:19:26.242411751 +0000 UTC m=+0.057764606 container remove cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:19:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:26.249 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5c4fa954-964b-479e-9706-eaa9161b4365]: (4, ('Thu Jan 22 12:19:24 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092 (cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb)\ncc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb\nThu Jan 22 12:19:26 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092 (cc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb)\ncc137c22d80306e6fa3b1b885dde97f64d4fe06ac5e0de52ca1f6e722601a6bb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:26.251 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e33197f7-1715-4be8-af30-a6f03f873eda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:26.253 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0177ab1d-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.324 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:26 compute-0 kernel: tap0177ab1d-d0: left promiscuous mode
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.326 182939 DEBUG oslo_concurrency.lockutils [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.326 182939 DEBUG oslo_concurrency.lockutils [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.337 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:26.338 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[98274a5e-f63d-4ef5-8424-b1023c759acf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:26.360 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[283a7c6d-b9ff-4f10-90dc-c3aac178a392]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:26.362 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[feb5af72-8358-42e3-849f-1009f5a7ce1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:26.378 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ccdf0393-b2dd-4c0a-ba1c-2613439e0fdb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563479, 'reachable_time': 34679, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235101, 'error': None, 'target': 'ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:26.381 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0177ab1d-de86-4d4b-a23f-905845c27092 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:19:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:26.381 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[1552ee64-f4cf-43fc-b6a1-cb082684cf68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d0177ab1d\x2dde86\x2d4d4b\x2da23f\x2d905845c27092.mount: Deactivated successfully.
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.446 182939 DEBUG nova.compute.provider_tree [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.470 182939 DEBUG nova.scheduler.client.report [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.506 182939 DEBUG oslo_concurrency.lockutils [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.557 182939 INFO nova.scheduler.client.report [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Deleted allocations for instance 9fc1259e-0d28-405b-b42e-35f73659ff76
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.646 182939 DEBUG oslo_concurrency.lockutils [None req-bcd9e8b1-8897-4caa-bbdc-f6c4dd7430af 793e503e125c457d8f4082ecd2e4a391 828d60daff104fcdab0f25aef8cdb46b - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.687 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.947 182939 DEBUG nova.compute.manager [req-52a49e40-5439-43b1-9796-300279c1696d req-9c046d38-8ff3-4f1c-97b0-2d00222493a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Received event network-vif-plugged-74941ef0-6243-4016-8f33-3e7e739ec086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.947 182939 DEBUG oslo_concurrency.lockutils [req-52a49e40-5439-43b1-9796-300279c1696d req-9c046d38-8ff3-4f1c-97b0-2d00222493a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.948 182939 DEBUG oslo_concurrency.lockutils [req-52a49e40-5439-43b1-9796-300279c1696d req-9c046d38-8ff3-4f1c-97b0-2d00222493a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.948 182939 DEBUG oslo_concurrency.lockutils [req-52a49e40-5439-43b1-9796-300279c1696d req-9c046d38-8ff3-4f1c-97b0-2d00222493a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "9fc1259e-0d28-405b-b42e-35f73659ff76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.948 182939 DEBUG nova.compute.manager [req-52a49e40-5439-43b1-9796-300279c1696d req-9c046d38-8ff3-4f1c-97b0-2d00222493a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] No waiting events found dispatching network-vif-plugged-74941ef0-6243-4016-8f33-3e7e739ec086 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.949 182939 WARNING nova.compute.manager [req-52a49e40-5439-43b1-9796-300279c1696d req-9c046d38-8ff3-4f1c-97b0-2d00222493a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Received unexpected event network-vif-plugged-74941ef0-6243-4016-8f33-3e7e739ec086 for instance with vm_state deleted and task_state None.
Jan 22 00:19:26 compute-0 nova_compute[182935]: 2026-01-22 00:19:26.949 182939 DEBUG nova.compute.manager [req-52a49e40-5439-43b1-9796-300279c1696d req-9c046d38-8ff3-4f1c-97b0-2d00222493a3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Received event network-vif-deleted-74941ef0-6243-4016-8f33-3e7e739ec086 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:27 compute-0 nova_compute[182935]: 2026-01-22 00:19:27.836 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:28 compute-0 podman[235102]: 2026-01-22 00:19:28.723423156 +0000 UTC m=+0.086865910 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1755695350, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Jan 22 00:19:28 compute-0 podman[235103]: 2026-01-22 00:19:28.725628529 +0000 UTC m=+0.084299639 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:19:29 compute-0 nova_compute[182935]: 2026-01-22 00:19:29.447 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.354 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "af4c2559-6129-4c4e-92f5-c991d58e4225" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.355 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.383 182939 DEBUG nova.compute.manager [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.516 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.517 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.524 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.525 182939 INFO nova.compute.claims [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.668 182939 DEBUG nova.compute.provider_tree [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.689 182939 DEBUG nova.scheduler.client.report [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.711 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.713 182939 DEBUG nova.compute.manager [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.812 182939 DEBUG nova.compute.manager [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.812 182939 DEBUG nova.network.neutron [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.833 182939 INFO nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.851 182939 DEBUG nova.compute.manager [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 22 00:19:31 compute-0 nova_compute[182935]: 2026-01-22 00:19:31.855 182939 DEBUG nova.compute.manager [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.179 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.180 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.184 182939 DEBUG nova.policy [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5eb4e81f0cef4003ae49faa67b3f17c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3e408650207b498c8d115fd0c4f776dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.187 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.496 182939 DEBUG nova.objects.instance [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'pci_requests' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.523 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.524 182939 INFO nova.compute.claims [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.524 182939 DEBUG nova.objects.instance [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'resources' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.527 182939 DEBUG nova.compute.manager [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.528 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.529 182939 INFO nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Creating image(s)
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.530 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "/var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.530 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.531 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "/var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.543 182939 DEBUG nova.objects.instance [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'numa_topology' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.544 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.584 182939 DEBUG nova.objects.instance [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'pci_devices' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.610 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.611 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.611 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.627 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.648 182939 INFO nova.compute.resource_tracker [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating resource usage from migration 70071767-0c89-48d9-81cd-be888f6d084e
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.649 182939 DEBUG nova.compute.resource_tracker [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Starting to track incoming migration 70071767-0c89-48d9-81cd-be888f6d084e with flavor c3389c03-89c4-4ff5-9e03-1a99d41713d4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.686 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.687 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.796 182939 DEBUG nova.compute.provider_tree [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.816 182939 DEBUG nova.scheduler.client.report [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.833 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk 1073741824" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.834 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.834 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.855 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.858 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.858 182939 INFO nova.compute.manager [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Migrating
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.895 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.896 182939 DEBUG nova.virt.disk.api [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Checking if we can resize image /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.896 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.954 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.955 182939 DEBUG nova.virt.disk.api [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Cannot resize image /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.956 182939 DEBUG nova.objects.instance [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'migration_context' on Instance uuid af4c2559-6129-4c4e-92f5-c991d58e4225 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.970 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.971 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Ensure instance console log exists: /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.971 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.972 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:32 compute-0 nova_compute[182935]: 2026-01-22 00:19:32.972 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:34 compute-0 nova_compute[182935]: 2026-01-22 00:19:34.008 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:34 compute-0 nova_compute[182935]: 2026-01-22 00:19:34.187 182939 DEBUG nova.network.neutron [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Successfully created port: 1703eb18-c425-4648-b9be-39cfa1090d95 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:19:34 compute-0 nova_compute[182935]: 2026-01-22 00:19:34.233 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:34 compute-0 nova_compute[182935]: 2026-01-22 00:19:34.448 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:36 compute-0 sshd-session[235160]: Accepted publickey for nova from 192.168.122.101 port 37618 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:19:36 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 00:19:36 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 00:19:36 compute-0 systemd-logind[784]: New session 55 of user nova.
Jan 22 00:19:36 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 00:19:36 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 22 00:19:36 compute-0 systemd[235164]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:19:36 compute-0 systemd[235164]: Queued start job for default target Main User Target.
Jan 22 00:19:36 compute-0 systemd[235164]: Created slice User Application Slice.
Jan 22 00:19:36 compute-0 systemd[235164]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:19:36 compute-0 systemd[235164]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 00:19:36 compute-0 systemd[235164]: Reached target Paths.
Jan 22 00:19:36 compute-0 systemd[235164]: Reached target Timers.
Jan 22 00:19:36 compute-0 systemd[235164]: Starting D-Bus User Message Bus Socket...
Jan 22 00:19:36 compute-0 systemd[235164]: Starting Create User's Volatile Files and Directories...
Jan 22 00:19:36 compute-0 systemd[235164]: Finished Create User's Volatile Files and Directories.
Jan 22 00:19:36 compute-0 systemd[235164]: Listening on D-Bus User Message Bus Socket.
Jan 22 00:19:36 compute-0 systemd[235164]: Reached target Sockets.
Jan 22 00:19:36 compute-0 systemd[235164]: Reached target Basic System.
Jan 22 00:19:36 compute-0 systemd[235164]: Reached target Main User Target.
Jan 22 00:19:36 compute-0 systemd[235164]: Startup finished in 124ms.
Jan 22 00:19:36 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 22 00:19:36 compute-0 systemd[1]: Started Session 55 of User nova.
Jan 22 00:19:36 compute-0 sshd-session[235160]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:19:36 compute-0 sshd-session[235179]: Received disconnect from 192.168.122.101 port 37618:11: disconnected by user
Jan 22 00:19:36 compute-0 sshd-session[235179]: Disconnected from user nova 192.168.122.101 port 37618
Jan 22 00:19:36 compute-0 sshd-session[235160]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:19:36 compute-0 systemd[1]: session-55.scope: Deactivated successfully.
Jan 22 00:19:36 compute-0 systemd-logind[784]: Session 55 logged out. Waiting for processes to exit.
Jan 22 00:19:36 compute-0 systemd-logind[784]: Removed session 55.
Jan 22 00:19:36 compute-0 sshd-session[235181]: Accepted publickey for nova from 192.168.122.101 port 37620 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:19:36 compute-0 systemd-logind[784]: New session 57 of user nova.
Jan 22 00:19:36 compute-0 systemd[1]: Started Session 57 of User nova.
Jan 22 00:19:36 compute-0 sshd-session[235181]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:19:36 compute-0 sshd-session[235184]: Received disconnect from 192.168.122.101 port 37620:11: disconnected by user
Jan 22 00:19:36 compute-0 sshd-session[235184]: Disconnected from user nova 192.168.122.101 port 37620
Jan 22 00:19:36 compute-0 sshd-session[235181]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:19:36 compute-0 systemd[1]: session-57.scope: Deactivated successfully.
Jan 22 00:19:36 compute-0 systemd-logind[784]: Session 57 logged out. Waiting for processes to exit.
Jan 22 00:19:36 compute-0 systemd-logind[784]: Removed session 57.
Jan 22 00:19:37 compute-0 nova_compute[182935]: 2026-01-22 00:19:37.266 182939 DEBUG nova.network.neutron [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Successfully updated port: 1703eb18-c425-4648-b9be-39cfa1090d95 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:19:37 compute-0 nova_compute[182935]: 2026-01-22 00:19:37.346 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "refresh_cache-af4c2559-6129-4c4e-92f5-c991d58e4225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:37 compute-0 nova_compute[182935]: 2026-01-22 00:19:37.346 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquired lock "refresh_cache-af4c2559-6129-4c4e-92f5-c991d58e4225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:37 compute-0 nova_compute[182935]: 2026-01-22 00:19:37.347 182939 DEBUG nova.network.neutron [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:19:37 compute-0 nova_compute[182935]: 2026-01-22 00:19:37.417 182939 DEBUG nova.compute.manager [req-7b97a69d-c4db-42d0-8566-e9c1abd1dfcc req-365871cd-1442-4649-a04f-dd69893bac1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Received event network-changed-1703eb18-c425-4648-b9be-39cfa1090d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:37 compute-0 nova_compute[182935]: 2026-01-22 00:19:37.417 182939 DEBUG nova.compute.manager [req-7b97a69d-c4db-42d0-8566-e9c1abd1dfcc req-365871cd-1442-4649-a04f-dd69893bac1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Refreshing instance network info cache due to event network-changed-1703eb18-c425-4648-b9be-39cfa1090d95. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:19:37 compute-0 nova_compute[182935]: 2026-01-22 00:19:37.417 182939 DEBUG oslo_concurrency.lockutils [req-7b97a69d-c4db-42d0-8566-e9c1abd1dfcc req-365871cd-1442-4649-a04f-dd69893bac1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-af4c2559-6129-4c4e-92f5-c991d58e4225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:37 compute-0 nova_compute[182935]: 2026-01-22 00:19:37.651 182939 DEBUG nova.network.neutron [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:19:37 compute-0 nova_compute[182935]: 2026-01-22 00:19:37.839 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:38 compute-0 sshd-session[235186]: Invalid user redis from 188.166.69.60 port 51566
Jan 22 00:19:38 compute-0 sshd-session[235186]: Connection closed by invalid user redis 188.166.69.60 port 51566 [preauth]
Jan 22 00:19:39 compute-0 nova_compute[182935]: 2026-01-22 00:19:39.422 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041164.4206574, 9fc1259e-0d28-405b-b42e-35f73659ff76 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:39 compute-0 nova_compute[182935]: 2026-01-22 00:19:39.422 182939 INFO nova.compute.manager [-] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] VM Stopped (Lifecycle Event)
Jan 22 00:19:39 compute-0 nova_compute[182935]: 2026-01-22 00:19:39.441 182939 DEBUG nova.compute.manager [None req-a2e33298-35d5-48e6-bb4f-0c77bcf4d804 - - - - - -] [instance: 9fc1259e-0d28-405b-b42e-35f73659ff76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:39 compute-0 nova_compute[182935]: 2026-01-22 00:19:39.451 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:39 compute-0 sshd-session[235188]: Accepted publickey for nova from 192.168.122.101 port 37636 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:19:39 compute-0 systemd-logind[784]: New session 58 of user nova.
Jan 22 00:19:40 compute-0 systemd[1]: Started Session 58 of User nova.
Jan 22 00:19:40 compute-0 sshd-session[235188]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.255 182939 DEBUG nova.network.neutron [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Updating instance_info_cache with network_info: [{"id": "1703eb18-c425-4648-b9be-39cfa1090d95", "address": "fa:16:3e:56:8b:61", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1703eb18-c4", "ovs_interfaceid": "1703eb18-c425-4648-b9be-39cfa1090d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.365 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Releasing lock "refresh_cache-af4c2559-6129-4c4e-92f5-c991d58e4225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.366 182939 DEBUG nova.compute.manager [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Instance network_info: |[{"id": "1703eb18-c425-4648-b9be-39cfa1090d95", "address": "fa:16:3e:56:8b:61", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1703eb18-c4", "ovs_interfaceid": "1703eb18-c425-4648-b9be-39cfa1090d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.366 182939 DEBUG oslo_concurrency.lockutils [req-7b97a69d-c4db-42d0-8566-e9c1abd1dfcc req-365871cd-1442-4649-a04f-dd69893bac1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-af4c2559-6129-4c4e-92f5-c991d58e4225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.367 182939 DEBUG nova.network.neutron [req-7b97a69d-c4db-42d0-8566-e9c1abd1dfcc req-365871cd-1442-4649-a04f-dd69893bac1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Refreshing network info cache for port 1703eb18-c425-4648-b9be-39cfa1090d95 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.369 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Start _get_guest_xml network_info=[{"id": "1703eb18-c425-4648-b9be-39cfa1090d95", "address": "fa:16:3e:56:8b:61", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1703eb18-c4", "ovs_interfaceid": "1703eb18-c425-4648-b9be-39cfa1090d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.373 182939 WARNING nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.381 182939 DEBUG nova.virt.libvirt.host [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.382 182939 DEBUG nova.virt.libvirt.host [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.385 182939 DEBUG nova.virt.libvirt.host [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.386 182939 DEBUG nova.virt.libvirt.host [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.387 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.387 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.387 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.387 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.388 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.388 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.388 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.388 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.388 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.389 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.389 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.389 182939 DEBUG nova.virt.hardware [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.392 182939 DEBUG nova.virt.libvirt.vif [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:19:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1466679265',display_name='tempest-ServersTestJSON-server-1466679265',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1466679265',id=147,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-3i4g2m0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tag
s=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:19:31Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=af4c2559-6129-4c4e-92f5-c991d58e4225,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1703eb18-c425-4648-b9be-39cfa1090d95", "address": "fa:16:3e:56:8b:61", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1703eb18-c4", "ovs_interfaceid": "1703eb18-c425-4648-b9be-39cfa1090d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.393 182939 DEBUG nova.network.os_vif_util [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "1703eb18-c425-4648-b9be-39cfa1090d95", "address": "fa:16:3e:56:8b:61", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1703eb18-c4", "ovs_interfaceid": "1703eb18-c425-4648-b9be-39cfa1090d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.393 182939 DEBUG nova.network.os_vif_util [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:8b:61,bridge_name='br-int',has_traffic_filtering=True,id=1703eb18-c425-4648-b9be-39cfa1090d95,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1703eb18-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.394 182939 DEBUG nova.objects.instance [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'pci_devices' on Instance uuid af4c2559-6129-4c4e-92f5-c991d58e4225 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:40 compute-0 sshd-session[235191]: Received disconnect from 192.168.122.101 port 37636:11: disconnected by user
Jan 22 00:19:40 compute-0 sshd-session[235191]: Disconnected from user nova 192.168.122.101 port 37636
Jan 22 00:19:40 compute-0 sshd-session[235188]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:19:40 compute-0 systemd[1]: session-58.scope: Deactivated successfully.
Jan 22 00:19:40 compute-0 systemd-logind[784]: Session 58 logged out. Waiting for processes to exit.
Jan 22 00:19:40 compute-0 systemd-logind[784]: Removed session 58.
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.422 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:19:40 compute-0 nova_compute[182935]:   <uuid>af4c2559-6129-4c4e-92f5-c991d58e4225</uuid>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   <name>instance-00000093</name>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <nova:name>tempest-ServersTestJSON-server-1466679265</nova:name>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:19:40</nova:creationTime>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:19:40 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:19:40 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:19:40 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:19:40 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:19:40 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:19:40 compute-0 nova_compute[182935]:         <nova:user uuid="5eb4e81f0cef4003ae49faa67b3f17c3">tempest-ServersTestJSON-374007797-project-member</nova:user>
Jan 22 00:19:40 compute-0 nova_compute[182935]:         <nova:project uuid="3e408650207b498c8d115fd0c4f776dc">tempest-ServersTestJSON-374007797</nova:project>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:19:40 compute-0 nova_compute[182935]:         <nova:port uuid="1703eb18-c425-4648-b9be-39cfa1090d95">
Jan 22 00:19:40 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <system>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <entry name="serial">af4c2559-6129-4c4e-92f5-c991d58e4225</entry>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <entry name="uuid">af4c2559-6129-4c4e-92f5-c991d58e4225</entry>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     </system>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   <os>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   </os>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   <features>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   </features>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk.config"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:56:8b:61"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <target dev="tap1703eb18-c4"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/console.log" append="off"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <video>
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     </video>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:19:40 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:19:40 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:19:40 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:19:40 compute-0 nova_compute[182935]: </domain>
Jan 22 00:19:40 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.423 182939 DEBUG nova.compute.manager [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Preparing to wait for external event network-vif-plugged-1703eb18-c425-4648-b9be-39cfa1090d95 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.423 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.423 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.423 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.424 182939 DEBUG nova.virt.libvirt.vif [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:19:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1466679265',display_name='tempest-ServersTestJSON-server-1466679265',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1466679265',id=147,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-3i4g2m0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-me
mber'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:19:31Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=af4c2559-6129-4c4e-92f5-c991d58e4225,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1703eb18-c425-4648-b9be-39cfa1090d95", "address": "fa:16:3e:56:8b:61", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1703eb18-c4", "ovs_interfaceid": "1703eb18-c425-4648-b9be-39cfa1090d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.424 182939 DEBUG nova.network.os_vif_util [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "1703eb18-c425-4648-b9be-39cfa1090d95", "address": "fa:16:3e:56:8b:61", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1703eb18-c4", "ovs_interfaceid": "1703eb18-c425-4648-b9be-39cfa1090d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.425 182939 DEBUG nova.network.os_vif_util [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:8b:61,bridge_name='br-int',has_traffic_filtering=True,id=1703eb18-c425-4648-b9be-39cfa1090d95,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1703eb18-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.425 182939 DEBUG os_vif [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:8b:61,bridge_name='br-int',has_traffic_filtering=True,id=1703eb18-c425-4648-b9be-39cfa1090d95,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1703eb18-c4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.425 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.426 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.426 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.428 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.428 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1703eb18-c4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.429 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1703eb18-c4, col_values=(('external_ids', {'iface-id': '1703eb18-c425-4648-b9be-39cfa1090d95', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:8b:61', 'vm-uuid': 'af4c2559-6129-4c4e-92f5-c991d58e4225'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.430 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:40 compute-0 NetworkManager[55139]: <info>  [1769041180.4310] manager: (tap1703eb18-c4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.432 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.437 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.439 182939 INFO os_vif [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:8b:61,bridge_name='br-int',has_traffic_filtering=True,id=1703eb18-c425-4648-b9be-39cfa1090d95,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1703eb18-c4')
Jan 22 00:19:40 compute-0 sshd-session[235193]: Accepted publickey for nova from 192.168.122.101 port 48054 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:19:40 compute-0 systemd-logind[784]: New session 59 of user nova.
Jan 22 00:19:40 compute-0 systemd[1]: Started Session 59 of User nova.
Jan 22 00:19:40 compute-0 sshd-session[235193]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:19:40 compute-0 sshd-session[235199]: Received disconnect from 192.168.122.101 port 48054:11: disconnected by user
Jan 22 00:19:40 compute-0 sshd-session[235199]: Disconnected from user nova 192.168.122.101 port 48054
Jan 22 00:19:40 compute-0 sshd-session[235193]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:19:40 compute-0 systemd[1]: session-59.scope: Deactivated successfully.
Jan 22 00:19:40 compute-0 systemd-logind[784]: Session 59 logged out. Waiting for processes to exit.
Jan 22 00:19:40 compute-0 systemd-logind[784]: Removed session 59.
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.658 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.658 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.658 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] No VIF found with MAC fa:16:3e:56:8b:61, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:19:40 compute-0 nova_compute[182935]: 2026-01-22 00:19:40.658 182939 INFO nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Using config drive
Jan 22 00:19:40 compute-0 sshd-session[235201]: Accepted publickey for nova from 192.168.122.101 port 48066 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:19:40 compute-0 systemd-logind[784]: New session 60 of user nova.
Jan 22 00:19:40 compute-0 systemd[1]: Started Session 60 of User nova.
Jan 22 00:19:40 compute-0 sshd-session[235201]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:19:40 compute-0 sshd-session[235204]: Received disconnect from 192.168.122.101 port 48066:11: disconnected by user
Jan 22 00:19:40 compute-0 sshd-session[235204]: Disconnected from user nova 192.168.122.101 port 48066
Jan 22 00:19:40 compute-0 sshd-session[235201]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:19:40 compute-0 systemd[1]: session-60.scope: Deactivated successfully.
Jan 22 00:19:40 compute-0 systemd-logind[784]: Session 60 logged out. Waiting for processes to exit.
Jan 22 00:19:40 compute-0 systemd-logind[784]: Removed session 60.
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.222 182939 INFO nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Creating config drive at /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk.config
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.228 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_uffx_bl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.355 182939 DEBUG oslo_concurrency.processutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_uffx_bl" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:41 compute-0 kernel: tap1703eb18-c4: entered promiscuous mode
Jan 22 00:19:41 compute-0 ovn_controller[95047]: 2026-01-22T00:19:41Z|00534|binding|INFO|Claiming lport 1703eb18-c425-4648-b9be-39cfa1090d95 for this chassis.
Jan 22 00:19:41 compute-0 ovn_controller[95047]: 2026-01-22T00:19:41Z|00535|binding|INFO|1703eb18-c425-4648-b9be-39cfa1090d95: Claiming fa:16:3e:56:8b:61 10.100.0.9
Jan 22 00:19:41 compute-0 NetworkManager[55139]: <info>  [1769041181.4226] manager: (tap1703eb18-c4): new Tun device (/org/freedesktop/NetworkManager/Devices/257)
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.423 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.442 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:8b:61 10.100.0.9'], port_security=['fa:16:3e:56:8b:61 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'af4c2559-6129-4c4e-92f5-c991d58e4225', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=1703eb18-c425-4648-b9be-39cfa1090d95) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.443 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 1703eb18-c425-4648-b9be-39cfa1090d95 in datapath aabf11c6-ef94-408a-8148-6c6400566606 bound to our chassis
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.444 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network aabf11c6-ef94-408a-8148-6c6400566606
Jan 22 00:19:41 compute-0 systemd-udevd[235220]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.456 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ca0bc7ff-ac42-4181-bafb-59abdac55f0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.457 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaabf11c6-e1 in ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.459 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaabf11c6-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.459 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[764c0a8c-1a90-4d74-840e-1bdff690e99a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.460 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0339d69f-03af-445a-b4e6-69d8d6610638]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 NetworkManager[55139]: <info>  [1769041181.4691] device (tap1703eb18-c4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:19:41 compute-0 NetworkManager[55139]: <info>  [1769041181.4697] device (tap1703eb18-c4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.472 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[7cbb2010-4e38-4850-a01a-2b9596d1f3d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 systemd-machined[154182]: New machine qemu-72-instance-00000093.
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.486 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.492 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:41 compute-0 systemd[1]: Started Virtual Machine qemu-72-instance-00000093.
Jan 22 00:19:41 compute-0 ovn_controller[95047]: 2026-01-22T00:19:41Z|00536|binding|INFO|Setting lport 1703eb18-c425-4648-b9be-39cfa1090d95 ovn-installed in OVS
Jan 22 00:19:41 compute-0 ovn_controller[95047]: 2026-01-22T00:19:41Z|00537|binding|INFO|Setting lport 1703eb18-c425-4648-b9be-39cfa1090d95 up in Southbound
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.496 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.497 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fcf11963-b09e-4a18-9f54-e60769a3726b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.529 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[ded7e7a7-424d-42a3-b380-a054df313815]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.535 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[784724ec-6f2a-4745-98f4-9a15668de30b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 NetworkManager[55139]: <info>  [1769041181.5367] manager: (tapaabf11c6-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/258)
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.566 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[5d282036-f4d8-434f-bfeb-2c2f4886c1a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.570 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8dd4c0-60ba-4179-8f81-8edb019a7fe6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 NetworkManager[55139]: <info>  [1769041181.5909] device (tapaabf11c6-e0): carrier: link connected
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.597 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[3573d4f2-2581-4e1d-a43b-8ca7801decb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.616 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[491870c1-1cbe-40da-aec2-9fff3ab718d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568432, 'reachable_time': 33770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235256, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.629 182939 DEBUG nova.compute.manager [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.629 182939 DEBUG oslo_concurrency.lockutils [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.630 182939 DEBUG oslo_concurrency.lockutils [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.630 182939 DEBUG oslo_concurrency.lockutils [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.630 182939 DEBUG nova.compute.manager [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.630 182939 WARNING nova.compute.manager [req-b0799055-85b6-4fc6-8dfd-928220673e6b req-00c82f7b-01c0-4fdc-9690-261365235555 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state active and task_state resize_migrated.
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.633 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a11c6ff1-a5a1-4b16-88db-35d9be09309c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:1b62'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568432, 'tstamp': 568432}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235257, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.654 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f18e32a6-1645-48ff-8fa4-e8448530048d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaabf11c6-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:1b:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568432, 'reachable_time': 33770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235258, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.687 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1f55313c-c064-4b73-a578-dc130aa7d92c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.749 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[689a52b1-d109-4a59-9927-652086f7d0ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.750 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.750 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.751 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaabf11c6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:41 compute-0 kernel: tapaabf11c6-e0: entered promiscuous mode
Jan 22 00:19:41 compute-0 NetworkManager[55139]: <info>  [1769041181.7537] manager: (tapaabf11c6-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.754 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.755 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaabf11c6-e0, col_values=(('external_ids', {'iface-id': '1ae0dbff-a7cd-4db8-afc3-1d102fdd130f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:41 compute-0 ovn_controller[95047]: 2026-01-22T00:19:41Z|00538|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.768 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.769 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.769 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[aa75ab34-5ea7-4838-8411-f047f5945cf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.769 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-aabf11c6-ef94-408a-8148-6c6400566606
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/aabf11c6-ef94-408a-8148-6c6400566606.pid.haproxy
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID aabf11c6-ef94-408a-8148-6c6400566606
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:19:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:41.770 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'env', 'PROCESS_TAG=haproxy-aabf11c6-ef94-408a-8148-6c6400566606', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/aabf11c6-ef94-408a-8148-6c6400566606.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:19:41 compute-0 nova_compute[182935]: 2026-01-22 00:19:41.782 182939 INFO nova.network.neutron [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating port 89ad850c-a87f-489f-8c3e-51dfc078a374 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 22 00:19:42 compute-0 nova_compute[182935]: 2026-01-22 00:19:42.115 182939 DEBUG nova.compute.manager [req-f19ebb3c-70e4-4625-be80-a99442a1a409 req-e0609eae-d917-48bf-bfad-5680875c0fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Received event network-vif-plugged-1703eb18-c425-4648-b9be-39cfa1090d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:42 compute-0 nova_compute[182935]: 2026-01-22 00:19:42.116 182939 DEBUG oslo_concurrency.lockutils [req-f19ebb3c-70e4-4625-be80-a99442a1a409 req-e0609eae-d917-48bf-bfad-5680875c0fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:42 compute-0 nova_compute[182935]: 2026-01-22 00:19:42.116 182939 DEBUG oslo_concurrency.lockutils [req-f19ebb3c-70e4-4625-be80-a99442a1a409 req-e0609eae-d917-48bf-bfad-5680875c0fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:42 compute-0 nova_compute[182935]: 2026-01-22 00:19:42.116 182939 DEBUG oslo_concurrency.lockutils [req-f19ebb3c-70e4-4625-be80-a99442a1a409 req-e0609eae-d917-48bf-bfad-5680875c0fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:42 compute-0 nova_compute[182935]: 2026-01-22 00:19:42.116 182939 DEBUG nova.compute.manager [req-f19ebb3c-70e4-4625-be80-a99442a1a409 req-e0609eae-d917-48bf-bfad-5680875c0fbd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Processing event network-vif-plugged-1703eb18-c425-4648-b9be-39cfa1090d95 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:19:42 compute-0 podman[235290]: 2026-01-22 00:19:42.166159364 +0000 UTC m=+0.092477424 container create b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 00:19:42 compute-0 podman[235290]: 2026-01-22 00:19:42.094489877 +0000 UTC m=+0.020807957 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:19:42 compute-0 systemd[1]: Started libpod-conmon-b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4.scope.
Jan 22 00:19:42 compute-0 nova_compute[182935]: 2026-01-22 00:19:42.273 182939 DEBUG nova.network.neutron [req-7b97a69d-c4db-42d0-8566-e9c1abd1dfcc req-365871cd-1442-4649-a04f-dd69893bac1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Updated VIF entry in instance network info cache for port 1703eb18-c425-4648-b9be-39cfa1090d95. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:19:42 compute-0 nova_compute[182935]: 2026-01-22 00:19:42.274 182939 DEBUG nova.network.neutron [req-7b97a69d-c4db-42d0-8566-e9c1abd1dfcc req-365871cd-1442-4649-a04f-dd69893bac1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Updating instance_info_cache with network_info: [{"id": "1703eb18-c425-4648-b9be-39cfa1090d95", "address": "fa:16:3e:56:8b:61", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1703eb18-c4", "ovs_interfaceid": "1703eb18-c425-4648-b9be-39cfa1090d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:42 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:19:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/431d9f4661f435f3dd6f46868c1d8d6f982bb23ae1347493f4575c2a9149d4b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:19:42 compute-0 podman[235290]: 2026-01-22 00:19:42.320736085 +0000 UTC m=+0.247054165 container init b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 22 00:19:42 compute-0 podman[235290]: 2026-01-22 00:19:42.326428791 +0000 UTC m=+0.252746851 container start b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:19:42 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[235306]: [NOTICE]   (235310) : New worker (235312) forked
Jan 22 00:19:42 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[235306]: [NOTICE]   (235310) : Loading success.
Jan 22 00:19:42 compute-0 nova_compute[182935]: 2026-01-22 00:19:42.436 182939 DEBUG oslo_concurrency.lockutils [req-7b97a69d-c4db-42d0-8566-e9c1abd1dfcc req-365871cd-1442-4649-a04f-dd69893bac1b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-af4c2559-6129-4c4e-92f5-c991d58e4225" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:42 compute-0 nova_compute[182935]: 2026-01-22 00:19:42.798 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:42.800 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:19:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:42.801 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:19:42 compute-0 nova_compute[182935]: 2026-01-22 00:19:42.841 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.156 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.156 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.157 182939 DEBUG nova.network.neutron [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.312 182939 DEBUG nova.compute.manager [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.312 182939 DEBUG nova.compute.manager [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing instance network info cache due to event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.313 182939 DEBUG oslo_concurrency.lockutils [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.491 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041183.4913936, af4c2559-6129-4c4e-92f5-c991d58e4225 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.492 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] VM Started (Lifecycle Event)
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.494 182939 DEBUG nova.compute.manager [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.497 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.501 182939 INFO nova.virt.libvirt.driver [-] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Instance spawned successfully.
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.501 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.523 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.529 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.534 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.535 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.535 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.535 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.536 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.536 182939 DEBUG nova.virt.libvirt.driver [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.570 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.570 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041183.4938366, af4c2559-6129-4c4e-92f5-c991d58e4225 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.570 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] VM Paused (Lifecycle Event)
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.598 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.601 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041183.4967806, af4c2559-6129-4c4e-92f5-c991d58e4225 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.602 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] VM Resumed (Lifecycle Event)
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.622 182939 INFO nova.compute.manager [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Took 11.09 seconds to spawn the instance on the hypervisor.
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.622 182939 DEBUG nova.compute.manager [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.623 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.629 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.669 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.728 182939 INFO nova.compute.manager [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Took 12.25 seconds to build instance.
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.747 182939 DEBUG nova.compute.manager [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.747 182939 DEBUG oslo_concurrency.lockutils [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.747 182939 DEBUG oslo_concurrency.lockutils [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.747 182939 DEBUG oslo_concurrency.lockutils [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.748 182939 DEBUG nova.compute.manager [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.748 182939 WARNING nova.compute.manager [req-112b9429-15af-466e-9eab-76315045490e req-26d24cc4-20f4-4d0a-90fa-5e149da8ac5b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state active and task_state resize_migrated.
Jan 22 00:19:43 compute-0 nova_compute[182935]: 2026-01-22 00:19:43.750 182939 DEBUG oslo_concurrency.lockutils [None req-c54c9899-80d8-4ac8-992c-794884c5a770 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:44 compute-0 nova_compute[182935]: 2026-01-22 00:19:44.519 182939 DEBUG nova.compute.manager [req-e2767e4e-f4c6-436f-859f-30d3f1cb4e28 req-7b60f301-ed71-42bc-b5d0-9db9560f3504 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Received event network-vif-plugged-1703eb18-c425-4648-b9be-39cfa1090d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:44 compute-0 nova_compute[182935]: 2026-01-22 00:19:44.519 182939 DEBUG oslo_concurrency.lockutils [req-e2767e4e-f4c6-436f-859f-30d3f1cb4e28 req-7b60f301-ed71-42bc-b5d0-9db9560f3504 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:44 compute-0 nova_compute[182935]: 2026-01-22 00:19:44.519 182939 DEBUG oslo_concurrency.lockutils [req-e2767e4e-f4c6-436f-859f-30d3f1cb4e28 req-7b60f301-ed71-42bc-b5d0-9db9560f3504 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:44 compute-0 nova_compute[182935]: 2026-01-22 00:19:44.520 182939 DEBUG oslo_concurrency.lockutils [req-e2767e4e-f4c6-436f-859f-30d3f1cb4e28 req-7b60f301-ed71-42bc-b5d0-9db9560f3504 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:44 compute-0 nova_compute[182935]: 2026-01-22 00:19:44.520 182939 DEBUG nova.compute.manager [req-e2767e4e-f4c6-436f-859f-30d3f1cb4e28 req-7b60f301-ed71-42bc-b5d0-9db9560f3504 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] No waiting events found dispatching network-vif-plugged-1703eb18-c425-4648-b9be-39cfa1090d95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:44 compute-0 nova_compute[182935]: 2026-01-22 00:19:44.520 182939 WARNING nova.compute.manager [req-e2767e4e-f4c6-436f-859f-30d3f1cb4e28 req-7b60f301-ed71-42bc-b5d0-9db9560f3504 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Received unexpected event network-vif-plugged-1703eb18-c425-4648-b9be-39cfa1090d95 for instance with vm_state active and task_state None.
Jan 22 00:19:44 compute-0 podman[235329]: 2026-01-22 00:19:44.689484277 +0000 UTC m=+0.055975234 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:19:44 compute-0 podman[235328]: 2026-01-22 00:19:44.729390117 +0000 UTC m=+0.098490177 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.431 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.520 182939 DEBUG nova.network.neutron [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.540 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.543 182939 DEBUG oslo_concurrency.lockutils [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.544 182939 DEBUG nova.network.neutron [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.685 182939 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.688 182939 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.689 182939 INFO nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Creating image(s)
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.691 182939 DEBUG nova.objects.instance [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'trusted_certs' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.718 182939 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.783 182939 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.784 182939 DEBUG nova.virt.disk.api [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Checking if we can resize image /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.785 182939 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.805 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.806 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.806 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.839 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.850 182939 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.851 182939 DEBUG nova.virt.disk.api [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Cannot resize image /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.907 182939 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.908 182939 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Ensure instance console log exists: /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.908 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.908 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.909 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.911 182939 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Start _get_guest_xml network_info=[{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1118354446", "vif_mac": "fa:16:3e:9d:17:06"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.916 182939 WARNING nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.920 182939 DEBUG nova.virt.libvirt.host [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.921 182939 DEBUG nova.virt.libvirt.host [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.923 182939 DEBUG nova.virt.libvirt.host [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.924 182939 DEBUG nova.virt.libvirt.host [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.925 182939 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.925 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.926 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.926 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.926 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.926 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.926 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.927 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.927 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.927 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.927 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.928 182939 DEBUG nova.virt.hardware [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.928 182939 DEBUG nova.objects.instance [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'vcpu_model' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:45 compute-0 nova_compute[182935]: 2026-01-22 00:19:45.951 182939 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.021 182939 DEBUG oslo_concurrency.processutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.022 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.023 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.024 182939 DEBUG oslo_concurrency.lockutils [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.026 182939 DEBUG nova.virt.libvirt.vif [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:18:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2079598704',display_name='tempest-TestNetworkAdvancedServerOps-server-2079598704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2079598704',id=145,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIY3eqmdW0m2q20hwTxy7fCq5RPOCY+KqJLqriFpcPzIAlQnzQNfW6TIp9Y1voEv/PtpLpDAT0kqBnGToo/qNh+oTys/PEZ/7XtlTWunC6nPRFTGOxMn536DUj7Tail8LA==',key_name='tempest-TestNetworkAdvancedServerOps-1230336080',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:19:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-zrj70p8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:19:41Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=d59e0943-5372-4680-af52-c9af874c8578,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1118354446", "vif_mac": "fa:16:3e:9d:17:06"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.027 182939 DEBUG nova.network.os_vif_util [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converting VIF {"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1118354446", "vif_mac": "fa:16:3e:9d:17:06"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.028 182939 DEBUG nova.network.os_vif_util [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.032 182939 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:19:46 compute-0 nova_compute[182935]:   <uuid>d59e0943-5372-4680-af52-c9af874c8578</uuid>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   <name>instance-00000091</name>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-2079598704</nova:name>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:19:45</nova:creationTime>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:19:46 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:19:46 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:19:46 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:19:46 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:19:46 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:19:46 compute-0 nova_compute[182935]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:19:46 compute-0 nova_compute[182935]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:19:46 compute-0 nova_compute[182935]:         <nova:port uuid="89ad850c-a87f-489f-8c3e-51dfc078a374">
Jan 22 00:19:46 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <system>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <entry name="serial">d59e0943-5372-4680-af52-c9af874c8578</entry>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <entry name="uuid">d59e0943-5372-4680-af52-c9af874c8578</entry>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     </system>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   <os>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   </os>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   <features>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   </features>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk.config"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:9d:17:06"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <target dev="tap89ad850c-a8"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/console.log" append="off"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <video>
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     </video>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:19:46 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:19:46 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:19:46 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:19:46 compute-0 nova_compute[182935]: </domain>
Jan 22 00:19:46 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.034 182939 DEBUG nova.virt.libvirt.vif [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:18:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2079598704',display_name='tempest-TestNetworkAdvancedServerOps-server-2079598704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2079598704',id=145,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIY3eqmdW0m2q20hwTxy7fCq5RPOCY+KqJLqriFpcPzIAlQnzQNfW6TIp9Y1voEv/PtpLpDAT0kqBnGToo/qNh+oTys/PEZ/7XtlTWunC6nPRFTGOxMn536DUj7Tail8LA==',key_name='tempest-TestNetworkAdvancedServerOps-1230336080',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:19:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-zrj70p8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:19:41Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=d59e0943-5372-4680-af52-c9af874c8578,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1118354446", "vif_mac": "fa:16:3e:9d:17:06"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.035 182939 DEBUG nova.network.os_vif_util [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converting VIF {"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1118354446", "vif_mac": "fa:16:3e:9d:17:06"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.035 182939 DEBUG nova.network.os_vif_util [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.036 182939 DEBUG os_vif [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.036 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.037 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.037 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.040 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.040 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89ad850c-a8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.041 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap89ad850c-a8, col_values=(('external_ids', {'iface-id': '89ad850c-a87f-489f-8c3e-51dfc078a374', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9d:17:06', 'vm-uuid': 'd59e0943-5372-4680-af52-c9af874c8578'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.043 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:46 compute-0 NetworkManager[55139]: <info>  [1769041186.0439] manager: (tap89ad850c-a8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.045 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.050 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.050 182939 INFO os_vif [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8')
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.798 182939 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.799 182939 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.799 182939 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] No VIF found with MAC fa:16:3e:9d:17:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.799 182939 INFO nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Using config drive
Jan 22 00:19:46 compute-0 kernel: tap89ad850c-a8: entered promiscuous mode
Jan 22 00:19:46 compute-0 NetworkManager[55139]: <info>  [1769041186.8783] manager: (tap89ad850c-a8): new Tun device (/org/freedesktop/NetworkManager/Devices/261)
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.883 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:46 compute-0 ovn_controller[95047]: 2026-01-22T00:19:46Z|00539|binding|INFO|Claiming lport 89ad850c-a87f-489f-8c3e-51dfc078a374 for this chassis.
Jan 22 00:19:46 compute-0 ovn_controller[95047]: 2026-01-22T00:19:46Z|00540|binding|INFO|89ad850c-a87f-489f-8c3e-51dfc078a374: Claiming fa:16:3e:9d:17:06 10.100.0.5
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.887 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:46 compute-0 NetworkManager[55139]: <info>  [1769041186.8942] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Jan 22 00:19:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:46.896 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:17:06 10.100.0.5'], port_security=['fa:16:3e:9d:17:06 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd59e0943-5372-4680-af52-c9af874c8578', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1bc50146-1f14-43fa-a2db-2904419fa654', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.194'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f19555ab-2ed1-467b-9e13-e9518e9577aa, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=89ad850c-a87f-489f-8c3e-51dfc078a374) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:19:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:46.898 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 89ad850c-a87f-489f-8c3e-51dfc078a374 in datapath 7347045a-f38e-4f56-a03a-a68e0fbe1e8d bound to our chassis
Jan 22 00:19:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:46.901 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7347045a-f38e-4f56-a03a-a68e0fbe1e8d
Jan 22 00:19:46 compute-0 nova_compute[182935]: 2026-01-22 00:19:46.892 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:46 compute-0 NetworkManager[55139]: <info>  [1769041186.9026] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Jan 22 00:19:46 compute-0 systemd-udevd[235407]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:19:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:46.918 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6b137a18-a53c-4d1e-b179-0a64237cd3c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:46.919 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7347045a-f1 in ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:19:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:46.920 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7347045a-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:19:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:46.920 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d40dcc3b-3526-4aa8-bef5-1d8937122631]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:46.921 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b8891d28-b29e-4e88-8263-c1311e7370ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:46 compute-0 systemd-machined[154182]: New machine qemu-73-instance-00000091.
Jan 22 00:19:46 compute-0 NetworkManager[55139]: <info>  [1769041186.9285] device (tap89ad850c-a8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:19:46 compute-0 NetworkManager[55139]: <info>  [1769041186.9290] device (tap89ad850c-a8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:19:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:46.941 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f053de-876b-4442-9a52-0afcd9da2fbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:46 compute-0 systemd[1]: Started Virtual Machine qemu-73-instance-00000091.
Jan 22 00:19:46 compute-0 podman[235392]: 2026-01-22 00:19:46.958451741 +0000 UTC m=+0.086035159 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:19:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:46.973 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0224ddf4-539b-41b5-8688-9adee1d7589e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.002 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[e51283f4-5c3e-40a6-9279-d6af9edc3643]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 systemd-udevd[235416]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:19:47 compute-0 NetworkManager[55139]: <info>  [1769041187.0163] manager: (tap7347045a-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/264)
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.016 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3d02b549-377a-4d37-accb-43b36a032a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.019 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:47 compute-0 ovn_controller[95047]: 2026-01-22T00:19:47Z|00541|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.040 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:47 compute-0 ovn_controller[95047]: 2026-01-22T00:19:47Z|00542|binding|INFO|Setting lport 89ad850c-a87f-489f-8c3e-51dfc078a374 ovn-installed in OVS
Jan 22 00:19:47 compute-0 ovn_controller[95047]: 2026-01-22T00:19:47Z|00543|binding|INFO|Setting lport 89ad850c-a87f-489f-8c3e-51dfc078a374 up in Southbound
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.050 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.069 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f9df7ce2-0821-4d6b-a25f-bf5e379fcc0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.074 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b5409cfc-f344-4a6a-a4e4-8bf490b885fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 NetworkManager[55139]: <info>  [1769041187.0940] device (tap7347045a-f0): carrier: link connected
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.101 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a7bf3f-0d6f-43d6-9fa8-c5b0dc412359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.117 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ccb453be-702b-4da3-862b-d80fc31e757a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7347045a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:da:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568983, 'reachable_time': 41442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235454, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.133 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[00118302-b543-4dd3-b7a7-e82821125275]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:da8d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 568983, 'tstamp': 568983}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235455, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.150 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[61de61fd-6d78-4aba-8c4d-ee9eef494c04]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7347045a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:da:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 174], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568983, 'reachable_time': 41442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235456, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.186 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2147cef7-fccf-4959-8a81-3eb39e407263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.243 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c993b017-0362-4da8-afe6-3e726530d17e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.244 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7347045a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.245 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.245 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7347045a-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:47 compute-0 kernel: tap7347045a-f0: entered promiscuous mode
Jan 22 00:19:47 compute-0 NetworkManager[55139]: <info>  [1769041187.2477] manager: (tap7347045a-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.249 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.249 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7347045a-f0, col_values=(('external_ids', {'iface-id': 'ce8d757a-1822-40f4-bb02-e88d8e0a4e11'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:47 compute-0 ovn_controller[95047]: 2026-01-22T00:19:47Z|00544|binding|INFO|Releasing lport ce8d757a-1822-40f4-bb02-e88d8e0a4e11 from this chassis (sb_readonly=0)
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.261 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.262 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.263 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ffdce601-49e6-44cb-96af-109e045eb186]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.264 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-7347045a-f38e-4f56-a03a-a68e0fbe1e8d
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.pid.haproxy
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 7347045a-f38e-4f56-a03a-a68e0fbe1e8d
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:19:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:47.264 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'env', 'PROCESS_TAG=haproxy-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7347045a-f38e-4f56-a03a-a68e0fbe1e8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.354 182939 DEBUG nova.compute.manager [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.355 182939 DEBUG oslo_concurrency.lockutils [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.356 182939 DEBUG oslo_concurrency.lockutils [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.356 182939 DEBUG oslo_concurrency.lockutils [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.357 182939 DEBUG nova.compute.manager [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.358 182939 WARNING nova.compute.manager [req-854d04d5-609c-442b-8a6d-6a0faaba4441 req-dbb8f614-f3a8-4e49-abae-d99f68367670 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state active and task_state resize_finish.
Jan 22 00:19:47 compute-0 podman[235488]: 2026-01-22 00:19:47.599141339 +0000 UTC m=+0.021111743 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.794 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041187.7939265, d59e0943-5372-4680-af52-c9af874c8578 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.795 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] VM Resumed (Lifecycle Event)
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.797 182939 DEBUG nova.compute.manager [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.800 182939 INFO nova.virt.libvirt.driver [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance running successfully.
Jan 22 00:19:47 compute-0 virtqemud[182477]: argument unsupported: QEMU guest agent is not configured
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.804 182939 DEBUG nova.virt.libvirt.guest [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.805 182939 DEBUG nova.virt.libvirt.driver [None req-c40b7605-b7ba-4db2-a319-a3b7ffe69f2c 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 22 00:19:47 compute-0 ovn_controller[95047]: 2026-01-22T00:19:47Z|00545|binding|INFO|Releasing lport ce8d757a-1822-40f4-bb02-e88d8e0a4e11 from this chassis (sb_readonly=0)
Jan 22 00:19:47 compute-0 ovn_controller[95047]: 2026-01-22T00:19:47Z|00546|binding|INFO|Releasing lport 1ae0dbff-a7cd-4db8-afc3-1d102fdd130f from this chassis (sb_readonly=0)
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.832 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.835 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.849 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.885 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.885 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041187.795398, d59e0943-5372-4680-af52-c9af874c8578 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.886 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] VM Started (Lifecycle Event)
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.929 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.934 182939 DEBUG nova.network.neutron [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updated VIF entry in instance network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.935 182939 DEBUG nova.network.neutron [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:47 compute-0 nova_compute[182935]: 2026-01-22 00:19:47.937 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:19:48 compute-0 nova_compute[182935]: 2026-01-22 00:19:48.049 182939 DEBUG oslo_concurrency.lockutils [req-0cb395dc-ae4b-42c4-9284-05103b89d69b req-6509a67f-79e9-458e-ae50-38cae1eab53b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:48 compute-0 nova_compute[182935]: 2026-01-22 00:19:48.050 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:48 compute-0 nova_compute[182935]: 2026-01-22 00:19:48.053 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:19:48 compute-0 nova_compute[182935]: 2026-01-22 00:19:48.053 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:48 compute-0 nova_compute[182935]: 2026-01-22 00:19:48.114 182939 DEBUG oslo_concurrency.lockutils [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "af4c2559-6129-4c4e-92f5-c991d58e4225" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:48 compute-0 nova_compute[182935]: 2026-01-22 00:19:48.115 182939 DEBUG oslo_concurrency.lockutils [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:48 compute-0 nova_compute[182935]: 2026-01-22 00:19:48.115 182939 DEBUG nova.compute.manager [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:19:48 compute-0 nova_compute[182935]: 2026-01-22 00:19:48.121 182939 DEBUG nova.compute.manager [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 22 00:19:48 compute-0 nova_compute[182935]: 2026-01-22 00:19:48.122 182939 DEBUG nova.objects.instance [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'flavor' on Instance uuid af4c2559-6129-4c4e-92f5-c991d58e4225 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:48 compute-0 nova_compute[182935]: 2026-01-22 00:19:48.156 182939 DEBUG nova.objects.instance [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'info_cache' on Instance uuid af4c2559-6129-4c4e-92f5-c991d58e4225 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:48 compute-0 nova_compute[182935]: 2026-01-22 00:19:48.186 182939 DEBUG nova.virt.libvirt.driver [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:19:48 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:48.804 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:48 compute-0 podman[235488]: 2026-01-22 00:19:48.853468791 +0000 UTC m=+1.275439185 container create 25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:19:48 compute-0 systemd[1]: Started libpod-conmon-25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e.scope.
Jan 22 00:19:48 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:19:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc254547ba57aac8d0f918b2026048f50172115849098be9bec55d696085d478/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:19:48 compute-0 podman[235488]: 2026-01-22 00:19:48.984668415 +0000 UTC m=+1.406638799 container init 25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:19:48 compute-0 podman[235488]: 2026-01-22 00:19:48.991571569 +0000 UTC m=+1.413541953 container start 25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:19:49 compute-0 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[235512]: [NOTICE]   (235516) : New worker (235518) forked
Jan 22 00:19:49 compute-0 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[235512]: [NOTICE]   (235516) : Loading success.
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.465 182939 DEBUG nova.compute.manager [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.466 182939 DEBUG oslo_concurrency.lockutils [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.467 182939 DEBUG oslo_concurrency.lockutils [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.467 182939 DEBUG oslo_concurrency.lockutils [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.467 182939 DEBUG nova.compute.manager [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.468 182939 WARNING nova.compute.manager [req-848200e0-7403-438e-a753-c614a2e7c466 req-12aa1f5f-fe45-4ecc-9ef9-69408d9f9859 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state resized and task_state None.
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.600 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.624 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.624 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.625 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.625 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.626 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.626 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.651 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.652 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.653 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.653 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.774 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.837 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.839 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.897 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.905 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.981 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:49 compute-0 nova_compute[182935]: 2026-01-22 00:19:49.982 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.052 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.230 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.232 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5389MB free_disk=73.09789276123047GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.232 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.233 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.299 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Applying migration context for instance d59e0943-5372-4680-af52-c9af874c8578 as it has an incoming, in-progress migration 70071767-0c89-48d9-81cd-be888f6d084e. Migration status is reverting _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.300 182939 INFO nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating resource usage from migration 70071767-0c89-48d9-81cd-be888f6d084e
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.322 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance af4c2559-6129-4c4e-92f5-c991d58e4225 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.323 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance d59e0943-5372-4680-af52-c9af874c8578 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.323 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.324 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.346 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.434 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.435 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.452 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.455 182939 DEBUG nova.network.neutron [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Port 89ad850c-a87f-489f-8c3e-51dfc078a374 binding to destination host compute-0.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.455 182939 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.456 182939 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.456 182939 DEBUG nova.network.neutron [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.482 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.565 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.587 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.611 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.612 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.379s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:50 compute-0 nova_compute[182935]: 2026-01-22 00:19:50.780 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:50 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 00:19:50 compute-0 systemd[235164]: Activating special unit Exit the Session...
Jan 22 00:19:50 compute-0 systemd[235164]: Stopped target Main User Target.
Jan 22 00:19:50 compute-0 systemd[235164]: Stopped target Basic System.
Jan 22 00:19:50 compute-0 systemd[235164]: Stopped target Paths.
Jan 22 00:19:50 compute-0 systemd[235164]: Stopped target Sockets.
Jan 22 00:19:50 compute-0 systemd[235164]: Stopped target Timers.
Jan 22 00:19:50 compute-0 systemd[235164]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:19:50 compute-0 systemd[235164]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 00:19:50 compute-0 systemd[235164]: Closed D-Bus User Message Bus Socket.
Jan 22 00:19:50 compute-0 systemd[235164]: Stopped Create User's Volatile Files and Directories.
Jan 22 00:19:50 compute-0 systemd[235164]: Removed slice User Application Slice.
Jan 22 00:19:50 compute-0 systemd[235164]: Reached target Shutdown.
Jan 22 00:19:50 compute-0 systemd[235164]: Finished Exit the Session.
Jan 22 00:19:50 compute-0 systemd[235164]: Reached target Exit the Session.
Jan 22 00:19:50 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 00:19:50 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 00:19:50 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 00:19:50 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 00:19:50 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 00:19:50 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 00:19:50 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 00:19:51 compute-0 nova_compute[182935]: 2026-01-22 00:19:51.043 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:51 compute-0 nova_compute[182935]: 2026-01-22 00:19:51.740 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.481 182939 DEBUG nova.network.neutron [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.498 182939 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.514 182939 DEBUG nova.virt.libvirt.driver [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Creating tmpfile /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578/tmp6q6k5pp3 to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Jan 22 00:19:52 compute-0 kernel: tap89ad850c-a8 (unregistering): left promiscuous mode
Jan 22 00:19:52 compute-0 NetworkManager[55139]: <info>  [1769041192.5510] device (tap89ad850c-a8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:19:52 compute-0 ovn_controller[95047]: 2026-01-22T00:19:52Z|00547|binding|INFO|Releasing lport 89ad850c-a87f-489f-8c3e-51dfc078a374 from this chassis (sb_readonly=0)
Jan 22 00:19:52 compute-0 ovn_controller[95047]: 2026-01-22T00:19:52Z|00548|binding|INFO|Setting lport 89ad850c-a87f-489f-8c3e-51dfc078a374 down in Southbound
Jan 22 00:19:52 compute-0 ovn_controller[95047]: 2026-01-22T00:19:52Z|00549|binding|INFO|Removing iface tap89ad850c-a8 ovn-installed in OVS
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.563 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:52.570 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9d:17:06 10.100.0.5'], port_security=['fa:16:3e:9d:17:06 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'd59e0943-5372-4680-af52-c9af874c8578', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1bc50146-1f14-43fa-a2db-2904419fa654', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.194', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f19555ab-2ed1-467b-9e13-e9518e9577aa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=89ad850c-a87f-489f-8c3e-51dfc078a374) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:19:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:52.572 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 89ad850c-a87f-489f-8c3e-51dfc078a374 in datapath 7347045a-f38e-4f56-a03a-a68e0fbe1e8d unbound from our chassis
Jan 22 00:19:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:52.574 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7347045a-f38e-4f56-a03a-a68e0fbe1e8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:19:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:52.575 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[06183a64-74bc-4e93-b2b8-50db3b70b839]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:52.576 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d namespace which is not needed anymore
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.588 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:52 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000091.scope: Deactivated successfully.
Jan 22 00:19:52 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000091.scope: Consumed 5.595s CPU time.
Jan 22 00:19:52 compute-0 systemd-machined[154182]: Machine qemu-73-instance-00000091 terminated.
Jan 22 00:19:52 compute-0 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[235512]: [NOTICE]   (235516) : haproxy version is 2.8.14-c23fe91
Jan 22 00:19:52 compute-0 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[235512]: [NOTICE]   (235516) : path to executable is /usr/sbin/haproxy
Jan 22 00:19:52 compute-0 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[235512]: [WARNING]  (235516) : Exiting Master process...
Jan 22 00:19:52 compute-0 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[235512]: [WARNING]  (235516) : Exiting Master process...
Jan 22 00:19:52 compute-0 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[235512]: [ALERT]    (235516) : Current worker (235518) exited with code 143 (Terminated)
Jan 22 00:19:52 compute-0 neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d[235512]: [WARNING]  (235516) : All workers exited. Exiting... (0)
Jan 22 00:19:52 compute-0 systemd[1]: libpod-25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e.scope: Deactivated successfully.
Jan 22 00:19:52 compute-0 podman[235564]: 2026-01-22 00:19:52.751968632 +0000 UTC m=+0.078207283 container died 25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.796 182939 INFO nova.virt.libvirt.driver [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] Instance destroyed successfully.
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.797 182939 DEBUG nova.objects.instance [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.815 182939 DEBUG nova.virt.libvirt.vif [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:18:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-2079598704',display_name='tempest-TestNetworkAdvancedServerOps-server-2079598704',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-2079598704',id=145,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIY3eqmdW0m2q20hwTxy7fCq5RPOCY+KqJLqriFpcPzIAlQnzQNfW6TIp9Y1voEv/PtpLpDAT0kqBnGToo/qNh+oTys/PEZ/7XtlTWunC6nPRFTGOxMn536DUj7Tail8LA==',key_name='tempest-TestNetworkAdvancedServerOps-1230336080',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:19:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-zrj70p8y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:19:47Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=d59e0943-5372-4680-af52-c9af874c8578,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.816 182939 DEBUG nova.network.os_vif_util [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.817 182939 DEBUG nova.network.os_vif_util [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.817 182939 DEBUG os_vif [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.820 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.821 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89ad850c-a8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.823 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.826 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.829 182939 INFO os_vif [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9d:17:06,bridge_name='br-int',has_traffic_filtering=True,id=89ad850c-a87f-489f-8c3e-51dfc078a374,network=Network(7347045a-f38e-4f56-a03a-a68e0fbe1e8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap89ad850c-a8')
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.829 182939 INFO nova.virt.libvirt.driver [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Deleting instance files /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_del
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.837 182939 INFO nova.virt.libvirt.driver [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Deletion of /var/lib/nova/instances/d59e0943-5372-4680-af52-c9af874c8578_del complete
Jan 22 00:19:52 compute-0 nova_compute[182935]: 2026-01-22 00:19:52.852 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e-userdata-shm.mount: Deactivated successfully.
Jan 22 00:19:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-bc254547ba57aac8d0f918b2026048f50172115849098be9bec55d696085d478-merged.mount: Deactivated successfully.
Jan 22 00:19:52 compute-0 podman[235564]: 2026-01-22 00:19:52.983706211 +0000 UTC m=+0.309944862 container cleanup 25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:19:53 compute-0 podman[235584]: 2026-01-22 00:19:53.007588419 +0000 UTC m=+0.235175650 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:19:53 compute-0 systemd[1]: libpod-conmon-25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e.scope: Deactivated successfully.
Jan 22 00:19:53 compute-0 podman[235624]: 2026-01-22 00:19:53.052390186 +0000 UTC m=+0.048194548 container remove 25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:19:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:53.058 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4abf1e79-9766-4b80-972e-72bf313daf3c]: (4, ('Thu Jan 22 12:19:52 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d (25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e)\n25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e\nThu Jan 22 12:19:52 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d (25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e)\n25337e7e618e2c355ed3feec0ca23599be9406278b31369137222ea5e766ad1e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:53.060 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[35cfe44a-7b12-4caa-920e-86cef30c880a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:53.061 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7347045a-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.063 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:53 compute-0 kernel: tap7347045a-f0: left promiscuous mode
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.077 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:53.082 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a509d9b3-fa0d-4a37-a90a-51a3ed93d662]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:53.104 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[436526aa-9b4c-4eda-8602-71e17394b1ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:53.106 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c524b4ad-051c-4f40-a236-e6f245c42470]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:53.122 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[abfed440-097e-46ed-952c-f67c3507302e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568973, 'reachable_time': 16002, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235640, 'error': None, 'target': 'ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d7347045a\x2df38e\x2d4f56\x2da03a\x2da68e0fbe1e8d.mount: Deactivated successfully.
Jan 22 00:19:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:53.128 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7347045a-f38e-4f56-a03a-a68e0fbe1e8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:19:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:19:53.128 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[8c92397c-f91f-4769-b04d-5ac5f85d2d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.162 182939 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.164 182939 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.193 182939 DEBUG nova.objects.instance [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid d59e0943-5372-4680-af52-c9af874c8578 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.303 182939 DEBUG nova.compute.provider_tree [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.325 182939 DEBUG nova.scheduler.client.report [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.388 182939 DEBUG oslo_concurrency.lockutils [None req-9407b51e-2df7-478b-9165-75c7adecac6d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.670 182939 DEBUG nova.compute.manager [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.671 182939 DEBUG oslo_concurrency.lockutils [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.671 182939 DEBUG oslo_concurrency.lockutils [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.672 182939 DEBUG oslo_concurrency.lockutils [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.672 182939 DEBUG nova.compute.manager [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:53 compute-0 nova_compute[182935]: 2026-01-22 00:19:53.672 182939 WARNING nova.compute.manager [req-14ad724d-d432-4538-9f79-eae07079bc11 req-12b00f9e-14aa-4078-9f98-7bd9275c0722 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-unplugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:19:54 compute-0 nova_compute[182935]: 2026-01-22 00:19:54.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:54 compute-0 nova_compute[182935]: 2026-01-22 00:19:54.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:19:55 compute-0 nova_compute[182935]: 2026-01-22 00:19:55.975 182939 DEBUG nova.compute.manager [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:55 compute-0 nova_compute[182935]: 2026-01-22 00:19:55.976 182939 DEBUG oslo_concurrency.lockutils [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:19:55 compute-0 nova_compute[182935]: 2026-01-22 00:19:55.977 182939 DEBUG oslo_concurrency.lockutils [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:19:55 compute-0 nova_compute[182935]: 2026-01-22 00:19:55.977 182939 DEBUG oslo_concurrency.lockutils [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:19:55 compute-0 nova_compute[182935]: 2026-01-22 00:19:55.978 182939 DEBUG nova.compute.manager [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:19:55 compute-0 nova_compute[182935]: 2026-01-22 00:19:55.978 182939 WARNING nova.compute.manager [req-a2b1674a-4f5f-43cc-a4d7-51e01e542473 req-05949937-cd76-4b36-a3ae-e362138e41cb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:19:57 compute-0 ovn_controller[95047]: 2026-01-22T00:19:57Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:56:8b:61 10.100.0.9
Jan 22 00:19:57 compute-0 ovn_controller[95047]: 2026-01-22T00:19:57Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:56:8b:61 10.100.0.9
Jan 22 00:19:57 compute-0 nova_compute[182935]: 2026-01-22 00:19:57.823 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:57 compute-0 nova_compute[182935]: 2026-01-22 00:19:57.852 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:19:58 compute-0 nova_compute[182935]: 2026-01-22 00:19:58.096 182939 DEBUG nova.compute.manager [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:19:58 compute-0 nova_compute[182935]: 2026-01-22 00:19:58.096 182939 DEBUG nova.compute.manager [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing instance network info cache due to event network-changed-89ad850c-a87f-489f-8c3e-51dfc078a374. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:19:58 compute-0 nova_compute[182935]: 2026-01-22 00:19:58.097 182939 DEBUG oslo_concurrency.lockutils [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:19:58 compute-0 nova_compute[182935]: 2026-01-22 00:19:58.097 182939 DEBUG oslo_concurrency.lockutils [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:19:58 compute-0 nova_compute[182935]: 2026-01-22 00:19:58.097 182939 DEBUG nova.network.neutron [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Refreshing network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:19:58 compute-0 nova_compute[182935]: 2026-01-22 00:19:58.237 182939 DEBUG nova.virt.libvirt.driver [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 22 00:19:59 compute-0 nova_compute[182935]: 2026-01-22 00:19:59.661 182939 DEBUG nova.network.neutron [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updated VIF entry in instance network info cache for port 89ad850c-a87f-489f-8c3e-51dfc078a374. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:19:59 compute-0 nova_compute[182935]: 2026-01-22 00:19:59.661 182939 DEBUG nova.network.neutron [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Updating instance_info_cache with network_info: [{"id": "89ad850c-a87f-489f-8c3e-51dfc078a374", "address": "fa:16:3e:9d:17:06", "network": {"id": "7347045a-f38e-4f56-a03a-a68e0fbe1e8d", "bridge": "br-int", "label": "tempest-network-smoke--1118354446", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap89ad850c-a8", "ovs_interfaceid": "89ad850c-a87f-489f-8c3e-51dfc078a374", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:19:59 compute-0 podman[235662]: 2026-01-22 00:19:59.681591159 +0000 UTC m=+0.057322886 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:19:59 compute-0 podman[235661]: 2026-01-22 00:19:59.682732197 +0000 UTC m=+0.060483472 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:19:59 compute-0 nova_compute[182935]: 2026-01-22 00:19:59.684 182939 DEBUG oslo_concurrency.lockutils [req-fef03a7b-2703-4d14-8389-6e8bae4e6f02 req-2f3075f5-572f-4da6-bc20-6c2954e7e6a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-d59e0943-5372-4680-af52-c9af874c8578" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:19:59 compute-0 nova_compute[182935]: 2026-01-22 00:19:59.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.263 182939 DEBUG nova.compute.manager [req-88ae0c87-6bab-40b8-943c-13b89efc546b req-e22361d3-1355-4a46-af49-ebc751ae30f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.264 182939 DEBUG oslo_concurrency.lockutils [req-88ae0c87-6bab-40b8-943c-13b89efc546b req-e22361d3-1355-4a46-af49-ebc751ae30f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "d59e0943-5372-4680-af52-c9af874c8578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.264 182939 DEBUG oslo_concurrency.lockutils [req-88ae0c87-6bab-40b8-943c-13b89efc546b req-e22361d3-1355-4a46-af49-ebc751ae30f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.264 182939 DEBUG oslo_concurrency.lockutils [req-88ae0c87-6bab-40b8-943c-13b89efc546b req-e22361d3-1355-4a46-af49-ebc751ae30f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "d59e0943-5372-4680-af52-c9af874c8578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.264 182939 DEBUG nova.compute.manager [req-88ae0c87-6bab-40b8-943c-13b89efc546b req-e22361d3-1355-4a46-af49-ebc751ae30f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] No waiting events found dispatching network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.265 182939 WARNING nova.compute.manager [req-88ae0c87-6bab-40b8-943c-13b89efc546b req-e22361d3-1355-4a46-af49-ebc751ae30f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: d59e0943-5372-4680-af52-c9af874c8578] Received unexpected event network-vif-plugged-89ad850c-a87f-489f-8c3e-51dfc078a374 for instance with vm_state resized and task_state resize_reverting.
Jan 22 00:20:00 compute-0 kernel: tap1703eb18-c4 (unregistering): left promiscuous mode
Jan 22 00:20:00 compute-0 NetworkManager[55139]: <info>  [1769041200.4371] device (tap1703eb18-c4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:20:00 compute-0 ovn_controller[95047]: 2026-01-22T00:20:00Z|00550|binding|INFO|Releasing lport 1703eb18-c425-4648-b9be-39cfa1090d95 from this chassis (sb_readonly=0)
Jan 22 00:20:00 compute-0 ovn_controller[95047]: 2026-01-22T00:20:00Z|00551|binding|INFO|Setting lport 1703eb18-c425-4648-b9be-39cfa1090d95 down in Southbound
Jan 22 00:20:00 compute-0 ovn_controller[95047]: 2026-01-22T00:20:00Z|00552|binding|INFO|Removing iface tap1703eb18-c4 ovn-installed in OVS
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.451 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.452 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:00.464 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:8b:61 10.100.0.9'], port_security=['fa:16:3e:56:8b:61 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'af4c2559-6129-4c4e-92f5-c991d58e4225', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aabf11c6-ef94-408a-8148-6c6400566606', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e408650207b498c8d115fd0c4f776dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd88f438e-f9bb-4593-93a6-6ce5aa939167', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfd57084-623a-46cf-a9c5-71a440a640c6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=1703eb18-c425-4648-b9be-39cfa1090d95) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:20:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:00.466 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 1703eb18-c425-4648-b9be-39cfa1090d95 in datapath aabf11c6-ef94-408a-8148-6c6400566606 unbound from our chassis
Jan 22 00:20:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:00.468 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aabf11c6-ef94-408a-8148-6c6400566606, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:20:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:00.469 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4c0b17a8-6c17-4672-8ef0-1f0a12ba0ed2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:00.470 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 namespace which is not needed anymore
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.471 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:00 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000093.scope: Deactivated successfully.
Jan 22 00:20:00 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000093.scope: Consumed 14.994s CPU time.
Jan 22 00:20:00 compute-0 systemd-machined[154182]: Machine qemu-72-instance-00000093 terminated.
Jan 22 00:20:00 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[235306]: [NOTICE]   (235310) : haproxy version is 2.8.14-c23fe91
Jan 22 00:20:00 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[235306]: [NOTICE]   (235310) : path to executable is /usr/sbin/haproxy
Jan 22 00:20:00 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[235306]: [WARNING]  (235310) : Exiting Master process...
Jan 22 00:20:00 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[235306]: [ALERT]    (235310) : Current worker (235312) exited with code 143 (Terminated)
Jan 22 00:20:00 compute-0 neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606[235306]: [WARNING]  (235310) : All workers exited. Exiting... (0)
Jan 22 00:20:00 compute-0 systemd[1]: libpod-b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4.scope: Deactivated successfully.
Jan 22 00:20:00 compute-0 podman[235728]: 2026-01-22 00:20:00.672532899 +0000 UTC m=+0.117420818 container died b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:20:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4-userdata-shm.mount: Deactivated successfully.
Jan 22 00:20:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-431d9f4661f435f3dd6f46868c1d8d6f982bb23ae1347493f4575c2a9149d4b3-merged.mount: Deactivated successfully.
Jan 22 00:20:00 compute-0 podman[235728]: 2026-01-22 00:20:00.735506798 +0000 UTC m=+0.180394717 container cleanup b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 00:20:00 compute-0 systemd[1]: libpod-conmon-b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4.scope: Deactivated successfully.
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.760 182939 DEBUG nova.compute.manager [req-59173ae3-855b-4116-8ec8-caa0945933e1 req-65b09209-b713-4891-88d2-8b7a8c2a6f28 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Received event network-vif-unplugged-1703eb18-c425-4648-b9be-39cfa1090d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.760 182939 DEBUG oslo_concurrency.lockutils [req-59173ae3-855b-4116-8ec8-caa0945933e1 req-65b09209-b713-4891-88d2-8b7a8c2a6f28 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.761 182939 DEBUG oslo_concurrency.lockutils [req-59173ae3-855b-4116-8ec8-caa0945933e1 req-65b09209-b713-4891-88d2-8b7a8c2a6f28 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.761 182939 DEBUG oslo_concurrency.lockutils [req-59173ae3-855b-4116-8ec8-caa0945933e1 req-65b09209-b713-4891-88d2-8b7a8c2a6f28 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.761 182939 DEBUG nova.compute.manager [req-59173ae3-855b-4116-8ec8-caa0945933e1 req-65b09209-b713-4891-88d2-8b7a8c2a6f28 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] No waiting events found dispatching network-vif-unplugged-1703eb18-c425-4648-b9be-39cfa1090d95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:20:00 compute-0 nova_compute[182935]: 2026-01-22 00:20:00.761 182939 WARNING nova.compute.manager [req-59173ae3-855b-4116-8ec8-caa0945933e1 req-65b09209-b713-4891-88d2-8b7a8c2a6f28 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Received unexpected event network-vif-unplugged-1703eb18-c425-4648-b9be-39cfa1090d95 for instance with vm_state active and task_state powering-off.
Jan 22 00:20:01 compute-0 nova_compute[182935]: 2026-01-22 00:20:01.189 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:01 compute-0 nova_compute[182935]: 2026-01-22 00:20:01.249 182939 INFO nova.virt.libvirt.driver [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Instance shutdown successfully after 13 seconds.
Jan 22 00:20:01 compute-0 nova_compute[182935]: 2026-01-22 00:20:01.255 182939 INFO nova.virt.libvirt.driver [-] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Instance destroyed successfully.
Jan 22 00:20:01 compute-0 nova_compute[182935]: 2026-01-22 00:20:01.255 182939 DEBUG nova.objects.instance [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'numa_topology' on Instance uuid af4c2559-6129-4c4e-92f5-c991d58e4225 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:20:01 compute-0 nova_compute[182935]: 2026-01-22 00:20:01.275 182939 DEBUG nova.compute.manager [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:01 compute-0 podman[235771]: 2026-01-22 00:20:01.282931786 +0000 UTC m=+0.526499710 container remove b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 00:20:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:01.289 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d071a81d-29e6-4a81-b527-199e7060a02f]: (4, ('Thu Jan 22 12:20:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4)\nb509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4\nThu Jan 22 12:20:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 (b509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4)\nb509e1beb000fdddc21718ba590a547be5f47435a6adc196319e030e1834dad4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:01.291 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3aea531d-f0d8-49b7-8501-0671204c52da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:01.292 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaabf11c6-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:01 compute-0 nova_compute[182935]: 2026-01-22 00:20:01.294 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:01 compute-0 kernel: tapaabf11c6-e0: left promiscuous mode
Jan 22 00:20:01 compute-0 nova_compute[182935]: 2026-01-22 00:20:01.309 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:01.313 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b111ed46-d11f-4cbd-b714-b87a237f6730]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:01.332 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8898ca-f959-49c9-a0d3-980be17e2b39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:01.333 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[515a0597-f817-487f-8155-e646b9f94756]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:01.349 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[58e6e552-fa74-4157-8b4b-44f159de924b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 568426, 'reachable_time': 22046, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235790, 'error': None, 'target': 'ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:01.352 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-aabf11c6-ef94-408a-8148-6c6400566606 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:20:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:01.352 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[9cff31e2-e28e-48d8-af06-b85a1a410175]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:01 compute-0 systemd[1]: run-netns-ovnmeta\x2daabf11c6\x2def94\x2d408a\x2d8148\x2d6c6400566606.mount: Deactivated successfully.
Jan 22 00:20:01 compute-0 nova_compute[182935]: 2026-01-22 00:20:01.405 182939 DEBUG oslo_concurrency.lockutils [None req-417325bf-d8aa-4871-880c-f1daee76562c 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:02 compute-0 nova_compute[182935]: 2026-01-22 00:20:02.827 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:02 compute-0 nova_compute[182935]: 2026-01-22 00:20:02.853 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:02 compute-0 nova_compute[182935]: 2026-01-22 00:20:02.910 182939 DEBUG nova.compute.manager [req-62089da2-21c7-419b-b1cd-57275586edac req-87094a21-fbfb-4616-bea9-a63f05df4c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Received event network-vif-plugged-1703eb18-c425-4648-b9be-39cfa1090d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:02 compute-0 nova_compute[182935]: 2026-01-22 00:20:02.911 182939 DEBUG oslo_concurrency.lockutils [req-62089da2-21c7-419b-b1cd-57275586edac req-87094a21-fbfb-4616-bea9-a63f05df4c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:02 compute-0 nova_compute[182935]: 2026-01-22 00:20:02.911 182939 DEBUG oslo_concurrency.lockutils [req-62089da2-21c7-419b-b1cd-57275586edac req-87094a21-fbfb-4616-bea9-a63f05df4c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:02 compute-0 nova_compute[182935]: 2026-01-22 00:20:02.911 182939 DEBUG oslo_concurrency.lockutils [req-62089da2-21c7-419b-b1cd-57275586edac req-87094a21-fbfb-4616-bea9-a63f05df4c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:02 compute-0 nova_compute[182935]: 2026-01-22 00:20:02.911 182939 DEBUG nova.compute.manager [req-62089da2-21c7-419b-b1cd-57275586edac req-87094a21-fbfb-4616-bea9-a63f05df4c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] No waiting events found dispatching network-vif-plugged-1703eb18-c425-4648-b9be-39cfa1090d95 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:20:02 compute-0 nova_compute[182935]: 2026-01-22 00:20:02.912 182939 WARNING nova.compute.manager [req-62089da2-21c7-419b-b1cd-57275586edac req-87094a21-fbfb-4616-bea9-a63f05df4c97 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Received unexpected event network-vif-plugged-1703eb18-c425-4648-b9be-39cfa1090d95 for instance with vm_state stopped and task_state None.
Jan 22 00:20:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:03.217 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:03.218 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:03.218 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:03 compute-0 nova_compute[182935]: 2026-01-22 00:20:03.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:04 compute-0 nova_compute[182935]: 2026-01-22 00:20:04.989 182939 DEBUG oslo_concurrency.lockutils [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "af4c2559-6129-4c4e-92f5-c991d58e4225" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:04 compute-0 nova_compute[182935]: 2026-01-22 00:20:04.989 182939 DEBUG oslo_concurrency.lockutils [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:04 compute-0 nova_compute[182935]: 2026-01-22 00:20:04.990 182939 DEBUG oslo_concurrency.lockutils [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:04 compute-0 nova_compute[182935]: 2026-01-22 00:20:04.990 182939 DEBUG oslo_concurrency.lockutils [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:04 compute-0 nova_compute[182935]: 2026-01-22 00:20:04.990 182939 DEBUG oslo_concurrency.lockutils [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.001 182939 INFO nova.compute.manager [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Terminating instance
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.014 182939 DEBUG nova.compute.manager [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.021 182939 INFO nova.virt.libvirt.driver [-] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Instance destroyed successfully.
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.021 182939 DEBUG nova.objects.instance [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lazy-loading 'resources' on Instance uuid af4c2559-6129-4c4e-92f5-c991d58e4225 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.036 182939 DEBUG nova.virt.libvirt.vif [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:19:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1466679265',display_name='tempest-Íñstáñcé-352924762',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1466679265',id=147,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:19:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3e408650207b498c8d115fd0c4f776dc',ramdisk_id='',reservation_id='r-3i4g2m0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-374007797',owner_user_name='tempest-ServersTestJSON-374007797-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:20:03Z,user_data=None,user_id='5eb4e81f0cef4003ae49faa67b3f17c3',uuid=af4c2559-6129-4c4e-92f5-c991d58e4225,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "1703eb18-c425-4648-b9be-39cfa1090d95", "address": "fa:16:3e:56:8b:61", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1703eb18-c4", "ovs_interfaceid": "1703eb18-c425-4648-b9be-39cfa1090d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.037 182939 DEBUG nova.network.os_vif_util [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converting VIF {"id": "1703eb18-c425-4648-b9be-39cfa1090d95", "address": "fa:16:3e:56:8b:61", "network": {"id": "aabf11c6-ef94-408a-8148-6c6400566606", "bridge": "br-int", "label": "tempest-ServersTestJSON-341875047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3e408650207b498c8d115fd0c4f776dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1703eb18-c4", "ovs_interfaceid": "1703eb18-c425-4648-b9be-39cfa1090d95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.037 182939 DEBUG nova.network.os_vif_util [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:8b:61,bridge_name='br-int',has_traffic_filtering=True,id=1703eb18-c425-4648-b9be-39cfa1090d95,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1703eb18-c4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.038 182939 DEBUG os_vif [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:8b:61,bridge_name='br-int',has_traffic_filtering=True,id=1703eb18-c425-4648-b9be-39cfa1090d95,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1703eb18-c4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.039 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.039 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1703eb18-c4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.040 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.042 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.044 182939 INFO os_vif [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:8b:61,bridge_name='br-int',has_traffic_filtering=True,id=1703eb18-c425-4648-b9be-39cfa1090d95,network=Network(aabf11c6-ef94-408a-8148-6c6400566606),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1703eb18-c4')
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.044 182939 INFO nova.virt.libvirt.driver [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Deleting instance files /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225_del
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.045 182939 INFO nova.virt.libvirt.driver [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Deletion of /var/lib/nova/instances/af4c2559-6129-4c4e-92f5-c991d58e4225_del complete
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.397 182939 INFO nova.compute.manager [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.398 182939 DEBUG oslo.service.loopingcall [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.399 182939 DEBUG nova.compute.manager [-] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:20:05 compute-0 nova_compute[182935]: 2026-01-22 00:20:05.400 182939 DEBUG nova.network.neutron [-] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.608 182939 DEBUG nova.network.neutron [-] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.626 182939 INFO nova.compute.manager [-] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Took 1.23 seconds to deallocate network for instance.
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.709 182939 DEBUG oslo_concurrency.lockutils [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.710 182939 DEBUG oslo_concurrency.lockutils [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.734 182939 DEBUG nova.compute.manager [req-abff8664-a9d3-4b26-9a82-945528cfb18c req-87baa193-8c22-4bd2-a64f-5e202edbaa6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Received event network-vif-deleted-1703eb18-c425-4648-b9be-39cfa1090d95 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.821 182939 DEBUG nova.compute.provider_tree [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.839 182939 DEBUG nova.scheduler.client.report [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.846 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.847 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.880 182939 DEBUG oslo_concurrency.lockutils [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.895 182939 DEBUG nova.compute.manager [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:20:06 compute-0 nova_compute[182935]: 2026-01-22 00:20:06.929 182939 INFO nova.scheduler.client.report [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Deleted allocations for instance af4c2559-6129-4c4e-92f5-c991d58e4225
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.021 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.021 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.029 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.029 182939 INFO nova.compute.claims [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.039 182939 DEBUG oslo_concurrency.lockutils [None req-096fb042-6203-4244-9ff5-47b91f405ea0 5eb4e81f0cef4003ae49faa67b3f17c3 3e408650207b498c8d115fd0c4f776dc - - default default] Lock "af4c2559-6129-4c4e-92f5-c991d58e4225" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.210 182939 DEBUG nova.compute.provider_tree [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.227 182939 DEBUG nova.scheduler.client.report [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.263 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.264 182939 DEBUG nova.compute.manager [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.335 182939 DEBUG nova.compute.manager [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.336 182939 DEBUG nova.network.neutron [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.375 182939 INFO nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.407 182939 DEBUG nova.compute.manager [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.550 182939 DEBUG nova.compute.manager [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.551 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.552 182939 INFO nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Creating image(s)
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.552 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "/var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.553 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.553 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.565 182939 DEBUG nova.policy [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.568 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.628 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.629 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.630 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.648 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.711 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.712 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.794 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041192.7938344, d59e0943-5372-4680-af52-c9af874c8578 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.795 182939 INFO nova.compute.manager [-] [instance: d59e0943-5372-4680-af52-c9af874c8578] VM Stopped (Lifecycle Event)
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.824 182939 DEBUG nova.compute.manager [None req-40e22ad8-face-4cf0-b401-8eb94ea3bc83 - - - - - -] [instance: d59e0943-5372-4680-af52-c9af874c8578] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.856 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.906 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk 1073741824" returned: 0 in 0.194s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.907 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.907 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.967 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.968 182939 DEBUG nova.virt.disk.api [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Checking if we can resize image /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:20:07 compute-0 nova_compute[182935]: 2026-01-22 00:20:07.968 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:08 compute-0 nova_compute[182935]: 2026-01-22 00:20:08.025 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:08 compute-0 nova_compute[182935]: 2026-01-22 00:20:08.026 182939 DEBUG nova.virt.disk.api [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Cannot resize image /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:20:08 compute-0 nova_compute[182935]: 2026-01-22 00:20:08.027 182939 DEBUG nova.objects.instance [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'migration_context' on Instance uuid b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:20:08 compute-0 nova_compute[182935]: 2026-01-22 00:20:08.042 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:20:08 compute-0 nova_compute[182935]: 2026-01-22 00:20:08.042 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Ensure instance console log exists: /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:20:08 compute-0 nova_compute[182935]: 2026-01-22 00:20:08.043 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:08 compute-0 nova_compute[182935]: 2026-01-22 00:20:08.043 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:08 compute-0 nova_compute[182935]: 2026-01-22 00:20:08.043 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:09 compute-0 nova_compute[182935]: 2026-01-22 00:20:09.446 182939 DEBUG nova.network.neutron [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Successfully created port: 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:20:10 compute-0 nova_compute[182935]: 2026-01-22 00:20:10.041 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:11 compute-0 nova_compute[182935]: 2026-01-22 00:20:11.518 182939 DEBUG nova.network.neutron [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Successfully updated port: 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:20:11 compute-0 nova_compute[182935]: 2026-01-22 00:20:11.537 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:20:11 compute-0 nova_compute[182935]: 2026-01-22 00:20:11.537 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquired lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:20:11 compute-0 nova_compute[182935]: 2026-01-22 00:20:11.538 182939 DEBUG nova.network.neutron [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:20:11 compute-0 nova_compute[182935]: 2026-01-22 00:20:11.710 182939 DEBUG nova.network.neutron [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:20:11 compute-0 nova_compute[182935]: 2026-01-22 00:20:11.835 182939 DEBUG nova.compute.manager [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Received event network-changed-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:11 compute-0 nova_compute[182935]: 2026-01-22 00:20:11.835 182939 DEBUG nova.compute.manager [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Refreshing instance network info cache due to event network-changed-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:20:11 compute-0 nova_compute[182935]: 2026-01-22 00:20:11.836 182939 DEBUG oslo_concurrency.lockutils [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:20:12 compute-0 nova_compute[182935]: 2026-01-22 00:20:12.858 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.283 182939 DEBUG nova.network.neutron [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Updating instance_info_cache with network_info: [{"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.311 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Releasing lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.312 182939 DEBUG nova.compute.manager [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Instance network_info: |[{"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.312 182939 DEBUG oslo_concurrency.lockutils [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.312 182939 DEBUG nova.network.neutron [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Refreshing network info cache for port 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.316 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Start _get_guest_xml network_info=[{"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.320 182939 WARNING nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.325 182939 DEBUG nova.virt.libvirt.host [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.325 182939 DEBUG nova.virt.libvirt.host [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.331 182939 DEBUG nova.virt.libvirt.host [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.332 182939 DEBUG nova.virt.libvirt.host [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.333 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.334 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.334 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.335 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.335 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.335 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.335 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.335 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.336 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.336 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.336 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.336 182939 DEBUG nova.virt.hardware [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.340 182939 DEBUG nova.virt.libvirt.vif [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=150,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOgFr0loz0o97S1yJic425BuuGnqIIzzaQU+1FOWYN8VLWjMOBgkt02kLpdfipR3QnvdUvT3mVD/diPnm35tClCs6BoaTbQN3VWq8tyqhLXUA2JeTkyyUA3yLrgO9t4ag==',key_name='tempest-TestSecurityGroupsBasicOps-1152614963',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-jryb8knd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:20:07Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.340 182939 DEBUG nova.network.os_vif_util [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.341 182939 DEBUG nova.network.os_vif_util [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:6d:17,bridge_name='br-int',has_traffic_filtering=True,id=96c8ac3c-cf3f-4189-9267-c0a4096a0fb1,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96c8ac3c-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.342 182939 DEBUG nova.objects.instance [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'pci_devices' on Instance uuid b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.366 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:20:13 compute-0 nova_compute[182935]:   <uuid>b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd</uuid>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   <name>instance-00000096</name>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333</nova:name>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:20:13</nova:creationTime>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:20:13 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:20:13 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:20:13 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:20:13 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:20:13 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:20:13 compute-0 nova_compute[182935]:         <nova:user uuid="a60ce2b7b7ae47b484de12add551b287">tempest-TestSecurityGroupsBasicOps-1492736128-project-member</nova:user>
Jan 22 00:20:13 compute-0 nova_compute[182935]:         <nova:project uuid="02bcfc5f1f1044a3856e73a5938ff011">tempest-TestSecurityGroupsBasicOps-1492736128</nova:project>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:20:13 compute-0 nova_compute[182935]:         <nova:port uuid="96c8ac3c-cf3f-4189-9267-c0a4096a0fb1">
Jan 22 00:20:13 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <system>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <entry name="serial">b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd</entry>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <entry name="uuid">b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd</entry>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     </system>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   <os>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   </os>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   <features>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   </features>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.config"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:a0:6d:17"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <target dev="tap96c8ac3c-cf"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/console.log" append="off"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <video>
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     </video>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:20:13 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:20:13 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:20:13 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:20:13 compute-0 nova_compute[182935]: </domain>
Jan 22 00:20:13 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.367 182939 DEBUG nova.compute.manager [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Preparing to wait for external event network-vif-plugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.367 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.368 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.368 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.368 182939 DEBUG nova.virt.libvirt.vif [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=150,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOgFr0loz0o97S1yJic425BuuGnqIIzzaQU+1FOWYN8VLWjMOBgkt02kLpdfipR3QnvdUvT3mVD/diPnm35tClCs6BoaTbQN3VWq8tyqhLXUA2JeTkyyUA3yLrgO9t4ag==',key_name='tempest-TestSecurityGroupsBasicOps-1152614963',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-jryb8knd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:20:07Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.369 182939 DEBUG nova.network.os_vif_util [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.369 182939 DEBUG nova.network.os_vif_util [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:6d:17,bridge_name='br-int',has_traffic_filtering=True,id=96c8ac3c-cf3f-4189-9267-c0a4096a0fb1,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96c8ac3c-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.369 182939 DEBUG os_vif [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:6d:17,bridge_name='br-int',has_traffic_filtering=True,id=96c8ac3c-cf3f-4189-9267-c0a4096a0fb1,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96c8ac3c-cf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.370 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.370 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.371 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.373 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.374 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap96c8ac3c-cf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.375 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap96c8ac3c-cf, col_values=(('external_ids', {'iface-id': '96c8ac3c-cf3f-4189-9267-c0a4096a0fb1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:6d:17', 'vm-uuid': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.377 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:13 compute-0 NetworkManager[55139]: <info>  [1769041213.3782] manager: (tap96c8ac3c-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.380 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.381 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.382 182939 INFO os_vif [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:6d:17,bridge_name='br-int',has_traffic_filtering=True,id=96c8ac3c-cf3f-4189-9267-c0a4096a0fb1,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96c8ac3c-cf')
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.594 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.594 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.595 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No VIF found with MAC fa:16:3e:a0:6d:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:20:13 compute-0 nova_compute[182935]: 2026-01-22 00:20:13.596 182939 INFO nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Using config drive
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.348 182939 INFO nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Creating config drive at /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.config
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.354 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxwatbn8n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.481 182939 DEBUG oslo_concurrency.processutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxwatbn8n" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:14 compute-0 kernel: tap96c8ac3c-cf: entered promiscuous mode
Jan 22 00:20:14 compute-0 NetworkManager[55139]: <info>  [1769041214.5696] manager: (tap96c8ac3c-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Jan 22 00:20:14 compute-0 ovn_controller[95047]: 2026-01-22T00:20:14Z|00553|binding|INFO|Claiming lport 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 for this chassis.
Jan 22 00:20:14 compute-0 ovn_controller[95047]: 2026-01-22T00:20:14Z|00554|binding|INFO|96c8ac3c-cf3f-4189-9267-c0a4096a0fb1: Claiming fa:16:3e:a0:6d:17 10.100.0.10
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.569 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.578 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:6d:17 10.100.0.10'], port_security=['fa:16:3e:a0:6d:17 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65641ee3-5688-4f52-8e2b-2aae97505b84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9d5746ab-567f-4771-baec-483e6edef99f ac71d239-4d62-4e17-b02e-055a8db336af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3d414ff-1f29-4cd2-96c4-c90cd0d603fc, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=96c8ac3c-cf3f-4189-9267-c0a4096a0fb1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.579 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 in datapath 65641ee3-5688-4f52-8e2b-2aae97505b84 bound to our chassis
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.581 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 65641ee3-5688-4f52-8e2b-2aae97505b84
Jan 22 00:20:14 compute-0 ovn_controller[95047]: 2026-01-22T00:20:14Z|00555|binding|INFO|Setting lport 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 ovn-installed in OVS
Jan 22 00:20:14 compute-0 ovn_controller[95047]: 2026-01-22T00:20:14Z|00556|binding|INFO|Setting lport 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 up in Southbound
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.588 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.591 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.595 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[682b77dc-ff49-4de4-86f7-ad5012b162e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.597 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap65641ee3-51 in ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.599 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap65641ee3-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.599 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[954f3775-a1ed-48ea-b932-a8a7b92c2ff7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.600 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4aee03d7-8784-4b37-b5b7-cae97e163e32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 systemd-machined[154182]: New machine qemu-74-instance-00000096.
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.614 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[85123493-7f69-4385-8e22-4b866346fb9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 systemd-udevd[235829]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.628 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c8d9ad21-29e4-4ed0-98d3-d269b1eda7d6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 systemd[1]: Started Virtual Machine qemu-74-instance-00000096.
Jan 22 00:20:14 compute-0 NetworkManager[55139]: <info>  [1769041214.6341] device (tap96c8ac3c-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:20:14 compute-0 NetworkManager[55139]: <info>  [1769041214.6353] device (tap96c8ac3c-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.660 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[848c642e-abd0-4d65-a8f7-1f194e22c511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.665 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ff9bf490-4b80-421b-ab4d-7b6e65061a02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 NetworkManager[55139]: <info>  [1769041214.6669] manager: (tap65641ee3-50): new Veth device (/org/freedesktop/NetworkManager/Devices/268)
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.700 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[e514d7ce-c71c-4335-8281-40021b38e9d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.704 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[e0f186e4-8b94-45e0-8c44-19bebc0cc36f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 NetworkManager[55139]: <info>  [1769041214.7276] device (tap65641ee3-50): carrier: link connected
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.732 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[0f7b2a76-8853-409e-80fc-e55d0e2cf9ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.751 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9be2f22e-a6b1-4898-9f33-04f1c4db04d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65641ee3-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:51:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571746, 'reachable_time': 43329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235860, 'error': None, 'target': 'ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.767 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[432fef57-7734-454b-90fc-9da089debfe6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedf:51e7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 571746, 'tstamp': 571746}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235861, 'error': None, 'target': 'ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.784 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d5dd3da6-d194-4da4-b01e-ef77960f69b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65641ee3-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:df:51:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 178], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571746, 'reachable_time': 43329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235862, 'error': None, 'target': 'ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.807 182939 DEBUG nova.compute.manager [req-117cadcf-1668-42e5-a20e-620c2f25b60b req-4a271164-098b-4adc-8828-b9683c3dc556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Received event network-vif-plugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.808 182939 DEBUG oslo_concurrency.lockutils [req-117cadcf-1668-42e5-a20e-620c2f25b60b req-4a271164-098b-4adc-8828-b9683c3dc556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.808 182939 DEBUG oslo_concurrency.lockutils [req-117cadcf-1668-42e5-a20e-620c2f25b60b req-4a271164-098b-4adc-8828-b9683c3dc556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.808 182939 DEBUG oslo_concurrency.lockutils [req-117cadcf-1668-42e5-a20e-620c2f25b60b req-4a271164-098b-4adc-8828-b9683c3dc556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.809 182939 DEBUG nova.compute.manager [req-117cadcf-1668-42e5-a20e-620c2f25b60b req-4a271164-098b-4adc-8828-b9683c3dc556 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Processing event network-vif-plugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.813 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[33126fc4-c3aa-4e06-9c81-b2bc23762dcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.885 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e52fb6f3-0651-48da-afae-70a3fb2d7d1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.887 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65641ee3-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.888 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.888 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65641ee3-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:14 compute-0 NetworkManager[55139]: <info>  [1769041214.8911] manager: (tap65641ee3-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Jan 22 00:20:14 compute-0 kernel: tap65641ee3-50: entered promiscuous mode
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.892 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.894 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap65641ee3-50, col_values=(('external_ids', {'iface-id': '737a2d1f-ad8c-46d7-ba36-880bbc6b5728'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:14 compute-0 ovn_controller[95047]: 2026-01-22T00:20:14Z|00557|binding|INFO|Releasing lport 737a2d1f-ad8c-46d7-ba36-880bbc6b5728 from this chassis (sb_readonly=0)
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.897 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/65641ee3-5688-4f52-8e2b-2aae97505b84.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/65641ee3-5688-4f52-8e2b-2aae97505b84.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.907 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cf08a979-a6f3-434f-b2d0-0a6ad9553df5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.908 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-65641ee3-5688-4f52-8e2b-2aae97505b84
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/65641ee3-5688-4f52-8e2b-2aae97505b84.pid.haproxy
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 65641ee3-5688-4f52-8e2b-2aae97505b84
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:20:14 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:14.909 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84', 'env', 'PROCESS_TAG=haproxy-65641ee3-5688-4f52-8e2b-2aae97505b84', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/65641ee3-5688-4f52-8e2b-2aae97505b84.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:20:14 compute-0 nova_compute[182935]: 2026-01-22 00:20:14.909 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.093 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041215.092337, b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.093 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] VM Started (Lifecycle Event)
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.097 182939 DEBUG nova.compute.manager [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.101 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.105 182939 INFO nova.virt.libvirt.driver [-] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Instance spawned successfully.
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.106 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.128 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.143 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.150 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.151 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.152 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.152 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.152 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.153 182939 DEBUG nova.virt.libvirt.driver [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.195 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.195 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041215.0925872, b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.195 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] VM Paused (Lifecycle Event)
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.248 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.252 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041215.100198, b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.252 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] VM Resumed (Lifecycle Event)
Jan 22 00:20:15 compute-0 podman[235897]: 2026-01-22 00:20:15.277502607 +0000 UTC m=+0.051392784 container create 51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.298 182939 INFO nova.compute.manager [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Took 7.75 seconds to spawn the instance on the hypervisor.
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.299 182939 DEBUG nova.compute.manager [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.300 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.307 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:20:15 compute-0 systemd[1]: Started libpod-conmon-51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60.scope.
Jan 22 00:20:15 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:20:15 compute-0 podman[235897]: 2026-01-22 00:20:15.24822291 +0000 UTC m=+0.022112897 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:20:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79f14246ff4056350e67e9ac4fe9fb224554c6e15ccf66975861a2c0f404c5b0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:20:15 compute-0 podman[235897]: 2026-01-22 00:20:15.360311468 +0000 UTC m=+0.134201425 container init 51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.360 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:20:15 compute-0 podman[235897]: 2026-01-22 00:20:15.366650419 +0000 UTC m=+0.140540386 container start 51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:20:15 compute-0 podman[235913]: 2026-01-22 00:20:15.371494434 +0000 UTC m=+0.054876497 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:20:15 compute-0 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[235918]: [NOTICE]   (235959) : New worker (235965) forked
Jan 22 00:20:15 compute-0 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[235918]: [NOTICE]   (235959) : Loading success.
Jan 22 00:20:15 compute-0 podman[235910]: 2026-01-22 00:20:15.4066352 +0000 UTC m=+0.090554305 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.423 182939 INFO nova.compute.manager [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Took 8.45 seconds to build instance.
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.441 182939 DEBUG oslo_concurrency.lockutils [None req-7ad0df0a-7c1f-4745-ac73-b00bbf54ec13 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.726 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041200.7245088, af4c2559-6129-4c4e-92f5-c991d58e4225 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.727 182939 INFO nova.compute.manager [-] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] VM Stopped (Lifecycle Event)
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.752 182939 DEBUG nova.compute.manager [None req-f33aa483-6676-418b-a877-ef8226e384ad - - - - - -] [instance: af4c2559-6129-4c4e-92f5-c991d58e4225] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.800 182939 DEBUG nova.network.neutron [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Updated VIF entry in instance network info cache for port 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.801 182939 DEBUG nova.network.neutron [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Updating instance_info_cache with network_info: [{"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:15 compute-0 nova_compute[182935]: 2026-01-22 00:20:15.861 182939 DEBUG oslo_concurrency.lockutils [req-7f4a3e93-5e15-4766-b7b7-3349fe4c4cd2 req-0998ee2a-167b-40d1-a525-9cc60ba3fb6f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:20:16 compute-0 nova_compute[182935]: 2026-01-22 00:20:16.924 182939 DEBUG nova.compute.manager [req-3f66aa1b-e913-425e-b35e-638997b57d84 req-791a3be1-d51b-4f02-b83b-ee30ec09c0cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Received event network-vif-plugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:16 compute-0 nova_compute[182935]: 2026-01-22 00:20:16.925 182939 DEBUG oslo_concurrency.lockutils [req-3f66aa1b-e913-425e-b35e-638997b57d84 req-791a3be1-d51b-4f02-b83b-ee30ec09c0cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:16 compute-0 nova_compute[182935]: 2026-01-22 00:20:16.925 182939 DEBUG oslo_concurrency.lockutils [req-3f66aa1b-e913-425e-b35e-638997b57d84 req-791a3be1-d51b-4f02-b83b-ee30ec09c0cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:16 compute-0 nova_compute[182935]: 2026-01-22 00:20:16.926 182939 DEBUG oslo_concurrency.lockutils [req-3f66aa1b-e913-425e-b35e-638997b57d84 req-791a3be1-d51b-4f02-b83b-ee30ec09c0cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:16 compute-0 nova_compute[182935]: 2026-01-22 00:20:16.926 182939 DEBUG nova.compute.manager [req-3f66aa1b-e913-425e-b35e-638997b57d84 req-791a3be1-d51b-4f02-b83b-ee30ec09c0cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] No waiting events found dispatching network-vif-plugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:20:16 compute-0 nova_compute[182935]: 2026-01-22 00:20:16.926 182939 WARNING nova.compute.manager [req-3f66aa1b-e913-425e-b35e-638997b57d84 req-791a3be1-d51b-4f02-b83b-ee30ec09c0cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Received unexpected event network-vif-plugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 for instance with vm_state active and task_state None.
Jan 22 00:20:17 compute-0 podman[235976]: 2026-01-22 00:20:17.674706515 +0000 UTC m=+0.051793054 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:20:17 compute-0 nova_compute[182935]: 2026-01-22 00:20:17.860 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:18 compute-0 nova_compute[182935]: 2026-01-22 00:20:18.378 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-0 ovn_controller[95047]: 2026-01-22T00:20:19Z|00558|binding|INFO|Releasing lport 737a2d1f-ad8c-46d7-ba36-880bbc6b5728 from this chassis (sb_readonly=0)
Jan 22 00:20:19 compute-0 nova_compute[182935]: 2026-01-22 00:20:19.226 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:19 compute-0 ovn_controller[95047]: 2026-01-22T00:20:19Z|00559|binding|INFO|Releasing lport 737a2d1f-ad8c-46d7-ba36-880bbc6b5728 from this chassis (sb_readonly=0)
Jan 22 00:20:19 compute-0 nova_compute[182935]: 2026-01-22 00:20:19.366 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:20 compute-0 nova_compute[182935]: 2026-01-22 00:20:20.629 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:20 compute-0 NetworkManager[55139]: <info>  [1769041220.6307] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Jan 22 00:20:20 compute-0 NetworkManager[55139]: <info>  [1769041220.6320] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Jan 22 00:20:20 compute-0 nova_compute[182935]: 2026-01-22 00:20:20.757 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:20 compute-0 ovn_controller[95047]: 2026-01-22T00:20:20Z|00560|binding|INFO|Releasing lport 737a2d1f-ad8c-46d7-ba36-880bbc6b5728 from this chassis (sb_readonly=0)
Jan 22 00:20:20 compute-0 nova_compute[182935]: 2026-01-22 00:20:20.774 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:21 compute-0 nova_compute[182935]: 2026-01-22 00:20:21.441 182939 DEBUG nova.compute.manager [req-9c6ae6d0-d766-4bc7-bb1c-06d2d9222776 req-ad586a88-0480-423e-ae3b-77544f56f9ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Received event network-changed-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:21 compute-0 nova_compute[182935]: 2026-01-22 00:20:21.442 182939 DEBUG nova.compute.manager [req-9c6ae6d0-d766-4bc7-bb1c-06d2d9222776 req-ad586a88-0480-423e-ae3b-77544f56f9ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Refreshing instance network info cache due to event network-changed-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:20:21 compute-0 nova_compute[182935]: 2026-01-22 00:20:21.443 182939 DEBUG oslo_concurrency.lockutils [req-9c6ae6d0-d766-4bc7-bb1c-06d2d9222776 req-ad586a88-0480-423e-ae3b-77544f56f9ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:20:21 compute-0 nova_compute[182935]: 2026-01-22 00:20:21.443 182939 DEBUG oslo_concurrency.lockutils [req-9c6ae6d0-d766-4bc7-bb1c-06d2d9222776 req-ad586a88-0480-423e-ae3b-77544f56f9ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:20:21 compute-0 nova_compute[182935]: 2026-01-22 00:20:21.444 182939 DEBUG nova.network.neutron [req-9c6ae6d0-d766-4bc7-bb1c-06d2d9222776 req-ad586a88-0480-423e-ae3b-77544f56f9ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Refreshing network info cache for port 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:20:22 compute-0 sshd-session[236004]: Invalid user redis from 188.166.69.60 port 44164
Jan 22 00:20:22 compute-0 sshd-session[236004]: Connection closed by invalid user redis 188.166.69.60 port 44164 [preauth]
Jan 22 00:20:22 compute-0 nova_compute[182935]: 2026-01-22 00:20:22.861 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.318 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000096', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '02bcfc5f1f1044a3856e73a5938ff011', 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'hostId': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.319 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.333 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.334 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '722e4ebb-a724-40fc-bd0a-7ac672c49a31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-vda', 'timestamp': '2026-01-22T00:20:23.320119', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249a72da-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.114396816, 'message_signature': 'd5a32f1673c4067122034dca921edb01eb8692732fbe89a8109592895a7983ff'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-sda', 'timestamp': '2026-01-22T00:20:23.320119', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249a8676-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.114396816, 'message_signature': '9dc2159f26a9e2b61504b16495e9e290d878fb9288dc9faef7c9f2ab9dd6039d'}]}, 'timestamp': '2026-01-22 00:20:23.334683', '_unique_id': '4aa4bb04066740b0b52bc07b3cc4d9ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.337 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.338 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.338 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.339 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82af63fe-8137-4dc7-9618-ec6b56b97b60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-vda', 'timestamp': '2026-01-22T00:20:23.338886', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '249b39cc-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.114396816, 'message_signature': '5e759ceceb31a1a8cb60755c5d01288dca0ceec133f32643dfddf140d3419d03'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-sda', 'timestamp': '2026-01-22T00:20:23.338886', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '249b437c-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.114396816, 'message_signature': 'a302a0a89505d6b0b596f4b6a1c27c981cf5df78ddfc0a9917e0cbe54d98298a'}]}, 'timestamp': '2026-01-22 00:20:23.339452', '_unique_id': 'e6dfb0a68eb44739ae7f7fc39b35f69b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.340 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.341 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.344 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd / tap96c8ac3c-cf inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.344 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c276fbee-aa0e-4fc0-8972-4d01615f1392', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-00000096-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-tap96c8ac3c-cf', 'timestamp': '2026-01-22T00:20:23.341493', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'tap96c8ac3c-cf', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:6d:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96c8ac3c-cf'}, 'message_id': '249c290e-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.135760524, 'message_signature': 'a31098430f5c5d2da32dcb41782b9d30cd9bc878258fd6dc3f205d54ce3cf1d3'}]}, 'timestamp': '2026-01-22 00:20:23.345466', '_unique_id': '7fffe8da6b1d429e896ebd4c899d325e'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.346 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.347 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.347 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.347 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333>]
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.348 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.348 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f81d8ff2-3c25-4907-a550-9d229be3b498', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-00000096-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-tap96c8ac3c-cf', 'timestamp': '2026-01-22T00:20:23.348329', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'tap96c8ac3c-cf', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:6d:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96c8ac3c-cf'}, 'message_id': '249caba4-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.135760524, 'message_signature': 'e6b23b0ffb32cd51e291c6a286cd581813366358c6225f8aff66dc718ee3d562'}]}, 'timestamp': '2026-01-22 00:20:23.348640', '_unique_id': '9d6e3f98c2ab42cd8d8ab865a25bc3f4'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.349 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.350 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.350 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af405ff0-78bd-4e99-8467-6eb815de112b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-00000096-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-tap96c8ac3c-cf', 'timestamp': '2026-01-22T00:20:23.350277', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'tap96c8ac3c-cf', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:6d:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96c8ac3c-cf'}, 'message_id': '249cf67c-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.135760524, 'message_signature': '7225c41ee04b65829ea11c76199a6c4d97b97a055521190ccc587bc5f5340515'}]}, 'timestamp': '2026-01-22 00:20:23.350586', '_unique_id': 'cc83149aeb8b4c9cab17826f4a9645f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.351 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:20:23 compute-0 nova_compute[182935]: 2026-01-22 00:20:23.380 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.383 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.384 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33c026e3-e7ca-47fd-9d74-0aeb7a8a323b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-vda', 'timestamp': '2026-01-22T00:20:23.352444', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '24a22a7a-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': 'cd3ead87087bf6fa88c68ab0a07b54d86ce567bffbbc666160db8100000905f0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-sda', 'timestamp': '2026-01-22T00:20:23.352444', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '24a23a88-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': 'b723f96ed28578d555525538b5024bee3c3f9290d09e6a211302f8f3bc29d748'}]}, 'timestamp': '2026-01-22 00:20:23.385069', '_unique_id': '2fcb1b6ad19c4180b8694f9d2a5e41bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.386 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.387 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.387 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bffd0da-1a33-400f-8728-3b0bb3ea7f27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-00000096-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-tap96c8ac3c-cf', 'timestamp': '2026-01-22T00:20:23.387271', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'tap96c8ac3c-cf', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:6d:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96c8ac3c-cf'}, 'message_id': '24a29c08-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.135760524, 'message_signature': 'edeeb746b2a4164e7383985134af243309117484829e076e883d618c32ef1962'}]}, 'timestamp': '2026-01-22 00:20:23.387581', '_unique_id': 'c3a889d1a22e495fa9e4fa82e7751168'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.388 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.389 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.389 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.389 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333>]
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.389 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.389 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b20e3e20-c024-4d6d-af57-b0640ee8986b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-00000096-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-tap96c8ac3c-cf', 'timestamp': '2026-01-22T00:20:23.389958', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'tap96c8ac3c-cf', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:6d:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96c8ac3c-cf'}, 'message_id': '24a304a4-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.135760524, 'message_signature': '2f8b6a6b15bbf41e3020d92a2a48656eecc3c419342edad92ca5eab4b1c6b986'}]}, 'timestamp': '2026-01-22 00:20:23.390242', '_unique_id': '674334f917a14be28432ca67d1b6cab6'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.390 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.391 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.391 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.read.latency volume: 146626274 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.read.latency volume: 693036 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcd24e5d-3d8c-44b2-b3fc-84cef6c5b538', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 146626274, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-vda', 'timestamp': '2026-01-22T00:20:23.391878', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '24a34f72-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': '221aaab18df224efc2a70d7aac3cf7ff4968fc55d9451562ac8fdb1fefe850cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 693036, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': 
'02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-sda', 'timestamp': '2026-01-22T00:20:23.391878', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '24a35a58-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': 'f8703ec535f75689213af9b20993fa9d89eaaf36feee013a01cdc9db523fb49e'}]}, 'timestamp': '2026-01-22 00:20:23.392414', '_unique_id': '1d0becd200c44a8e92b186d376d8384e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.392 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.393 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.393 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '357dd50c-0936-436b-a94d-0e2f27f710c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-vda', 'timestamp': '2026-01-22T00:20:23.393897', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '24a39e46-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': '3689bfb89a3c9dbe178081f863ee2889c6d44bdb26755ee683cb3c376b147985'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-sda', 'timestamp': '2026-01-22T00:20:23.393897', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '24a3a79c-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': 'a7108abe4bf07e950ec86b458866d5e0532fbfbbc50585ddffccc89de16f430b'}]}, 'timestamp': '2026-01-22 00:20:23.394391', '_unique_id': '38b0878e44834b53a5bb9a97a00a0585'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.394 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.395 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.395 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a76aa531-dc5e-47b6-9e9c-21daea46263e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-00000096-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-tap96c8ac3c-cf', 'timestamp': '2026-01-22T00:20:23.395877', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'tap96c8ac3c-cf', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:6d:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96c8ac3c-cf'}, 'message_id': '24a3ebc6-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.135760524, 'message_signature': '078da8f97cb08ad7c5531d10b0cf8aac20fb662d6ea2fdc862b65ef09c067f7b'}]}, 'timestamp': '2026-01-22 00:20:23.396152', '_unique_id': '0ab5758ce32847288be3ba17206476be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.396 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.397 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.397 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.397 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e590748-96f3-4a20-beec-e5c5576fcb2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-vda', 'timestamp': '2026-01-22T00:20:23.397660', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '24a43216-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.114396816, 'message_signature': '66c851acefd11c97cb21b89450909f6fda671fb2bf1fa7334a795809cf48a78b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-sda', 'timestamp': '2026-01-22T00:20:23.397660', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '24a43d06-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.114396816, 'message_signature': '8325d4329dc7581446550b695456c7713b25670ceca1a2a7635f62fa0886d37f'}]}, 'timestamp': '2026-01-22 00:20:23.398217', '_unique_id': '7064ff0b49cd499cbf28e89978bcf53c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.399 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.399 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.399 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333>]
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.400 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.416 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/cpu volume: 8030000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9a8cb09-bf48-48b1-a6ad-70655f6c827e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8030000000, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'timestamp': '2026-01-22T00:20:23.400117', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '24a7217e-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.210737339, 'message_signature': 'afc14ad88c447743b6438fbcd4afa92c767c03d5a1c82fa088877ec5c686d2c3'}]}, 'timestamp': '2026-01-22 00:20:23.417277', '_unique_id': 'ea25ac814326447082b339dea7205270'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.418 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.419 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.419 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73c2cf21-317a-47b9-8576-76b8f60e621a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-00000096-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-tap96c8ac3c-cf', 'timestamp': '2026-01-22T00:20:23.419254', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'tap96c8ac3c-cf', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:6d:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96c8ac3c-cf'}, 'message_id': '24a77d4a-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.135760524, 'message_signature': '427d1064f274a70212ae6ae08af1672841f59ed041dfc5b2a8e13de530428cff'}]}, 'timestamp': '2026-01-22 00:20:23.419556', '_unique_id': '405ec14cd6454e409415062eb19cb4e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.420 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.421 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.421 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd: ceilometer.compute.pollsters.NoVolumeException
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.421 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.421 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.421 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10c5906a-f739-43f7-885f-f1447af8a3f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-vda', 'timestamp': '2026-01-22T00:20:23.421546', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '24a7d6a0-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': 'a324b2b50e36a60160937d7336d841ea3eef986a2a177d092656602a1a9546e9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-sda', 'timestamp': '2026-01-22T00:20:23.421546', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '24a7e190-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': 'eacfcfaac978ad8b0c26c8f4fef8f3faefe705fd7fa929309a2f91d3fb57c5e8'}]}, 'timestamp': '2026-01-22 00:20:23.422092', '_unique_id': 'fd8e8274b5be4ad193df58c66d1aed01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.423 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.423 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e67a7df6-d21a-4053-beb1-c42aaab50fd6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-00000096-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-tap96c8ac3c-cf', 'timestamp': '2026-01-22T00:20:23.423657', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'tap96c8ac3c-cf', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:6d:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96c8ac3c-cf'}, 'message_id': '24a829a2-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.135760524, 'message_signature': 'a8c5efc232231a8bcdce386132d3278e396b164a4e2629daea50cbf5d87e95eb'}]}, 'timestamp': '2026-01-22 00:20:23.423953', '_unique_id': 'e41d01fc8a014fd995a489dd391ba8a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.424 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.425 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.425 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.425 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ecef209-6fd3-41b0-a027-019f3ec39420', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-vda', 'timestamp': '2026-01-22T00:20:23.425566', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '24a873a8-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': '01657eae3d60da558049cd0aa51a3168099c3db004f46365c8313e1efbebe5d2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-sda', 'timestamp': '2026-01-22T00:20:23.425566', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '24a87fce-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': '9eb421652a902887b29dae0a63fac65db636ad6b4f38a544f1d3cb966522cf8a'}]}, 'timestamp': '2026-01-22 00:20:23.426175', '_unique_id': 'cdd80baedcb54c2e8b268d2cc07c9fd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.427 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.427 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34eb832a-a2b9-4711-b4ff-a027fcdbedd8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-00000096-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-tap96c8ac3c-cf', 'timestamp': '2026-01-22T00:20:23.427671', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'tap96c8ac3c-cf', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:6d:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96c8ac3c-cf'}, 'message_id': '24a8c678-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.135760524, 'message_signature': '1d263ad11d50812c6c3f5d0ba3aa745bbb2536778ade107337b7ad68307094b2'}]}, 'timestamp': '2026-01-22 00:20:23.427978', '_unique_id': 'b9ff3ce14c154c5e80f91bfc1a13fd4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.428 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.429 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.429 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6be08f74-ecb7-45c9-a807-e35d59b190ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'instance-00000096-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-tap96c8ac3c-cf', 'timestamp': '2026-01-22T00:20:23.429648', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'tap96c8ac3c-cf', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:6d:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap96c8ac3c-cf'}, 'message_id': '24a91560-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.135760524, 'message_signature': 'b35d09db3bf6f0be0a24848958028802f6150dd709a5fe2252a5fed0ae52ebfd'}]}, 'timestamp': '2026-01-22 00:20:23.430024', '_unique_id': '7d191340c8224183b9a1a7c6001bf7d5'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.430 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.431 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.431 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.431 12 DEBUG ceilometer.compute.pollsters [-] b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '117977bf-83ee-4706-9f8e-73ca44b3a78b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-vda', 'timestamp': '2026-01-22T00:20:23.431602', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '24a95f2a-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': '19bc617c609e68b49cdda5fab32826b461858c38e5f1c18cc99be1a3a947289d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_name': None, 'project_id': 
'02bcfc5f1f1044a3856e73a5938ff011', 'project_name': None, 'resource_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-sda', 'timestamp': '2026-01-22T00:20:23.431602', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333', 'name': 'instance-00000096', 'instance_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'instance_type': 'm1.nano', 'host': 'abc6f880999cf71abfae9428442b523cbe2a63b2375db9a4c764decb', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '24a96a56-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5726.146723345, 'message_signature': '6291839f16864c979121d18e1b5e9fb554a62c9731d0cdb41a82045e9f93a73d'}]}, 'timestamp': '2026-01-22 00:20:23.432148', '_unique_id': '95e309bd848c4e2c9035a8c882a01dfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.432 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.433 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.433 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:20:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:20:23.433 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333>]
Jan 22 00:20:23 compute-0 podman[236006]: 2026-01-22 00:20:23.686632224 +0000 UTC m=+0.052449339 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:20:24 compute-0 nova_compute[182935]: 2026-01-22 00:20:24.197 182939 DEBUG nova.network.neutron [req-9c6ae6d0-d766-4bc7-bb1c-06d2d9222776 req-ad586a88-0480-423e-ae3b-77544f56f9ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Updated VIF entry in instance network info cache for port 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:20:24 compute-0 nova_compute[182935]: 2026-01-22 00:20:24.198 182939 DEBUG nova.network.neutron [req-9c6ae6d0-d766-4bc7-bb1c-06d2d9222776 req-ad586a88-0480-423e-ae3b-77544f56f9ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Updating instance_info_cache with network_info: [{"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:24 compute-0 nova_compute[182935]: 2026-01-22 00:20:24.244 182939 DEBUG oslo_concurrency.lockutils [req-9c6ae6d0-d766-4bc7-bb1c-06d2d9222776 req-ad586a88-0480-423e-ae3b-77544f56f9ff 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:20:26 compute-0 ovn_controller[95047]: 2026-01-22T00:20:26Z|00561|binding|INFO|Releasing lport 737a2d1f-ad8c-46d7-ba36-880bbc6b5728 from this chassis (sb_readonly=0)
Jan 22 00:20:27 compute-0 nova_compute[182935]: 2026-01-22 00:20:27.071 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:27 compute-0 ovn_controller[95047]: 2026-01-22T00:20:27Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a0:6d:17 10.100.0.10
Jan 22 00:20:27 compute-0 ovn_controller[95047]: 2026-01-22T00:20:27Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a0:6d:17 10.100.0.10
Jan 22 00:20:27 compute-0 nova_compute[182935]: 2026-01-22 00:20:27.863 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:28 compute-0 nova_compute[182935]: 2026-01-22 00:20:28.383 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:30 compute-0 podman[236043]: 2026-01-22 00:20:30.68422502 +0000 UTC m=+0.054703732 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, distribution-scope=public, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:20:30 compute-0 podman[236044]: 2026-01-22 00:20:30.71109988 +0000 UTC m=+0.074316820 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:20:32 compute-0 nova_compute[182935]: 2026-01-22 00:20:32.865 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:33 compute-0 nova_compute[182935]: 2026-01-22 00:20:33.385 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:37 compute-0 nova_compute[182935]: 2026-01-22 00:20:37.867 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:38 compute-0 nova_compute[182935]: 2026-01-22 00:20:38.387 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:40 compute-0 nova_compute[182935]: 2026-01-22 00:20:40.233 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:42 compute-0 nova_compute[182935]: 2026-01-22 00:20:42.868 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:43 compute-0 nova_compute[182935]: 2026-01-22 00:20:43.390 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:45 compute-0 podman[236085]: 2026-01-22 00:20:45.681934177 +0000 UTC m=+0.052905650 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:20:45 compute-0 podman[236084]: 2026-01-22 00:20:45.751873382 +0000 UTC m=+0.121853381 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:20:45 compute-0 nova_compute[182935]: 2026-01-22 00:20:45.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:45 compute-0 nova_compute[182935]: 2026-01-22 00:20:45.882 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:45 compute-0 nova_compute[182935]: 2026-01-22 00:20:45.883 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:45 compute-0 nova_compute[182935]: 2026-01-22 00:20:45.883 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:45 compute-0 nova_compute[182935]: 2026-01-22 00:20:45.883 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:20:45 compute-0 nova_compute[182935]: 2026-01-22 00:20:45.982 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.052 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.053 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.115 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.270 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.272 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5520MB free_disk=73.09762954711914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.272 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.273 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.381 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.382 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.382 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.435 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.482 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.506 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.506 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:46.766 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:20:46 compute-0 nova_compute[182935]: 2026-01-22 00:20:46.766 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:46.767 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:20:47 compute-0 nova_compute[182935]: 2026-01-22 00:20:47.508 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:47 compute-0 nova_compute[182935]: 2026-01-22 00:20:47.508 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:20:47 compute-0 nova_compute[182935]: 2026-01-22 00:20:47.534 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:20:47 compute-0 nova_compute[182935]: 2026-01-22 00:20:47.535 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:47 compute-0 nova_compute[182935]: 2026-01-22 00:20:47.535 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:20:47 compute-0 nova_compute[182935]: 2026-01-22 00:20:47.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:47 compute-0 nova_compute[182935]: 2026-01-22 00:20:47.870 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:48 compute-0 nova_compute[182935]: 2026-01-22 00:20:48.392 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:48 compute-0 podman[236138]: 2026-01-22 00:20:48.673692024 +0000 UTC m=+0.049994141 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:20:49 compute-0 nova_compute[182935]: 2026-01-22 00:20:49.442 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:49.769 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:51 compute-0 nova_compute[182935]: 2026-01-22 00:20:51.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:52 compute-0 nova_compute[182935]: 2026-01-22 00:20:52.872 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:52 compute-0 nova_compute[182935]: 2026-01-22 00:20:52.977 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:52 compute-0 nova_compute[182935]: 2026-01-22 00:20:52.977 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.123 182939 DEBUG nova.compute.manager [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.238 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.239 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.249 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.249 182939 INFO nova.compute.claims [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.394 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.448 182939 DEBUG nova.compute.provider_tree [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.481 182939 DEBUG nova.scheduler.client.report [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.524 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.525 182939 DEBUG nova.compute.manager [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.637 182939 DEBUG nova.compute.manager [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.637 182939 DEBUG nova.network.neutron [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.657 182939 INFO nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.671 182939 DEBUG nova.compute.manager [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.814 182939 DEBUG nova.compute.manager [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.815 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.816 182939 INFO nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Creating image(s)
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.816 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.817 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.817 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.836 182939 DEBUG nova.policy [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.841 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.903 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.904 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.905 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.916 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.992 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:53 compute-0 nova_compute[182935]: 2026-01-22 00:20:53.993 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.029 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.030 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.031 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.087 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.088 182939 DEBUG nova.virt.disk.api [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.089 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.144 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.145 182939 DEBUG nova.virt.disk.api [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.145 182939 DEBUG nova.objects.instance [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.164 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.165 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Ensure instance console log exists: /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.165 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.166 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:54 compute-0 nova_compute[182935]: 2026-01-22 00:20:54.166 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:54 compute-0 podman[236178]: 2026-01-22 00:20:54.671664473 +0000 UTC m=+0.047877241 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 22 00:20:55 compute-0 nova_compute[182935]: 2026-01-22 00:20:55.435 182939 DEBUG nova.network.neutron [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Successfully created port: 2de3f942-6922-4800-9d3a-d06aa1263f44 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:20:55 compute-0 nova_compute[182935]: 2026-01-22 00:20:55.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:56 compute-0 nova_compute[182935]: 2026-01-22 00:20:56.644 182939 DEBUG nova.network.neutron [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Successfully updated port: 2de3f942-6922-4800-9d3a-d06aa1263f44 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:20:56 compute-0 nova_compute[182935]: 2026-01-22 00:20:56.662 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:20:56 compute-0 nova_compute[182935]: 2026-01-22 00:20:56.662 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:20:56 compute-0 nova_compute[182935]: 2026-01-22 00:20:56.662 182939 DEBUG nova.network.neutron [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:20:56 compute-0 nova_compute[182935]: 2026-01-22 00:20:56.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:20:56 compute-0 nova_compute[182935]: 2026-01-22 00:20:56.820 182939 DEBUG nova.compute.manager [req-60121c2b-86ef-4b39-a562-e242ac68fc11 req-bbf9b0ce-0893-4a85-afa3-3ee7c8a4c1bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-changed-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:56 compute-0 nova_compute[182935]: 2026-01-22 00:20:56.820 182939 DEBUG nova.compute.manager [req-60121c2b-86ef-4b39-a562-e242ac68fc11 req-bbf9b0ce-0893-4a85-afa3-3ee7c8a4c1bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Refreshing instance network info cache due to event network-changed-2de3f942-6922-4800-9d3a-d06aa1263f44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:20:56 compute-0 nova_compute[182935]: 2026-01-22 00:20:56.820 182939 DEBUG oslo_concurrency.lockutils [req-60121c2b-86ef-4b39-a562-e242ac68fc11 req-bbf9b0ce-0893-4a85-afa3-3ee7c8a4c1bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:20:56 compute-0 nova_compute[182935]: 2026-01-22 00:20:56.913 182939 DEBUG nova.network.neutron [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:20:57 compute-0 nova_compute[182935]: 2026-01-22 00:20:57.921 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.396 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.478 182939 DEBUG nova.network.neutron [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updating instance_info_cache with network_info: [{"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.533 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.534 182939 DEBUG nova.compute.manager [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Instance network_info: |[{"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.534 182939 DEBUG oslo_concurrency.lockutils [req-60121c2b-86ef-4b39-a562-e242ac68fc11 req-bbf9b0ce-0893-4a85-afa3-3ee7c8a4c1bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.534 182939 DEBUG nova.network.neutron [req-60121c2b-86ef-4b39-a562-e242ac68fc11 req-bbf9b0ce-0893-4a85-afa3-3ee7c8a4c1bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Refreshing network info cache for port 2de3f942-6922-4800-9d3a-d06aa1263f44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.537 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Start _get_guest_xml network_info=[{"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.541 182939 WARNING nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.546 182939 DEBUG nova.virt.libvirt.host [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.546 182939 DEBUG nova.virt.libvirt.host [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.551 182939 DEBUG nova.virt.libvirt.host [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.551 182939 DEBUG nova.virt.libvirt.host [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.552 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.552 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.553 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.553 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.553 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.553 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.554 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.554 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.554 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.554 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.554 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.555 182939 DEBUG nova.virt.hardware [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.561 182939 DEBUG nova.virt.libvirt.vif [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:20:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-7270034',display_name='tempest-TestNetworkAdvancedServerOps-server-7270034',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-7270034',id=152,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKCbkoaVuA62O+ECqX+7Ohn7GbIbEVQCxvPvCXrKqpOjrukjt8m0tS2UNeW9SghkNu53IZT4aL6S7PqVShWjvxQooRpiJSuxHQi4r5UNidyhtoE0twes7RsZVTczYedVHA==',key_name='tempest-TestNetworkAdvancedServerOps-2075791651',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1trsnn3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:20:53Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.561 182939 DEBUG nova.network.os_vif_util [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.562 182939 DEBUG nova.network.os_vif_util [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.563 182939 DEBUG nova.objects.instance [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.583 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:20:58 compute-0 nova_compute[182935]:   <uuid>dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2</uuid>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   <name>instance-00000098</name>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-7270034</nova:name>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:20:58</nova:creationTime>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:20:58 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:20:58 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:20:58 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:20:58 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:20:58 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:20:58 compute-0 nova_compute[182935]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:20:58 compute-0 nova_compute[182935]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:20:58 compute-0 nova_compute[182935]:         <nova:port uuid="2de3f942-6922-4800-9d3a-d06aa1263f44">
Jan 22 00:20:58 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <system>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <entry name="serial">dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2</entry>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <entry name="uuid">dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2</entry>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     </system>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   <os>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   </os>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   <features>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   </features>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.config"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:13:64:5d"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <target dev="tap2de3f942-69"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/console.log" append="off"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <video>
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     </video>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:20:58 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:20:58 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:20:58 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:20:58 compute-0 nova_compute[182935]: </domain>
Jan 22 00:20:58 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.585 182939 DEBUG nova.compute.manager [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Preparing to wait for external event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.585 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.586 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.586 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.587 182939 DEBUG nova.virt.libvirt.vif [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:20:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-7270034',display_name='tempest-TestNetworkAdvancedServerOps-server-7270034',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-7270034',id=152,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKCbkoaVuA62O+ECqX+7Ohn7GbIbEVQCxvPvCXrKqpOjrukjt8m0tS2UNeW9SghkNu53IZT4aL6S7PqVShWjvxQooRpiJSuxHQi4r5UNidyhtoE0twes7RsZVTczYedVHA==',key_name='tempest-TestNetworkAdvancedServerOps-2075791651',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1trsnn3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:20:53Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.587 182939 DEBUG nova.network.os_vif_util [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.588 182939 DEBUG nova.network.os_vif_util [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.588 182939 DEBUG os_vif [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.589 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.589 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.590 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.595 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.595 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2de3f942-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.596 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2de3f942-69, col_values=(('external_ids', {'iface-id': '2de3f942-6922-4800-9d3a-d06aa1263f44', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:64:5d', 'vm-uuid': 'dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.598 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:58 compute-0 NetworkManager[55139]: <info>  [1769041258.5993] manager: (tap2de3f942-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.602 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.605 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.606 182939 INFO os_vif [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69')
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.671 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.672 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.672 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No VIF found with MAC fa:16:3e:13:64:5d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:20:58 compute-0 nova_compute[182935]: 2026-01-22 00:20:58.673 182939 INFO nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Using config drive
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.108 182939 INFO nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Creating config drive at /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.config
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.112 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphab7kzju execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.239 182939 DEBUG oslo_concurrency.processutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphab7kzju" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:20:59 compute-0 kernel: tap2de3f942-69: entered promiscuous mode
Jan 22 00:20:59 compute-0 ovn_controller[95047]: 2026-01-22T00:20:59Z|00562|binding|INFO|Claiming lport 2de3f942-6922-4800-9d3a-d06aa1263f44 for this chassis.
Jan 22 00:20:59 compute-0 ovn_controller[95047]: 2026-01-22T00:20:59Z|00563|binding|INFO|2de3f942-6922-4800-9d3a-d06aa1263f44: Claiming fa:16:3e:13:64:5d 10.100.0.4
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.302 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:59 compute-0 NetworkManager[55139]: <info>  [1769041259.3028] manager: (tap2de3f942-69): new Tun device (/org/freedesktop/NetworkManager/Devices/273)
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.309 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:64:5d 10.100.0.4'], port_security=['fa:16:3e:13:64:5d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '61ee06fa-a63f-42b6-8f38-7bda03f7a2d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5715df4-0e68-4951-9b87-9601d69c7054, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=2de3f942-6922-4800-9d3a-d06aa1263f44) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.311 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 2de3f942-6922-4800-9d3a-d06aa1263f44 in datapath 7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d bound to our chassis
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.312 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d
Jan 22 00:20:59 compute-0 ovn_controller[95047]: 2026-01-22T00:20:59Z|00564|binding|INFO|Setting lport 2de3f942-6922-4800-9d3a-d06aa1263f44 ovn-installed in OVS
Jan 22 00:20:59 compute-0 ovn_controller[95047]: 2026-01-22T00:20:59Z|00565|binding|INFO|Setting lport 2de3f942-6922-4800-9d3a-d06aa1263f44 up in Southbound
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.319 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.326 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf26cf6-3680-4897-9727-358840821e63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.327 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7a7a8118-c1 in ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.330 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7a7a8118-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.330 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3193379b-0ba0-4f73-bd0a-7b2fcb489666]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.331 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[363a1ca7-7d33-48d3-9fa6-3fa93a8da8fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 systemd-udevd[236216]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:20:59 compute-0 systemd-machined[154182]: New machine qemu-75-instance-00000098.
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.343 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[0167fb11-11e8-4da3-abe6-ff71e54a89a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 NetworkManager[55139]: <info>  [1769041259.3469] device (tap2de3f942-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:20:59 compute-0 NetworkManager[55139]: <info>  [1769041259.3476] device (tap2de3f942-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.360 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e47785a6-d600-4299-831d-a14b644718c6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 systemd[1]: Started Virtual Machine qemu-75-instance-00000098.
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.386 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[91c52900-e97c-4efd-80ed-e8524f7bfd21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 ovn_controller[95047]: 2026-01-22T00:20:59Z|00566|binding|INFO|Releasing lport 737a2d1f-ad8c-46d7-ba36-880bbc6b5728 from this chassis (sb_readonly=0)
Jan 22 00:20:59 compute-0 systemd-udevd[236220]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.393 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cece03bf-01cf-43d5-8d8b-37817f28803c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 NetworkManager[55139]: <info>  [1769041259.3943] manager: (tap7a7a8118-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/274)
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.430 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9f05f8e1-14bc-4ed8-b571-9c73bdb8e737]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.433 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b149ef1f-b2bc-430f-ad79-6273e4b1fab6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.443 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:59 compute-0 NetworkManager[55139]: <info>  [1769041259.4581] device (tap7a7a8118-c0): carrier: link connected
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.464 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7e25e7f8-379f-463a-9d2c-796865112fb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.480 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cc50b4ab-d8f5-4817-8366-3342592b3144]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a7a8118-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:58:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576219, 'reachable_time': 16067, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236249, 'error': None, 'target': 'ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.500 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[67f60b99-5aa1-4727-8ed9-22d05c2d4426]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1c:58f4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 576219, 'tstamp': 576219}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236250, 'error': None, 'target': 'ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.517 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[965ae0de-54fe-4a44-ba52-4f40a305f2f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a7a8118-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:58:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576219, 'reachable_time': 16067, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236251, 'error': None, 'target': 'ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.559 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fc2fdfb2-cef0-48f8-8d48-2e489268b7ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.630 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[55ab3deb-53ac-466e-bba8-da062d4b945e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.632 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a7a8118-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.632 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.633 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a7a8118-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:59 compute-0 NetworkManager[55139]: <info>  [1769041259.6355] manager: (tap7a7a8118-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Jan 22 00:20:59 compute-0 kernel: tap7a7a8118-c0: entered promiscuous mode
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.637 182939 DEBUG nova.compute.manager [req-94478e60-8a42-4b2b-83aa-a8e5cf4a5eea req-fe7027f5-82e0-43b4-bd16-70a97dfc0707 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.637 182939 DEBUG oslo_concurrency.lockutils [req-94478e60-8a42-4b2b-83aa-a8e5cf4a5eea req-fe7027f5-82e0-43b4-bd16-70a97dfc0707 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.638 182939 DEBUG oslo_concurrency.lockutils [req-94478e60-8a42-4b2b-83aa-a8e5cf4a5eea req-fe7027f5-82e0-43b4-bd16-70a97dfc0707 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.638 182939 DEBUG oslo_concurrency.lockutils [req-94478e60-8a42-4b2b-83aa-a8e5cf4a5eea req-fe7027f5-82e0-43b4-bd16-70a97dfc0707 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.638 182939 DEBUG nova.compute.manager [req-94478e60-8a42-4b2b-83aa-a8e5cf4a5eea req-fe7027f5-82e0-43b4-bd16-70a97dfc0707 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Processing event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.639 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.640 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7a7a8118-c0, col_values=(('external_ids', {'iface-id': 'd1e02cd9-8126-49f7-b1af-0e9e0399bf45'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:20:59 compute-0 ovn_controller[95047]: 2026-01-22T00:20:59Z|00567|binding|INFO|Releasing lport d1e02cd9-8126-49f7-b1af-0e9e0399bf45 from this chassis (sb_readonly=0)
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.655 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.657 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.658 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[40656257-990a-4c97-a42c-979ea9de8eb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.659 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d.pid.haproxy
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:20:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:20:59.659 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'env', 'PROCESS_TAG=haproxy-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.685 182939 DEBUG nova.compute.manager [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.685 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041259.684381, dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.686 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] VM Started (Lifecycle Event)
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.688 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.691 182939 INFO nova.virt.libvirt.driver [-] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Instance spawned successfully.
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.692 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.708 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.713 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.716 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.717 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.717 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.717 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.718 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.718 182939 DEBUG nova.virt.libvirt.driver [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.756 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.757 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041259.6846654, dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.757 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] VM Paused (Lifecycle Event)
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.785 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.788 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041259.6877193, dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.789 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] VM Resumed (Lifecycle Event)
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.816 182939 INFO nova.compute.manager [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Took 6.00 seconds to spawn the instance on the hypervisor.
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.817 182939 DEBUG nova.compute.manager [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.828 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.830 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.859 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.904 182939 INFO nova.compute.manager [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Took 6.71 seconds to build instance.
Jan 22 00:20:59 compute-0 nova_compute[182935]: 2026-01-22 00:20:59.926 182939 DEBUG oslo_concurrency.lockutils [None req-0cd583fb-cfdc-49fd-a38e-c913e1ad2483 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.948s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:00 compute-0 podman[236287]: 2026-01-22 00:21:00.023417262 +0000 UTC m=+0.055221455 container create 8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:21:00 compute-0 systemd[1]: Started libpod-conmon-8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9.scope.
Jan 22 00:21:00 compute-0 podman[236287]: 2026-01-22 00:20:59.994872873 +0000 UTC m=+0.026677086 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:21:00 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:21:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ddd75296d95dee910c9cf63a5185698460f21896a3820e83071bc5cc3e9cb3dc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:21:00 compute-0 podman[236287]: 2026-01-22 00:21:00.111114019 +0000 UTC m=+0.142918232 container init 8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 00:21:00 compute-0 podman[236287]: 2026-01-22 00:21:00.117078081 +0000 UTC m=+0.148882274 container start 8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 00:21:00 compute-0 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[236302]: [NOTICE]   (236306) : New worker (236308) forked
Jan 22 00:21:00 compute-0 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[236302]: [NOTICE]   (236306) : Loading success.
Jan 22 00:21:01 compute-0 nova_compute[182935]: 2026-01-22 00:21:01.278 182939 DEBUG nova.network.neutron [req-60121c2b-86ef-4b39-a562-e242ac68fc11 req-bbf9b0ce-0893-4a85-afa3-3ee7c8a4c1bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updated VIF entry in instance network info cache for port 2de3f942-6922-4800-9d3a-d06aa1263f44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:21:01 compute-0 nova_compute[182935]: 2026-01-22 00:21:01.279 182939 DEBUG nova.network.neutron [req-60121c2b-86ef-4b39-a562-e242ac68fc11 req-bbf9b0ce-0893-4a85-afa3-3ee7c8a4c1bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updating instance_info_cache with network_info: [{"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:01 compute-0 nova_compute[182935]: 2026-01-22 00:21:01.297 182939 DEBUG oslo_concurrency.lockutils [req-60121c2b-86ef-4b39-a562-e242ac68fc11 req-bbf9b0ce-0893-4a85-afa3-3ee7c8a4c1bb 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:21:01 compute-0 podman[236318]: 2026-01-22 00:21:01.682277399 +0000 UTC m=+0.056819314 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:21:01 compute-0 podman[236317]: 2026-01-22 00:21:01.685603737 +0000 UTC m=+0.060845108 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container)
Jan 22 00:21:01 compute-0 nova_compute[182935]: 2026-01-22 00:21:01.733 182939 DEBUG nova.compute.manager [req-07762f34-dc4d-4ef3-b287-4e334ae05286 req-58dc993c-0d3a-48a6-aee5-10c4396a81af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:01 compute-0 nova_compute[182935]: 2026-01-22 00:21:01.733 182939 DEBUG oslo_concurrency.lockutils [req-07762f34-dc4d-4ef3-b287-4e334ae05286 req-58dc993c-0d3a-48a6-aee5-10c4396a81af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:01 compute-0 nova_compute[182935]: 2026-01-22 00:21:01.733 182939 DEBUG oslo_concurrency.lockutils [req-07762f34-dc4d-4ef3-b287-4e334ae05286 req-58dc993c-0d3a-48a6-aee5-10c4396a81af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:01 compute-0 nova_compute[182935]: 2026-01-22 00:21:01.734 182939 DEBUG oslo_concurrency.lockutils [req-07762f34-dc4d-4ef3-b287-4e334ae05286 req-58dc993c-0d3a-48a6-aee5-10c4396a81af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:01 compute-0 nova_compute[182935]: 2026-01-22 00:21:01.734 182939 DEBUG nova.compute.manager [req-07762f34-dc4d-4ef3-b287-4e334ae05286 req-58dc993c-0d3a-48a6-aee5-10c4396a81af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] No waiting events found dispatching network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:01 compute-0 nova_compute[182935]: 2026-01-22 00:21:01.734 182939 WARNING nova.compute.manager [req-07762f34-dc4d-4ef3-b287-4e334ae05286 req-58dc993c-0d3a-48a6-aee5-10c4396a81af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received unexpected event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 for instance with vm_state active and task_state None.
Jan 22 00:21:02 compute-0 nova_compute[182935]: 2026-01-22 00:21:02.923 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:03.219 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:03.219 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:03.220 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:03 compute-0 nova_compute[182935]: 2026-01-22 00:21:03.598 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:03 compute-0 nova_compute[182935]: 2026-01-22 00:21:03.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:05 compute-0 sshd-session[236356]: Invalid user redis from 188.166.69.60 port 43716
Jan 22 00:21:05 compute-0 sshd-session[236356]: Connection closed by invalid user redis 188.166.69.60 port 43716 [preauth]
Jan 22 00:21:05 compute-0 nova_compute[182935]: 2026-01-22 00:21:05.294 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:07 compute-0 nova_compute[182935]: 2026-01-22 00:21:07.926 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:08 compute-0 nova_compute[182935]: 2026-01-22 00:21:08.600 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:09 compute-0 nova_compute[182935]: 2026-01-22 00:21:09.453 182939 DEBUG nova.compute.manager [req-ef25d27d-add1-4b41-9767-72d9b4380ba8 req-0b399ffd-a49c-48ae-b921-750686bb5723 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-changed-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:09 compute-0 nova_compute[182935]: 2026-01-22 00:21:09.454 182939 DEBUG nova.compute.manager [req-ef25d27d-add1-4b41-9767-72d9b4380ba8 req-0b399ffd-a49c-48ae-b921-750686bb5723 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Refreshing instance network info cache due to event network-changed-2de3f942-6922-4800-9d3a-d06aa1263f44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:21:09 compute-0 nova_compute[182935]: 2026-01-22 00:21:09.455 182939 DEBUG oslo_concurrency.lockutils [req-ef25d27d-add1-4b41-9767-72d9b4380ba8 req-0b399ffd-a49c-48ae-b921-750686bb5723 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:21:09 compute-0 nova_compute[182935]: 2026-01-22 00:21:09.455 182939 DEBUG oslo_concurrency.lockutils [req-ef25d27d-add1-4b41-9767-72d9b4380ba8 req-0b399ffd-a49c-48ae-b921-750686bb5723 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:21:09 compute-0 nova_compute[182935]: 2026-01-22 00:21:09.455 182939 DEBUG nova.network.neutron [req-ef25d27d-add1-4b41-9767-72d9b4380ba8 req-0b399ffd-a49c-48ae-b921-750686bb5723 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Refreshing network info cache for port 2de3f942-6922-4800-9d3a-d06aa1263f44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:21:11 compute-0 nova_compute[182935]: 2026-01-22 00:21:11.153 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:12 compute-0 nova_compute[182935]: 2026-01-22 00:21:12.708 182939 DEBUG nova.network.neutron [req-ef25d27d-add1-4b41-9767-72d9b4380ba8 req-0b399ffd-a49c-48ae-b921-750686bb5723 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updated VIF entry in instance network info cache for port 2de3f942-6922-4800-9d3a-d06aa1263f44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:21:12 compute-0 nova_compute[182935]: 2026-01-22 00:21:12.709 182939 DEBUG nova.network.neutron [req-ef25d27d-add1-4b41-9767-72d9b4380ba8 req-0b399ffd-a49c-48ae-b921-750686bb5723 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updating instance_info_cache with network_info: [{"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:12 compute-0 nova_compute[182935]: 2026-01-22 00:21:12.739 182939 DEBUG oslo_concurrency.lockutils [req-ef25d27d-add1-4b41-9767-72d9b4380ba8 req-0b399ffd-a49c-48ae-b921-750686bb5723 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:21:12 compute-0 nova_compute[182935]: 2026-01-22 00:21:12.927 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:13 compute-0 nova_compute[182935]: 2026-01-22 00:21:13.604 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:14 compute-0 ovn_controller[95047]: 2026-01-22T00:21:14Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:13:64:5d 10.100.0.4
Jan 22 00:21:14 compute-0 ovn_controller[95047]: 2026-01-22T00:21:14Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:13:64:5d 10.100.0.4
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.439 182939 DEBUG oslo_concurrency.lockutils [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.439 182939 DEBUG oslo_concurrency.lockutils [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.440 182939 DEBUG oslo_concurrency.lockutils [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.440 182939 DEBUG oslo_concurrency.lockutils [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.440 182939 DEBUG oslo_concurrency.lockutils [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.453 182939 INFO nova.compute.manager [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Terminating instance
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.468 182939 DEBUG nova.compute.manager [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:21:16 compute-0 kernel: tap96c8ac3c-cf (unregistering): left promiscuous mode
Jan 22 00:21:16 compute-0 NetworkManager[55139]: <info>  [1769041276.4915] device (tap96c8ac3c-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.504 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:16 compute-0 ovn_controller[95047]: 2026-01-22T00:21:16Z|00568|binding|INFO|Releasing lport 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 from this chassis (sb_readonly=0)
Jan 22 00:21:16 compute-0 ovn_controller[95047]: 2026-01-22T00:21:16Z|00569|binding|INFO|Setting lport 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 down in Southbound
Jan 22 00:21:16 compute-0 ovn_controller[95047]: 2026-01-22T00:21:16Z|00570|binding|INFO|Removing iface tap96c8ac3c-cf ovn-installed in OVS
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.506 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.513 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:6d:17 10.100.0.10'], port_security=['fa:16:3e:a0:6d:17 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65641ee3-5688-4f52-8e2b-2aae97505b84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9d5746ab-567f-4771-baec-483e6edef99f ac71d239-4d62-4e17-b02e-055a8db336af', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3d414ff-1f29-4cd2-96c4-c90cd0d603fc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=96c8ac3c-cf3f-4189-9267-c0a4096a0fb1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.517 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 in datapath 65641ee3-5688-4f52-8e2b-2aae97505b84 unbound from our chassis
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.517 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.518 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 65641ee3-5688-4f52-8e2b-2aae97505b84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.520 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5fdc0439-aeac-40fd-a97b-6a2b66dd6fdb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.520 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84 namespace which is not needed anymore
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.524 182939 DEBUG nova.compute.manager [req-16b18583-14d4-4184-b15f-3360e7912b0b req-9376377d-b821-4a69-967d-626b6f1a7009 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Received event network-changed-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.525 182939 DEBUG nova.compute.manager [req-16b18583-14d4-4184-b15f-3360e7912b0b req-9376377d-b821-4a69-967d-626b6f1a7009 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Refreshing instance network info cache due to event network-changed-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.525 182939 DEBUG oslo_concurrency.lockutils [req-16b18583-14d4-4184-b15f-3360e7912b0b req-9376377d-b821-4a69-967d-626b6f1a7009 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.525 182939 DEBUG oslo_concurrency.lockutils [req-16b18583-14d4-4184-b15f-3360e7912b0b req-9376377d-b821-4a69-967d-626b6f1a7009 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.525 182939 DEBUG nova.network.neutron [req-16b18583-14d4-4184-b15f-3360e7912b0b req-9376377d-b821-4a69-967d-626b6f1a7009 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Refreshing network info cache for port 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:21:16 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000096.scope: Deactivated successfully.
Jan 22 00:21:16 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000096.scope: Consumed 14.885s CPU time.
Jan 22 00:21:16 compute-0 systemd-machined[154182]: Machine qemu-74-instance-00000096 terminated.
Jan 22 00:21:16 compute-0 podman[236379]: 2026-01-22 00:21:16.594936304 +0000 UTC m=+0.070017178 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:21:16 compute-0 podman[236376]: 2026-01-22 00:21:16.668110004 +0000 UTC m=+0.143296610 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:21:16 compute-0 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[235918]: [NOTICE]   (235959) : haproxy version is 2.8.14-c23fe91
Jan 22 00:21:16 compute-0 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[235918]: [NOTICE]   (235959) : path to executable is /usr/sbin/haproxy
Jan 22 00:21:16 compute-0 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[235918]: [WARNING]  (235959) : Exiting Master process...
Jan 22 00:21:16 compute-0 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[235918]: [ALERT]    (235959) : Current worker (235965) exited with code 143 (Terminated)
Jan 22 00:21:16 compute-0 neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84[235918]: [WARNING]  (235959) : All workers exited. Exiting... (0)
Jan 22 00:21:16 compute-0 systemd[1]: libpod-51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60.scope: Deactivated successfully.
Jan 22 00:21:16 compute-0 podman[236443]: 2026-01-22 00:21:16.717833579 +0000 UTC m=+0.055310009 container died 51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 00:21:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60-userdata-shm.mount: Deactivated successfully.
Jan 22 00:21:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-79f14246ff4056350e67e9ac4fe9fb224554c6e15ccf66975861a2c0f404c5b0-merged.mount: Deactivated successfully.
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.752 182939 INFO nova.virt.libvirt.driver [-] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Instance destroyed successfully.
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.752 182939 DEBUG nova.objects.instance [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'resources' on Instance uuid b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:21:16 compute-0 podman[236443]: 2026-01-22 00:21:16.764027218 +0000 UTC m=+0.101503648 container cleanup 51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:21:16 compute-0 systemd[1]: libpod-conmon-51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60.scope: Deactivated successfully.
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.779 182939 DEBUG nova.virt.libvirt.vif [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:20:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-1321304333',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=150,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOgFr0loz0o97S1yJic425BuuGnqIIzzaQU+1FOWYN8VLWjMOBgkt02kLpdfipR3QnvdUvT3mVD/diPnm35tClCs6BoaTbQN3VWq8tyqhLXUA2JeTkyyUA3yLrgO9t4ag==',key_name='tempest-TestSecurityGroupsBasicOps-1152614963',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:20:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-jryb8knd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:20:15Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.780 182939 DEBUG nova.network.os_vif_util [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.781 182939 DEBUG nova.network.os_vif_util [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:6d:17,bridge_name='br-int',has_traffic_filtering=True,id=96c8ac3c-cf3f-4189-9267-c0a4096a0fb1,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96c8ac3c-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.781 182939 DEBUG os_vif [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:6d:17,bridge_name='br-int',has_traffic_filtering=True,id=96c8ac3c-cf3f-4189-9267-c0a4096a0fb1,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96c8ac3c-cf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.784 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.785 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96c8ac3c-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.791 182939 DEBUG nova.compute.manager [req-0215805f-1012-47ba-a045-d14ffe134d2b req-3f56f3f3-0cce-494a-80aa-cc46a7c723f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Received event network-vif-unplugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.791 182939 DEBUG oslo_concurrency.lockutils [req-0215805f-1012-47ba-a045-d14ffe134d2b req-3f56f3f3-0cce-494a-80aa-cc46a7c723f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.793 182939 DEBUG oslo_concurrency.lockutils [req-0215805f-1012-47ba-a045-d14ffe134d2b req-3f56f3f3-0cce-494a-80aa-cc46a7c723f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.794 182939 DEBUG oslo_concurrency.lockutils [req-0215805f-1012-47ba-a045-d14ffe134d2b req-3f56f3f3-0cce-494a-80aa-cc46a7c723f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.794 182939 DEBUG nova.compute.manager [req-0215805f-1012-47ba-a045-d14ffe134d2b req-3f56f3f3-0cce-494a-80aa-cc46a7c723f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] No waiting events found dispatching network-vif-unplugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.794 182939 DEBUG nova.compute.manager [req-0215805f-1012-47ba-a045-d14ffe134d2b req-3f56f3f3-0cce-494a-80aa-cc46a7c723f6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Received event network-vif-unplugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.795 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.796 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.799 182939 INFO os_vif [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:6d:17,bridge_name='br-int',has_traffic_filtering=True,id=96c8ac3c-cf3f-4189-9267-c0a4096a0fb1,network=Network(65641ee3-5688-4f52-8e2b-2aae97505b84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap96c8ac3c-cf')
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.800 182939 INFO nova.virt.libvirt.driver [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Deleting instance files /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd_del
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.801 182939 INFO nova.virt.libvirt.driver [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Deletion of /var/lib/nova/instances/b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd_del complete
Jan 22 00:21:16 compute-0 podman[236490]: 2026-01-22 00:21:16.837438574 +0000 UTC m=+0.046723012 container remove 51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.844 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4a59e70b-8191-46a3-b6d4-ba07ca17be46]: (4, ('Thu Jan 22 12:21:16 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84 (51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60)\n51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60\nThu Jan 22 12:21:16 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84 (51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60)\n51f3c742e54694c91d51e30bfffa768f4b5d1d07785ea264e85b9af618d4ea60\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.846 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c22da113-6809-4125-a405-35bc60c0a07f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.848 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65641ee3-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.851 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:16 compute-0 kernel: tap65641ee3-50: left promiscuous mode
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.866 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.870 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[049aac6a-71c7-4dfb-b8c7-7540cd0e837a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.884 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[544d950c-8306-4ec1-ab1b-617c3d7e5c62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.886 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1d85fbb1-dc27-44c6-bc3a-88ce054b6ef7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.905 182939 INFO nova.compute.manager [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Took 0.44 seconds to destroy the instance on the hypervisor.
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.905 182939 DEBUG oslo.service.loopingcall [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.906 182939 DEBUG nova.compute.manager [-] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:21:16 compute-0 nova_compute[182935]: 2026-01-22 00:21:16.906 182939 DEBUG nova.network.neutron [-] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.905 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[43ceb8bb-e1a8-4028-b65e-ce5a12c6200a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571739, 'reachable_time': 26372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236505, 'error': None, 'target': 'ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.908 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-65641ee3-5688-4f52-8e2b-2aae97505b84 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:21:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:16.908 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[6ce25e7e-586d-41ff-b092-42bcc85cb527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:16 compute-0 systemd[1]: run-netns-ovnmeta\x2d65641ee3\x2d5688\x2d4f52\x2d8e2b\x2d2aae97505b84.mount: Deactivated successfully.
Jan 22 00:21:17 compute-0 nova_compute[182935]: 2026-01-22 00:21:17.830 182939 DEBUG nova.network.neutron [-] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:17 compute-0 nova_compute[182935]: 2026-01-22 00:21:17.854 182939 INFO nova.compute.manager [-] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Took 0.95 seconds to deallocate network for instance.
Jan 22 00:21:17 compute-0 nova_compute[182935]: 2026-01-22 00:21:17.928 182939 DEBUG nova.compute.manager [req-9e5de81f-d8c3-42f6-8ced-b8acee4855ab req-556b5330-6236-4dc1-9481-e14c0d1386bc 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Received event network-vif-deleted-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:17 compute-0 nova_compute[182935]: 2026-01-22 00:21:17.929 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:17 compute-0 nova_compute[182935]: 2026-01-22 00:21:17.956 182939 DEBUG oslo_concurrency.lockutils [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:17 compute-0 nova_compute[182935]: 2026-01-22 00:21:17.957 182939 DEBUG oslo_concurrency.lockutils [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.035 182939 DEBUG nova.compute.provider_tree [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.052 182939 DEBUG nova.scheduler.client.report [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.080 182939 DEBUG oslo_concurrency.lockutils [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.118 182939 INFO nova.scheduler.client.report [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Deleted allocations for instance b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.156 182939 DEBUG nova.network.neutron [req-16b18583-14d4-4184-b15f-3360e7912b0b req-9376377d-b821-4a69-967d-626b6f1a7009 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Updated VIF entry in instance network info cache for port 96c8ac3c-cf3f-4189-9267-c0a4096a0fb1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.157 182939 DEBUG nova.network.neutron [req-16b18583-14d4-4184-b15f-3360e7912b0b req-9376377d-b821-4a69-967d-626b6f1a7009 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Updating instance_info_cache with network_info: [{"id": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "address": "fa:16:3e:a0:6d:17", "network": {"id": "65641ee3-5688-4f52-8e2b-2aae97505b84", "bridge": "br-int", "label": "tempest-network-smoke--2051804877", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96c8ac3c-cf", "ovs_interfaceid": "96c8ac3c-cf3f-4189-9267-c0a4096a0fb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.203 182939 DEBUG oslo_concurrency.lockutils [req-16b18583-14d4-4184-b15f-3360e7912b0b req-9376377d-b821-4a69-967d-626b6f1a7009 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.240 182939 DEBUG oslo_concurrency.lockutils [None req-bfca6201-1f40-415b-a12d-dd076ed25781 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.859 182939 DEBUG nova.compute.manager [req-2d761082-dc56-4f31-b3dc-fe254956ddb0 req-60e64ce1-c15e-4f99-a809-4fcb7f146144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Received event network-vif-plugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.860 182939 DEBUG oslo_concurrency.lockutils [req-2d761082-dc56-4f31-b3dc-fe254956ddb0 req-60e64ce1-c15e-4f99-a809-4fcb7f146144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.860 182939 DEBUG oslo_concurrency.lockutils [req-2d761082-dc56-4f31-b3dc-fe254956ddb0 req-60e64ce1-c15e-4f99-a809-4fcb7f146144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.860 182939 DEBUG oslo_concurrency.lockutils [req-2d761082-dc56-4f31-b3dc-fe254956ddb0 req-60e64ce1-c15e-4f99-a809-4fcb7f146144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.860 182939 DEBUG nova.compute.manager [req-2d761082-dc56-4f31-b3dc-fe254956ddb0 req-60e64ce1-c15e-4f99-a809-4fcb7f146144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] No waiting events found dispatching network-vif-plugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:18 compute-0 nova_compute[182935]: 2026-01-22 00:21:18.861 182939 WARNING nova.compute.manager [req-2d761082-dc56-4f31-b3dc-fe254956ddb0 req-60e64ce1-c15e-4f99-a809-4fcb7f146144 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Received unexpected event network-vif-plugged-96c8ac3c-cf3f-4189-9267-c0a4096a0fb1 for instance with vm_state deleted and task_state None.
Jan 22 00:21:19 compute-0 nova_compute[182935]: 2026-01-22 00:21:19.413 182939 INFO nova.compute.manager [None req-1b8a430c-df28-4b6d-b105-77dc10ada2da 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Get console output
Jan 22 00:21:19 compute-0 nova_compute[182935]: 2026-01-22 00:21:19.419 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:21:19 compute-0 podman[236506]: 2026-01-22 00:21:19.675944973 +0000 UTC m=+0.053870923 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:21:21 compute-0 nova_compute[182935]: 2026-01-22 00:21:21.141 182939 INFO nova.compute.manager [None req-bf8db4a4-492c-4f46-b138-82469b9bde5d 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Get console output
Jan 22 00:21:21 compute-0 nova_compute[182935]: 2026-01-22 00:21:21.147 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:21:21 compute-0 nova_compute[182935]: 2026-01-22 00:21:21.790 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:22 compute-0 nova_compute[182935]: 2026-01-22 00:21:22.935 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:23 compute-0 ovn_controller[95047]: 2026-01-22T00:21:23Z|00571|binding|INFO|Releasing lport d1e02cd9-8126-49f7-b1af-0e9e0399bf45 from this chassis (sb_readonly=0)
Jan 22 00:21:23 compute-0 nova_compute[182935]: 2026-01-22 00:21:23.466 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:24 compute-0 nova_compute[182935]: 2026-01-22 00:21:24.890 182939 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Check if temp file /var/lib/nova/instances/tmpcnxlrtdu exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Jan 22 00:21:24 compute-0 nova_compute[182935]: 2026-01-22 00:21:24.895 182939 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:24 compute-0 nova_compute[182935]: 2026-01-22 00:21:24.958 182939 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:24 compute-0 nova_compute[182935]: 2026-01-22 00:21:24.960 182939 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:25 compute-0 nova_compute[182935]: 2026-01-22 00:21:25.022 182939 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:25 compute-0 nova_compute[182935]: 2026-01-22 00:21:25.023 182939 DEBUG nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcnxlrtdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Jan 22 00:21:25 compute-0 podman[236537]: 2026-01-22 00:21:25.672404134 +0000 UTC m=+0.050327319 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true)
Jan 22 00:21:25 compute-0 nova_compute[182935]: 2026-01-22 00:21:25.764 182939 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:25 compute-0 nova_compute[182935]: 2026-01-22 00:21:25.833 182939 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:25 compute-0 nova_compute[182935]: 2026-01-22 00:21:25.835 182939 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:25 compute-0 nova_compute[182935]: 2026-01-22 00:21:25.912 182939 DEBUG oslo_concurrency.processutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:26 compute-0 nova_compute[182935]: 2026-01-22 00:21:26.797 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:27 compute-0 nova_compute[182935]: 2026-01-22 00:21:27.606 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:28 compute-0 nova_compute[182935]: 2026-01-22 00:21:28.004 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:28 compute-0 sshd-session[236562]: Accepted publickey for nova from 192.168.122.101 port 53030 ssh2: ECDSA SHA256:F7R2p0Um/J2TH6GGZ4p2tVmjepU+Vjbmccv9PcuTsYI
Jan 22 00:21:28 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 22 00:21:28 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 22 00:21:28 compute-0 systemd-logind[784]: New session 61 of user nova.
Jan 22 00:21:28 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 22 00:21:28 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 22 00:21:28 compute-0 systemd[236566]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:21:28 compute-0 systemd[236566]: Queued start job for default target Main User Target.
Jan 22 00:21:28 compute-0 systemd[236566]: Created slice User Application Slice.
Jan 22 00:21:28 compute-0 systemd[236566]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:21:28 compute-0 systemd[236566]: Started Daily Cleanup of User's Temporary Directories.
Jan 22 00:21:28 compute-0 systemd[236566]: Reached target Paths.
Jan 22 00:21:28 compute-0 systemd[236566]: Reached target Timers.
Jan 22 00:21:28 compute-0 systemd[236566]: Starting D-Bus User Message Bus Socket...
Jan 22 00:21:28 compute-0 systemd[236566]: Starting Create User's Volatile Files and Directories...
Jan 22 00:21:28 compute-0 systemd[236566]: Listening on D-Bus User Message Bus Socket.
Jan 22 00:21:28 compute-0 systemd[236566]: Reached target Sockets.
Jan 22 00:21:28 compute-0 systemd[236566]: Finished Create User's Volatile Files and Directories.
Jan 22 00:21:28 compute-0 systemd[236566]: Reached target Basic System.
Jan 22 00:21:28 compute-0 systemd[236566]: Reached target Main User Target.
Jan 22 00:21:28 compute-0 systemd[236566]: Startup finished in 116ms.
Jan 22 00:21:28 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 22 00:21:28 compute-0 systemd[1]: Started Session 61 of User nova.
Jan 22 00:21:28 compute-0 sshd-session[236562]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 22 00:21:28 compute-0 sshd-session[236581]: Received disconnect from 192.168.122.101 port 53030:11: disconnected by user
Jan 22 00:21:28 compute-0 sshd-session[236581]: Disconnected from user nova 192.168.122.101 port 53030
Jan 22 00:21:28 compute-0 sshd-session[236562]: pam_unix(sshd:session): session closed for user nova
Jan 22 00:21:28 compute-0 systemd[1]: session-61.scope: Deactivated successfully.
Jan 22 00:21:28 compute-0 systemd-logind[784]: Session 61 logged out. Waiting for processes to exit.
Jan 22 00:21:28 compute-0 systemd-logind[784]: Removed session 61.
Jan 22 00:21:29 compute-0 nova_compute[182935]: 2026-01-22 00:21:29.717 182939 DEBUG nova.compute.manager [req-4e5f0bfd-2ba0-4b8b-a394-05292f5f4098 req-5b4c2b72-36f6-41ef-b7cf-d691a25e91e1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-unplugged-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:29 compute-0 nova_compute[182935]: 2026-01-22 00:21:29.718 182939 DEBUG oslo_concurrency.lockutils [req-4e5f0bfd-2ba0-4b8b-a394-05292f5f4098 req-5b4c2b72-36f6-41ef-b7cf-d691a25e91e1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:29 compute-0 nova_compute[182935]: 2026-01-22 00:21:29.718 182939 DEBUG oslo_concurrency.lockutils [req-4e5f0bfd-2ba0-4b8b-a394-05292f5f4098 req-5b4c2b72-36f6-41ef-b7cf-d691a25e91e1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:29 compute-0 nova_compute[182935]: 2026-01-22 00:21:29.718 182939 DEBUG oslo_concurrency.lockutils [req-4e5f0bfd-2ba0-4b8b-a394-05292f5f4098 req-5b4c2b72-36f6-41ef-b7cf-d691a25e91e1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:29 compute-0 nova_compute[182935]: 2026-01-22 00:21:29.719 182939 DEBUG nova.compute.manager [req-4e5f0bfd-2ba0-4b8b-a394-05292f5f4098 req-5b4c2b72-36f6-41ef-b7cf-d691a25e91e1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] No waiting events found dispatching network-vif-unplugged-2de3f942-6922-4800-9d3a-d06aa1263f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:29 compute-0 nova_compute[182935]: 2026-01-22 00:21:29.719 182939 DEBUG nova.compute.manager [req-4e5f0bfd-2ba0-4b8b-a394-05292f5f4098 req-5b4c2b72-36f6-41ef-b7cf-d691a25e91e1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-unplugged-2de3f942-6922-4800-9d3a-d06aa1263f44 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.294 182939 INFO nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Took 4.38 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.295 182939 DEBUG nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.319 182939 DEBUG nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcnxlrtdu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(4b5e5f76-94be-4eef-9f9c-ff209c6c9f6c),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.342 182939 DEBUG nova.objects.instance [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lazy-loading 'migration_context' on Instance uuid dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.344 182939 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.346 182939 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.346 182939 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.368 182939 DEBUG nova.virt.libvirt.vif [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:20:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-7270034',display_name='tempest-TestNetworkAdvancedServerOps-server-7270034',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-7270034',id=152,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKCbkoaVuA62O+ECqX+7Ohn7GbIbEVQCxvPvCXrKqpOjrukjt8m0tS2UNeW9SghkNu53IZT4aL6S7PqVShWjvxQooRpiJSuxHQi4r5UNidyhtoE0twes7RsZVTczYedVHA==',key_name='tempest-TestNetworkAdvancedServerOps-2075791651',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:20:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1trsnn3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:20:59Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.369 182939 DEBUG nova.network.os_vif_util [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converting VIF {"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.370 182939 DEBUG nova.network.os_vif_util [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.370 182939 DEBUG nova.virt.libvirt.migration [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updating guest XML with vif config: <interface type="ethernet">
Jan 22 00:21:30 compute-0 nova_compute[182935]:   <mac address="fa:16:3e:13:64:5d"/>
Jan 22 00:21:30 compute-0 nova_compute[182935]:   <model type="virtio"/>
Jan 22 00:21:30 compute-0 nova_compute[182935]:   <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:21:30 compute-0 nova_compute[182935]:   <mtu size="1442"/>
Jan 22 00:21:30 compute-0 nova_compute[182935]:   <target dev="tap2de3f942-69"/>
Jan 22 00:21:30 compute-0 nova_compute[182935]: </interface>
Jan 22 00:21:30 compute-0 nova_compute[182935]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.371 182939 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.849 182939 DEBUG nova.virt.libvirt.migration [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.850 182939 INFO nova.virt.libvirt.migration [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Increasing downtime to 50 ms after 0 sec elapsed time
Jan 22 00:21:30 compute-0 nova_compute[182935]: 2026-01-22 00:21:30.939 182939 INFO nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.446 182939 DEBUG nova.virt.libvirt.migration [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.447 182939 DEBUG nova.virt.libvirt.migration [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.751 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041276.7496686, b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.751 182939 INFO nova.compute.manager [-] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] VM Stopped (Lifecycle Event)
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.785 182939 DEBUG nova.compute.manager [None req-28db366d-395c-42f3-af50-643419be46ac - - - - - -] [instance: b1f12cb7-29ef-47b8-8f2b-ab3171cad7cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.798 182939 DEBUG nova.compute.manager [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.799 182939 DEBUG oslo_concurrency.lockutils [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.799 182939 DEBUG oslo_concurrency.lockutils [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.799 182939 DEBUG oslo_concurrency.lockutils [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.800 182939 DEBUG nova.compute.manager [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] No waiting events found dispatching network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.800 182939 WARNING nova.compute.manager [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received unexpected event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 for instance with vm_state active and task_state migrating.
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.800 182939 DEBUG nova.compute.manager [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-changed-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.800 182939 DEBUG nova.compute.manager [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Refreshing instance network info cache due to event network-changed-2de3f942-6922-4800-9d3a-d06aa1263f44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.800 182939 DEBUG oslo_concurrency.lockutils [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.801 182939 DEBUG oslo_concurrency.lockutils [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.801 182939 DEBUG nova.network.neutron [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Refreshing network info cache for port 2de3f942-6922-4800-9d3a-d06aa1263f44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.802 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.910 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041291.9102552, dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.911 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] VM Paused (Lifecycle Event)
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.933 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.937 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.949 182939 DEBUG nova.virt.libvirt.migration [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.949 182939 DEBUG nova.virt.libvirt.migration [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Jan 22 00:21:31 compute-0 nova_compute[182935]: 2026-01-22 00:21:31.955 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] During sync_power_state the instance has a pending task (migrating). Skip.
Jan 22 00:21:32 compute-0 kernel: tap2de3f942-69 (unregistering): left promiscuous mode
Jan 22 00:21:32 compute-0 NetworkManager[55139]: <info>  [1769041292.0681] device (tap2de3f942-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:21:32 compute-0 ovn_controller[95047]: 2026-01-22T00:21:32Z|00572|binding|INFO|Releasing lport 2de3f942-6922-4800-9d3a-d06aa1263f44 from this chassis (sb_readonly=0)
Jan 22 00:21:32 compute-0 ovn_controller[95047]: 2026-01-22T00:21:32Z|00573|binding|INFO|Setting lport 2de3f942-6922-4800-9d3a-d06aa1263f44 down in Southbound
Jan 22 00:21:32 compute-0 nova_compute[182935]: 2026-01-22 00:21:32.076 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:32 compute-0 ovn_controller[95047]: 2026-01-22T00:21:32Z|00574|binding|INFO|Removing iface tap2de3f942-69 ovn-installed in OVS
Jan 22 00:21:32 compute-0 nova_compute[182935]: 2026-01-22 00:21:32.078 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:32 compute-0 nova_compute[182935]: 2026-01-22 00:21:32.091 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:32 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000098.scope: Deactivated successfully.
Jan 22 00:21:32 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000098.scope: Consumed 14.037s CPU time.
Jan 22 00:21:32 compute-0 systemd-machined[154182]: Machine qemu-75-instance-00000098 terminated.
Jan 22 00:21:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:32.132 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:64:5d 10.100.0.4'], port_security=['fa:16:3e:13:64:5d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '74526b6d-b1ca-423f-9094-b845f8b97526'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '8', 'neutron:security_group_ids': '61ee06fa-a63f-42b6-8f38-7bda03f7a2d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.218'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5715df4-0e68-4951-9b87-9601d69c7054, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=2de3f942-6922-4800-9d3a-d06aa1263f44) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:21:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:32.133 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 2de3f942-6922-4800-9d3a-d06aa1263f44 in datapath 7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d unbound from our chassis
Jan 22 00:21:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:32.134 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:21:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:32.136 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6ce1cb77-1ebc-4a31-92c3-ec4ee3d03b5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:32.137 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d namespace which is not needed anymore
Jan 22 00:21:32 compute-0 podman[236595]: 2026-01-22 00:21:32.173655877 +0000 UTC m=+0.068378308 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 22 00:21:32 compute-0 podman[236591]: 2026-01-22 00:21:32.192688221 +0000 UTC m=+0.087842352 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64)
Jan 22 00:21:32 compute-0 nova_compute[182935]: 2026-01-22 00:21:32.317 182939 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Jan 22 00:21:32 compute-0 nova_compute[182935]: 2026-01-22 00:21:32.318 182939 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Jan 22 00:21:32 compute-0 nova_compute[182935]: 2026-01-22 00:21:32.318 182939 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Jan 22 00:21:32 compute-0 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[236302]: [NOTICE]   (236306) : haproxy version is 2.8.14-c23fe91
Jan 22 00:21:32 compute-0 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[236302]: [NOTICE]   (236306) : path to executable is /usr/sbin/haproxy
Jan 22 00:21:32 compute-0 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[236302]: [WARNING]  (236306) : Exiting Master process...
Jan 22 00:21:32 compute-0 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[236302]: [ALERT]    (236306) : Current worker (236308) exited with code 143 (Terminated)
Jan 22 00:21:32 compute-0 neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d[236302]: [WARNING]  (236306) : All workers exited. Exiting... (0)
Jan 22 00:21:32 compute-0 systemd[1]: libpod-8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9.scope: Deactivated successfully.
Jan 22 00:21:32 compute-0 nova_compute[182935]: 2026-01-22 00:21:32.452 182939 DEBUG nova.virt.libvirt.guest [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2' (instance-00000098) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Jan 22 00:21:32 compute-0 nova_compute[182935]: 2026-01-22 00:21:32.453 182939 INFO nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Migration operation has completed
Jan 22 00:21:32 compute-0 nova_compute[182935]: 2026-01-22 00:21:32.453 182939 INFO nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] _post_live_migration() is started..
Jan 22 00:21:32 compute-0 podman[236657]: 2026-01-22 00:21:32.457913982 +0000 UTC m=+0.242621905 container died 8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.006 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.243 182939 DEBUG nova.network.neutron [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Activated binding for port 2de3f942-6922-4800-9d3a-d06aa1263f44 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.244 182939 DEBUG nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.245 182939 DEBUG nova.virt.libvirt.vif [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:20:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-7270034',display_name='tempest-TestNetworkAdvancedServerOps-server-7270034',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-7270034',id=152,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKCbkoaVuA62O+ECqX+7Ohn7GbIbEVQCxvPvCXrKqpOjrukjt8m0tS2UNeW9SghkNu53IZT4aL6S7PqVShWjvxQooRpiJSuxHQi4r5UNidyhtoE0twes7RsZVTczYedVHA==',key_name='tempest-TestNetworkAdvancedServerOps-2075791651',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:20:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1trsnn3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:21:22Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.246 182939 DEBUG nova.network.os_vif_util [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converting VIF {"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.247 182939 DEBUG nova.network.os_vif_util [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.247 182939 DEBUG os_vif [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.249 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.250 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2de3f942-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.252 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.255 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.257 182939 INFO os_vif [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:13:64:5d,bridge_name='br-int',has_traffic_filtering=True,id=2de3f942-6922-4800-9d3a-d06aa1263f44,network=Network(7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2de3f942-69')
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.258 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.258 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.259 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.259 182939 DEBUG nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.260 182939 INFO nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Deleting instance files /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2_del
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.261 182939 INFO nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Deletion of /var/lib/nova/instances/dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2_del complete
Jan 22 00:21:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-ddd75296d95dee910c9cf63a5185698460f21896a3820e83071bc5cc3e9cb3dc-merged.mount: Deactivated successfully.
Jan 22 00:21:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9-userdata-shm.mount: Deactivated successfully.
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.933 182939 DEBUG nova.compute.manager [req-35ef6a6b-ae6b-4442-923d-116deab5ea0a req-876a552b-5c0e-4da1-a388-da9f88c48a64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-unplugged-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.933 182939 DEBUG oslo_concurrency.lockutils [req-35ef6a6b-ae6b-4442-923d-116deab5ea0a req-876a552b-5c0e-4da1-a388-da9f88c48a64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.934 182939 DEBUG oslo_concurrency.lockutils [req-35ef6a6b-ae6b-4442-923d-116deab5ea0a req-876a552b-5c0e-4da1-a388-da9f88c48a64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.934 182939 DEBUG oslo_concurrency.lockutils [req-35ef6a6b-ae6b-4442-923d-116deab5ea0a req-876a552b-5c0e-4da1-a388-da9f88c48a64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.934 182939 DEBUG nova.compute.manager [req-35ef6a6b-ae6b-4442-923d-116deab5ea0a req-876a552b-5c0e-4da1-a388-da9f88c48a64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] No waiting events found dispatching network-vif-unplugged-2de3f942-6922-4800-9d3a-d06aa1263f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:33 compute-0 nova_compute[182935]: 2026-01-22 00:21:33.934 182939 DEBUG nova.compute.manager [req-35ef6a6b-ae6b-4442-923d-116deab5ea0a req-876a552b-5c0e-4da1-a388-da9f88c48a64 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-unplugged-2de3f942-6922-4800-9d3a-d06aa1263f44 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:21:34 compute-0 podman[236657]: 2026-01-22 00:21:34.188115127 +0000 UTC m=+1.972823050 container cleanup 8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:21:34 compute-0 systemd[1]: libpod-conmon-8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9.scope: Deactivated successfully.
Jan 22 00:21:34 compute-0 podman[236705]: 2026-01-22 00:21:34.280964356 +0000 UTC m=+0.062641301 container remove 8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:21:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:34.286 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6c7693d8-3d7f-4b45-84c6-4bd28c145686]: (4, ('Thu Jan 22 12:21:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d (8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9)\n8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9\nThu Jan 22 12:21:34 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d (8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9)\n8504fafac5e2bda6633ba18e23a4a8c4cd1377d7a2fe5590b96984bda9c2d0e9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:34.288 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6e4f257b-6a26-47ef-b113-71397eb493df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:34.289 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a7a8118-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:34 compute-0 kernel: tap7a7a8118-c0: left promiscuous mode
Jan 22 00:21:34 compute-0 nova_compute[182935]: 2026-01-22 00:21:34.355 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:34.361 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[be10ba80-39cd-4936-a4e0-e7a23409d6d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:34 compute-0 nova_compute[182935]: 2026-01-22 00:21:34.371 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:34.383 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[32f30f59-f82a-432d-9960-aabdd1b52fb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:34.384 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd4aec3-5710-4b66-b31e-18a8ac29200f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:34.399 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5050abed-88fb-4ec3-adde-152f178c43f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 576212, 'reachable_time': 42291, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236722, 'error': None, 'target': 'ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d7a7a8118\x2dcbb9\x2d421a\x2db461\x2d2b8b9d4dfc6d.mount: Deactivated successfully.
Jan 22 00:21:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:34.405 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:21:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:34.405 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[de7ec239-2bb5-47e2-9cd9-b994a6af8685]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:34 compute-0 nova_compute[182935]: 2026-01-22 00:21:34.629 182939 DEBUG nova.network.neutron [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updated VIF entry in instance network info cache for port 2de3f942-6922-4800-9d3a-d06aa1263f44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:21:34 compute-0 nova_compute[182935]: 2026-01-22 00:21:34.630 182939 DEBUG nova.network.neutron [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Updating instance_info_cache with network_info: [{"id": "2de3f942-6922-4800-9d3a-d06aa1263f44", "address": "fa:16:3e:13:64:5d", "network": {"id": "7a7a8118-cbb9-421a-b461-2b8b9d4dfc6d", "bridge": "br-int", "label": "tempest-network-smoke--1881000551", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2de3f942-69", "ovs_interfaceid": "2de3f942-6922-4800-9d3a-d06aa1263f44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:34 compute-0 nova_compute[182935]: 2026-01-22 00:21:34.653 182939 DEBUG oslo_concurrency.lockutils [req-d1b8d17d-8fcc-4cb7-95dc-03431dbcbff1 req-0de628f7-ac58-4654-9133-1a436ced93ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:21:35 compute-0 nova_compute[182935]: 2026-01-22 00:21:35.655 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.025 182939 DEBUG nova.compute.manager [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.026 182939 DEBUG oslo_concurrency.lockutils [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.026 182939 DEBUG oslo_concurrency.lockutils [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.026 182939 DEBUG oslo_concurrency.lockutils [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.027 182939 DEBUG nova.compute.manager [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] No waiting events found dispatching network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.027 182939 WARNING nova.compute.manager [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received unexpected event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 for instance with vm_state active and task_state migrating.
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.028 182939 DEBUG nova.compute.manager [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.028 182939 DEBUG oslo_concurrency.lockutils [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.029 182939 DEBUG oslo_concurrency.lockutils [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.029 182939 DEBUG oslo_concurrency.lockutils [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.030 182939 DEBUG nova.compute.manager [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] No waiting events found dispatching network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.030 182939 WARNING nova.compute.manager [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received unexpected event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 for instance with vm_state active and task_state migrating.
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.030 182939 DEBUG nova.compute.manager [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.031 182939 DEBUG oslo_concurrency.lockutils [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.031 182939 DEBUG oslo_concurrency.lockutils [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.032 182939 DEBUG oslo_concurrency.lockutils [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.032 182939 DEBUG nova.compute.manager [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] No waiting events found dispatching network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:36 compute-0 nova_compute[182935]: 2026-01-22 00:21:36.032 182939 WARNING nova.compute.manager [req-4f3ad6c3-c6e3-4540-a3dd-6885ed63eaef req-e2cd06e4-a28c-4e69-a8ed-ff698a2e0921 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Received unexpected event network-vif-plugged-2de3f942-6922-4800-9d3a-d06aa1263f44 for instance with vm_state active and task_state migrating.
Jan 22 00:21:38 compute-0 nova_compute[182935]: 2026-01-22 00:21:38.008 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:38 compute-0 nova_compute[182935]: 2026-01-22 00:21:38.252 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:39 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 22 00:21:39 compute-0 systemd[236566]: Activating special unit Exit the Session...
Jan 22 00:21:39 compute-0 systemd[236566]: Stopped target Main User Target.
Jan 22 00:21:39 compute-0 systemd[236566]: Stopped target Basic System.
Jan 22 00:21:39 compute-0 systemd[236566]: Stopped target Paths.
Jan 22 00:21:39 compute-0 systemd[236566]: Stopped target Sockets.
Jan 22 00:21:39 compute-0 systemd[236566]: Stopped target Timers.
Jan 22 00:21:39 compute-0 systemd[236566]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 22 00:21:39 compute-0 systemd[236566]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 22 00:21:39 compute-0 systemd[236566]: Closed D-Bus User Message Bus Socket.
Jan 22 00:21:39 compute-0 systemd[236566]: Stopped Create User's Volatile Files and Directories.
Jan 22 00:21:39 compute-0 systemd[236566]: Removed slice User Application Slice.
Jan 22 00:21:39 compute-0 systemd[236566]: Reached target Shutdown.
Jan 22 00:21:39 compute-0 systemd[236566]: Finished Exit the Session.
Jan 22 00:21:39 compute-0 systemd[236566]: Reached target Exit the Session.
Jan 22 00:21:39 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 22 00:21:39 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 22 00:21:39 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 22 00:21:39 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 22 00:21:39 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 22 00:21:39 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 22 00:21:39 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.004 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.004 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.005 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.025 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.025 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.025 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.026 182939 DEBUG nova.compute.resource_tracker [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.187 182939 WARNING nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.188 182939 DEBUG nova.compute.resource_tracker [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5699MB free_disk=73.12727355957031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": 
"0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.188 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.188 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.225 182939 DEBUG nova.compute.resource_tracker [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Migration for instance dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.247 182939 DEBUG nova.compute.resource_tracker [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.277 182939 DEBUG nova.compute.resource_tracker [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Migration 4b5e5f76-94be-4eef-9f9c-ff209c6c9f6c is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.277 182939 DEBUG nova.compute.resource_tracker [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.278 182939 DEBUG nova.compute.resource_tracker [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.324 182939 DEBUG nova.compute.provider_tree [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.341 182939 DEBUG nova.scheduler.client.report [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.361 182939 DEBUG nova.compute.resource_tracker [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.362 182939 DEBUG oslo_concurrency.lockutils [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.376 182939 INFO nova.compute.manager [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.499 182939 INFO nova.scheduler.client.report [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] Deleted allocation for migration 4b5e5f76-94be-4eef-9f9c-ff209c6c9f6c
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.500 182939 DEBUG nova.virt.libvirt.driver [None req-c561fdca-cc2a-4eda-a1c1-2fe887d6a841 74e6e619a2a649d8a98f51f794c814ec d7e5ce5531e5499fa8b5c71d40934672 - - default default] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.899 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "30efffcc-860c-4c44-a2d2-66d14866e670" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.899 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:40 compute-0 nova_compute[182935]: 2026-01-22 00:21:40.918 182939 DEBUG nova.compute.manager [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.008 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.008 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.016 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.016 182939 INFO nova.compute.claims [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.164 182939 DEBUG nova.compute.provider_tree [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.182 182939 DEBUG nova.scheduler.client.report [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.204 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.205 182939 DEBUG nova.compute.manager [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.269 182939 DEBUG nova.compute.manager [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.270 182939 DEBUG nova.network.neutron [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.288 182939 INFO nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.309 182939 DEBUG nova.compute.manager [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.442 182939 DEBUG nova.compute.manager [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.443 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.444 182939 INFO nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Creating image(s)
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.444 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "/var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.445 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.445 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.458 182939 DEBUG nova.policy [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.460 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.520 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.521 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.522 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.533 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.589 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.590 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.641 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.642 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.642 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.701 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.702 182939 DEBUG nova.virt.disk.api [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Checking if we can resize image /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.703 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.759 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.760 182939 DEBUG nova.virt.disk.api [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Cannot resize image /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.761 182939 DEBUG nova.objects.instance [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'migration_context' on Instance uuid 30efffcc-860c-4c44-a2d2-66d14866e670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.777 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.777 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Ensure instance console log exists: /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.778 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.778 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:41 compute-0 nova_compute[182935]: 2026-01-22 00:21:41.778 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:43 compute-0 nova_compute[182935]: 2026-01-22 00:21:43.020 182939 DEBUG nova.network.neutron [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Successfully created port: 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:21:43 compute-0 nova_compute[182935]: 2026-01-22 00:21:43.061 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:43 compute-0 nova_compute[182935]: 2026-01-22 00:21:43.255 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:45 compute-0 nova_compute[182935]: 2026-01-22 00:21:45.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:45 compute-0 nova_compute[182935]: 2026-01-22 00:21:45.945 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:45 compute-0 nova_compute[182935]: 2026-01-22 00:21:45.945 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:45 compute-0 nova_compute[182935]: 2026-01-22 00:21:45.946 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:45 compute-0 nova_compute[182935]: 2026-01-22 00:21:45.946 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.116 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.118 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5702MB free_disk=73.1270637512207GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.118 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.118 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.225 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 30efffcc-860c-4c44-a2d2-66d14866e670 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.225 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.226 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.228 182939 DEBUG nova.network.neutron [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Successfully updated port: 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.544 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.545 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquired lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.545 182939 DEBUG nova.network.neutron [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.593 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.645 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.674 182939 DEBUG nova.compute.manager [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Received event network-changed-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.675 182939 DEBUG nova.compute.manager [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Refreshing instance network info cache due to event network-changed-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.676 182939 DEBUG oslo_concurrency.lockutils [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.678 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:21:46 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.678 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:46.999 182939 DEBUG nova.network.neutron [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:21:47 compute-0 sshd-session[236740]: Invalid user redis from 188.166.69.60 port 45310
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:47.315 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041292.3140912, dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:47.316 182939 INFO nova.compute.manager [-] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] VM Stopped (Lifecycle Event)
Jan 22 00:21:47 compute-0 podman[236743]: 2026-01-22 00:21:47.321908019 +0000 UTC m=+0.053040073 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:47.350 182939 DEBUG nova.compute.manager [None req-cb11c144-476e-4e0f-9fb4-5ffb287ca204 - - - - - -] [instance: dc67a96b-f05e-4eb0-9e18-58d3f1ac99e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:47 compute-0 podman[236742]: 2026-01-22 00:21:47.364881242 +0000 UTC m=+0.095928854 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:21:47 compute-0 sshd-session[236740]: Connection closed by invalid user redis 188.166.69.60 port 45310 [preauth]
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:47.680 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:47.680 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:47.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:47.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:47.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:47.819 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:47.819 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:21:47 compute-0 nova_compute[182935]: 2026-01-22 00:21:47.986 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.080 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.090 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.257 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.291 182939 DEBUG nova.network.neutron [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Updating instance_info_cache with network_info: [{"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.334 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Releasing lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.335 182939 DEBUG nova.compute.manager [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Instance network_info: |[{"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.336 182939 DEBUG oslo_concurrency.lockutils [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.336 182939 DEBUG nova.network.neutron [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Refreshing network info cache for port 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.340 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Start _get_guest_xml network_info=[{"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.345 182939 WARNING nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.351 182939 DEBUG nova.virt.libvirt.host [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.352 182939 DEBUG nova.virt.libvirt.host [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.360 182939 DEBUG nova.virt.libvirt.host [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.361 182939 DEBUG nova.virt.libvirt.host [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.362 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.362 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.363 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.363 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.363 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.363 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.364 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.364 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.364 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.364 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.365 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.365 182939 DEBUG nova.virt.hardware [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.368 182939 DEBUG nova.virt.libvirt.vif [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:21:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-469689177',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-469689177',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=154,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGZLqk154UMD/JDj7lJ3CfrfW6KwgJqSFgOc2vTC05c8jL4w4rDA6z5+qvxQD6Ac5j9KM/kE7VPjw8dhlQ124E2SmkNLT1csulZtkH6vWp6CkuwQZxALqaw7n6+otvEy3A==',key_name='tempest-TestSecurityGroupsBasicOps-1024996681',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-zo139yfy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:21:41Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=30efffcc-860c-4c44-a2d2-66d14866e670,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.369 182939 DEBUG nova.network.os_vif_util [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.369 182939 DEBUG nova.network.os_vif_util [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:a4:80,bridge_name='br-int',has_traffic_filtering=True,id=20c1ab7c-2f3b-4beb-b1b5-1291bf2db622,network=Network(24048504-9ea0-4a65-b29c-34c7f7bbd9c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20c1ab7c-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.370 182939 DEBUG nova.objects.instance [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'pci_devices' on Instance uuid 30efffcc-860c-4c44-a2d2-66d14866e670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.386 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:21:48 compute-0 nova_compute[182935]:   <uuid>30efffcc-860c-4c44-a2d2-66d14866e670</uuid>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   <name>instance-0000009a</name>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-469689177</nova:name>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:21:48</nova:creationTime>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:21:48 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:21:48 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:21:48 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:21:48 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:21:48 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:21:48 compute-0 nova_compute[182935]:         <nova:user uuid="a60ce2b7b7ae47b484de12add551b287">tempest-TestSecurityGroupsBasicOps-1492736128-project-member</nova:user>
Jan 22 00:21:48 compute-0 nova_compute[182935]:         <nova:project uuid="02bcfc5f1f1044a3856e73a5938ff011">tempest-TestSecurityGroupsBasicOps-1492736128</nova:project>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:21:48 compute-0 nova_compute[182935]:         <nova:port uuid="20c1ab7c-2f3b-4beb-b1b5-1291bf2db622">
Jan 22 00:21:48 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <system>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <entry name="serial">30efffcc-860c-4c44-a2d2-66d14866e670</entry>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <entry name="uuid">30efffcc-860c-4c44-a2d2-66d14866e670</entry>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     </system>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   <os>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   </os>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   <features>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   </features>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk.config"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:9a:a4:80"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <target dev="tap20c1ab7c-2f"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/console.log" append="off"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <video>
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     </video>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:21:48 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:21:48 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:21:48 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:21:48 compute-0 nova_compute[182935]: </domain>
Jan 22 00:21:48 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.388 182939 DEBUG nova.compute.manager [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Preparing to wait for external event network-vif-plugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.388 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.389 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.389 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.389 182939 DEBUG nova.virt.libvirt.vif [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:21:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-469689177',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-469689177',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=154,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGZLqk154UMD/JDj7lJ3CfrfW6KwgJqSFgOc2vTC05c8jL4w4rDA6z5+qvxQD6Ac5j9KM/kE7VPjw8dhlQ124E2SmkNLT1csulZtkH6vWp6CkuwQZxALqaw7n6+otvEy3A==',key_name='tempest-TestSecurityGroupsBasicOps-1024996681',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-zo139yfy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:21:41Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=30efffcc-860c-4c44-a2d2-66d14866e670,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.390 182939 DEBUG nova.network.os_vif_util [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.390 182939 DEBUG nova.network.os_vif_util [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:a4:80,bridge_name='br-int',has_traffic_filtering=True,id=20c1ab7c-2f3b-4beb-b1b5-1291bf2db622,network=Network(24048504-9ea0-4a65-b29c-34c7f7bbd9c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20c1ab7c-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.391 182939 DEBUG os_vif [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:a4:80,bridge_name='br-int',has_traffic_filtering=True,id=20c1ab7c-2f3b-4beb-b1b5-1291bf2db622,network=Network(24048504-9ea0-4a65-b29c-34c7f7bbd9c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20c1ab7c-2f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.391 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.392 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.392 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.396 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.396 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20c1ab7c-2f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.397 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap20c1ab7c-2f, col_values=(('external_ids', {'iface-id': '20c1ab7c-2f3b-4beb-b1b5-1291bf2db622', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:a4:80', 'vm-uuid': '30efffcc-860c-4c44-a2d2-66d14866e670'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.398 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:48 compute-0 NetworkManager[55139]: <info>  [1769041308.3996] manager: (tap20c1ab7c-2f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.402 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.407 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.408 182939 INFO os_vif [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:a4:80,bridge_name='br-int',has_traffic_filtering=True,id=20c1ab7c-2f3b-4beb-b1b5-1291bf2db622,network=Network(24048504-9ea0-4a65-b29c-34c7f7bbd9c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20c1ab7c-2f')
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.475 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.476 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.476 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No VIF found with MAC fa:16:3e:9a:a4:80, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:21:48 compute-0 nova_compute[182935]: 2026-01-22 00:21:48.477 182939 INFO nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Using config drive
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.034 182939 INFO nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Creating config drive at /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk.config
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.044 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0cj7fzes execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.178 182939 DEBUG oslo_concurrency.processutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0cj7fzes" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:21:49 compute-0 kernel: tap20c1ab7c-2f: entered promiscuous mode
Jan 22 00:21:49 compute-0 ovn_controller[95047]: 2026-01-22T00:21:49Z|00575|binding|INFO|Claiming lport 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 for this chassis.
Jan 22 00:21:49 compute-0 NetworkManager[55139]: <info>  [1769041309.2530] manager: (tap20c1ab7c-2f): new Tun device (/org/freedesktop/NetworkManager/Devices/277)
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.251 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:49 compute-0 ovn_controller[95047]: 2026-01-22T00:21:49Z|00576|binding|INFO|20c1ab7c-2f3b-4beb-b1b5-1291bf2db622: Claiming fa:16:3e:9a:a4:80 10.100.0.5
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.255 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.262 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.271 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:a4:80 10.100.0.5'], port_security=['fa:16:3e:9a:a4:80 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '30efffcc-860c-4c44-a2d2-66d14866e670', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24048504-9ea0-4a65-b29c-34c7f7bbd9c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b138b370-cfb0-4e39-a579-a30ab6369230 d4ef217a-4d54-46eb-8448-7681833ff463', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39658596-4398-41ca-8cad-4415b38fd4a6, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=20c1ab7c-2f3b-4beb-b1b5-1291bf2db622) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.273 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 in datapath 24048504-9ea0-4a65-b29c-34c7f7bbd9c4 bound to our chassis
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.275 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24048504-9ea0-4a65-b29c-34c7f7bbd9c4
Jan 22 00:21:49 compute-0 systemd-udevd[236809]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.286 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4821187c-d760-4245-811e-5bef442db162]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.287 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24048504-91 in ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.288 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24048504-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.288 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b423927d-10fe-44f5-8b82-cb1fd2c08f74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.289 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bc0b9a-8765-4d89-adc0-c8dcb8db523c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 systemd-machined[154182]: New machine qemu-76-instance-0000009a.
Jan 22 00:21:49 compute-0 NetworkManager[55139]: <info>  [1769041309.2991] device (tap20c1ab7c-2f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:21:49 compute-0 NetworkManager[55139]: <info>  [1769041309.2999] device (tap20c1ab7c-2f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.303 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[11b21881-fec2-416a-a9e3-f91d4dfaeabc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.314 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.316 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a30d995a-97e3-4b00-a544-db6e3379f5bf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_controller[95047]: 2026-01-22T00:21:49Z|00577|binding|INFO|Setting lport 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 ovn-installed in OVS
Jan 22 00:21:49 compute-0 ovn_controller[95047]: 2026-01-22T00:21:49Z|00578|binding|INFO|Setting lport 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 up in Southbound
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.318 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:49 compute-0 systemd[1]: Started Virtual Machine qemu-76-instance-0000009a.
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.348 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f450416c-12e1-466f-9f53-c7c5912c65c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 systemd-udevd[236814]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:21:49 compute-0 NetworkManager[55139]: <info>  [1769041309.3554] manager: (tap24048504-90): new Veth device (/org/freedesktop/NetworkManager/Devices/278)
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.355 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[212dd78f-2d08-4776-9f9b-f7d3f9bdd3e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.390 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c7328f29-7ef0-47b9-829d-8c89ba5136d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.393 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9c9f42ab-5f36-4212-95b8-1599103c9e16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 NetworkManager[55139]: <info>  [1769041309.4156] device (tap24048504-90): carrier: link connected
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.421 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[71494d59-420e-4a25-82e6-88c9bce74524]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.438 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[952d3a86-40ba-4096-9811-02de5df58b5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24048504-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:16:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581215, 'reachable_time': 36280, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236843, 'error': None, 'target': 'ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.454 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[98ba5aea-6459-4832-9363-cc77b0e5365b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:1688'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 581215, 'tstamp': 581215}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236844, 'error': None, 'target': 'ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.475 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[52d33b95-286a-4ab1-9d66-86541c76cb21]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24048504-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:16:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 184], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581215, 'reachable_time': 36280, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236845, 'error': None, 'target': 'ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.505 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[359cad04-3a1e-4bdf-9745-874bc8b2a622]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.566 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b83c7d43-4220-44a1-be49-0df38717e540]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.569 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24048504-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.569 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.570 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24048504-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:49 compute-0 NetworkManager[55139]: <info>  [1769041309.5742] manager: (tap24048504-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/279)
Jan 22 00:21:49 compute-0 kernel: tap24048504-90: entered promiscuous mode
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.575 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.579 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24048504-90, col_values=(('external_ids', {'iface-id': '8e10c195-4e3a-433c-8d05-fceb257b3abf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.581 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:49 compute-0 ovn_controller[95047]: 2026-01-22T00:21:49Z|00579|binding|INFO|Releasing lport 8e10c195-4e3a-433c-8d05-fceb257b3abf from this chassis (sb_readonly=0)
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.584 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24048504-9ea0-4a65-b29c-34c7f7bbd9c4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24048504-9ea0-4a65-b29c-34c7f7bbd9c4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.585 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[960b8b7a-2b50-47ee-8614-cdce842ece25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.586 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-24048504-9ea0-4a65-b29c-34c7f7bbd9c4
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/24048504-9ea0-4a65-b29c-34c7f7bbd9c4.pid.haproxy
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 24048504-9ea0-4a65-b29c-34c7f7bbd9c4
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:21:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:49.588 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4', 'env', 'PROCESS_TAG=haproxy-24048504-9ea0-4a65-b29c-34c7f7bbd9c4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24048504-9ea0-4a65-b29c-34c7f7bbd9c4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.592 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.637 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041309.636197, 30efffcc-860c-4c44-a2d2-66d14866e670 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.637 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] VM Started (Lifecycle Event)
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.654 182939 DEBUG nova.compute.manager [req-2081181e-1a24-4202-993a-69c015029a2c req-5df8951b-eb12-4931-b211-3589522c7b5c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Received event network-vif-plugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.655 182939 DEBUG oslo_concurrency.lockutils [req-2081181e-1a24-4202-993a-69c015029a2c req-5df8951b-eb12-4931-b211-3589522c7b5c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.655 182939 DEBUG oslo_concurrency.lockutils [req-2081181e-1a24-4202-993a-69c015029a2c req-5df8951b-eb12-4931-b211-3589522c7b5c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.655 182939 DEBUG oslo_concurrency.lockutils [req-2081181e-1a24-4202-993a-69c015029a2c req-5df8951b-eb12-4931-b211-3589522c7b5c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.656 182939 DEBUG nova.compute.manager [req-2081181e-1a24-4202-993a-69c015029a2c req-5df8951b-eb12-4931-b211-3589522c7b5c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Processing event network-vif-plugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.656 182939 DEBUG nova.compute.manager [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.660 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.663 182939 INFO nova.virt.libvirt.driver [-] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Instance spawned successfully.
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.663 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.666 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.669 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.690 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.690 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041309.6368632, 30efffcc-860c-4c44-a2d2-66d14866e670 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.691 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] VM Paused (Lifecycle Event)
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.693 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.694 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.694 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.694 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.695 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.695 182939 DEBUG nova.virt.libvirt.driver [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.723 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.729 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041309.6654491, 30efffcc-860c-4c44-a2d2-66d14866e670 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.729 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] VM Resumed (Lifecycle Event)
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.763 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.767 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.804 182939 INFO nova.compute.manager [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Took 8.36 seconds to spawn the instance on the hypervisor.
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.805 182939 DEBUG nova.compute.manager [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:21:49 compute-0 nova_compute[182935]: 2026-01-22 00:21:49.811 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:21:49 compute-0 podman[236883]: 2026-01-22 00:21:49.954135899 +0000 UTC m=+0.052000568 container create 1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:21:49 compute-0 systemd[1]: Started libpod-conmon-1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1.scope.
Jan 22 00:21:50 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:21:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a746f7c6453ce49efc4bd9012f50d9a350ebcc9c327797d92121081df10e05d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:21:50 compute-0 podman[236883]: 2026-01-22 00:21:50.022406704 +0000 UTC m=+0.120271383 container init 1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:21:50 compute-0 podman[236883]: 2026-01-22 00:21:49.930994599 +0000 UTC m=+0.028859288 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:21:50 compute-0 podman[236883]: 2026-01-22 00:21:50.028910929 +0000 UTC m=+0.126775598 container start 1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:21:50 compute-0 nova_compute[182935]: 2026-01-22 00:21:50.029 182939 INFO nova.compute.manager [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Took 9.05 seconds to build instance.
Jan 22 00:21:50 compute-0 podman[236896]: 2026-01-22 00:21:50.040614048 +0000 UTC m=+0.054784995 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:21:50 compute-0 nova_compute[182935]: 2026-01-22 00:21:50.051 182939 DEBUG oslo_concurrency.lockutils [None req-921bc169-e435-4f4f-a679-35e4e28d68ee a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:50 compute-0 neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4[236899]: [NOTICE]   (236917) : New worker (236926) forked
Jan 22 00:21:50 compute-0 neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4[236899]: [NOTICE]   (236917) : Loading success.
Jan 22 00:21:50 compute-0 nova_compute[182935]: 2026-01-22 00:21:50.111 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:50.111 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:21:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:50.112 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:21:50 compute-0 nova_compute[182935]: 2026-01-22 00:21:50.139 182939 DEBUG nova.network.neutron [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Updated VIF entry in instance network info cache for port 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:21:50 compute-0 nova_compute[182935]: 2026-01-22 00:21:50.140 182939 DEBUG nova.network.neutron [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Updating instance_info_cache with network_info: [{"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:50 compute-0 nova_compute[182935]: 2026-01-22 00:21:50.173 182939 DEBUG oslo_concurrency.lockutils [req-a01af865-8a39-4834-9d49-b57db172b781 req-d955154f-6436-4fd9-b251-878bc5cf3265 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:21:51 compute-0 nova_compute[182935]: 2026-01-22 00:21:51.880 182939 DEBUG nova.compute.manager [req-8fdcb043-53f8-43aa-9f1d-e351a6e02829 req-12b72f26-5848-416e-bf67-771b69475eec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Received event network-vif-plugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:51 compute-0 nova_compute[182935]: 2026-01-22 00:21:51.880 182939 DEBUG oslo_concurrency.lockutils [req-8fdcb043-53f8-43aa-9f1d-e351a6e02829 req-12b72f26-5848-416e-bf67-771b69475eec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:21:51 compute-0 nova_compute[182935]: 2026-01-22 00:21:51.880 182939 DEBUG oslo_concurrency.lockutils [req-8fdcb043-53f8-43aa-9f1d-e351a6e02829 req-12b72f26-5848-416e-bf67-771b69475eec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:21:51 compute-0 nova_compute[182935]: 2026-01-22 00:21:51.881 182939 DEBUG oslo_concurrency.lockutils [req-8fdcb043-53f8-43aa-9f1d-e351a6e02829 req-12b72f26-5848-416e-bf67-771b69475eec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:21:51 compute-0 nova_compute[182935]: 2026-01-22 00:21:51.881 182939 DEBUG nova.compute.manager [req-8fdcb043-53f8-43aa-9f1d-e351a6e02829 req-12b72f26-5848-416e-bf67-771b69475eec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] No waiting events found dispatching network-vif-plugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:21:51 compute-0 nova_compute[182935]: 2026-01-22 00:21:51.881 182939 WARNING nova.compute.manager [req-8fdcb043-53f8-43aa-9f1d-e351a6e02829 req-12b72f26-5848-416e-bf67-771b69475eec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Received unexpected event network-vif-plugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 for instance with vm_state active and task_state None.
Jan 22 00:21:52 compute-0 nova_compute[182935]: 2026-01-22 00:21:52.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:53 compute-0 nova_compute[182935]: 2026-01-22 00:21:53.084 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:53 compute-0 nova_compute[182935]: 2026-01-22 00:21:53.399 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:54 compute-0 nova_compute[182935]: 2026-01-22 00:21:54.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:55 compute-0 NetworkManager[55139]: <info>  [1769041315.4330] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Jan 22 00:21:55 compute-0 NetworkManager[55139]: <info>  [1769041315.4339] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Jan 22 00:21:55 compute-0 nova_compute[182935]: 2026-01-22 00:21:55.434 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:55 compute-0 nova_compute[182935]: 2026-01-22 00:21:55.496 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:55 compute-0 ovn_controller[95047]: 2026-01-22T00:21:55Z|00580|binding|INFO|Releasing lport 8e10c195-4e3a-433c-8d05-fceb257b3abf from this chassis (sb_readonly=0)
Jan 22 00:21:55 compute-0 nova_compute[182935]: 2026-01-22 00:21:55.503 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:55 compute-0 nova_compute[182935]: 2026-01-22 00:21:55.888 182939 DEBUG nova.compute.manager [req-51a005b6-c460-48c4-bb0d-ae4ee471fcdd req-0013fde2-d11a-4fcb-b6cb-9e511cc1d3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Received event network-changed-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:21:55 compute-0 nova_compute[182935]: 2026-01-22 00:21:55.888 182939 DEBUG nova.compute.manager [req-51a005b6-c460-48c4-bb0d-ae4ee471fcdd req-0013fde2-d11a-4fcb-b6cb-9e511cc1d3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Refreshing instance network info cache due to event network-changed-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:21:55 compute-0 nova_compute[182935]: 2026-01-22 00:21:55.888 182939 DEBUG oslo_concurrency.lockutils [req-51a005b6-c460-48c4-bb0d-ae4ee471fcdd req-0013fde2-d11a-4fcb-b6cb-9e511cc1d3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:21:55 compute-0 nova_compute[182935]: 2026-01-22 00:21:55.889 182939 DEBUG oslo_concurrency.lockutils [req-51a005b6-c460-48c4-bb0d-ae4ee471fcdd req-0013fde2-d11a-4fcb-b6cb-9e511cc1d3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:21:55 compute-0 nova_compute[182935]: 2026-01-22 00:21:55.889 182939 DEBUG nova.network.neutron [req-51a005b6-c460-48c4-bb0d-ae4ee471fcdd req-0013fde2-d11a-4fcb-b6cb-9e511cc1d3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Refreshing network info cache for port 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:21:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:21:56.114 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:21:56 compute-0 podman[236936]: 2026-01-22 00:21:56.681711368 +0000 UTC m=+0.054538069 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 22 00:21:57 compute-0 nova_compute[182935]: 2026-01-22 00:21:57.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:57 compute-0 nova_compute[182935]: 2026-01-22 00:21:57.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:21:58 compute-0 nova_compute[182935]: 2026-01-22 00:21:58.009 182939 DEBUG nova.network.neutron [req-51a005b6-c460-48c4-bb0d-ae4ee471fcdd req-0013fde2-d11a-4fcb-b6cb-9e511cc1d3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Updated VIF entry in instance network info cache for port 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:21:58 compute-0 nova_compute[182935]: 2026-01-22 00:21:58.009 182939 DEBUG nova.network.neutron [req-51a005b6-c460-48c4-bb0d-ae4ee471fcdd req-0013fde2-d11a-4fcb-b6cb-9e511cc1d3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Updating instance_info_cache with network_info: [{"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:21:58 compute-0 nova_compute[182935]: 2026-01-22 00:21:58.032 182939 DEBUG oslo_concurrency.lockutils [req-51a005b6-c460-48c4-bb0d-ae4ee471fcdd req-0013fde2-d11a-4fcb-b6cb-9e511cc1d3ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:21:58 compute-0 nova_compute[182935]: 2026-01-22 00:21:58.148 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:21:58 compute-0 nova_compute[182935]: 2026-01-22 00:21:58.401 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:02 compute-0 ovn_controller[95047]: 2026-01-22T00:22:02Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9a:a4:80 10.100.0.5
Jan 22 00:22:02 compute-0 ovn_controller[95047]: 2026-01-22T00:22:02Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9a:a4:80 10.100.0.5
Jan 22 00:22:02 compute-0 podman[236971]: 2026-01-22 00:22:02.681875907 +0000 UTC m=+0.053224628 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git)
Jan 22 00:22:02 compute-0 podman[236972]: 2026-01-22 00:22:02.690791498 +0000 UTC m=+0.059639250 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:22:02 compute-0 nova_compute[182935]: 2026-01-22 00:22:02.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:03 compute-0 nova_compute[182935]: 2026-01-22 00:22:03.148 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:03.219 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:03.220 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:03.220 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:03 compute-0 nova_compute[182935]: 2026-01-22 00:22:03.403 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:03 compute-0 nova_compute[182935]: 2026-01-22 00:22:03.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:05 compute-0 ovn_controller[95047]: 2026-01-22T00:22:05Z|00581|binding|INFO|Releasing lport 8e10c195-4e3a-433c-8d05-fceb257b3abf from this chassis (sb_readonly=0)
Jan 22 00:22:05 compute-0 nova_compute[182935]: 2026-01-22 00:22:05.430 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:05 compute-0 nova_compute[182935]: 2026-01-22 00:22:05.498 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:08 compute-0 nova_compute[182935]: 2026-01-22 00:22:08.150 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:08 compute-0 nova_compute[182935]: 2026-01-22 00:22:08.406 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:13 compute-0 nova_compute[182935]: 2026-01-22 00:22:13.152 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:13 compute-0 nova_compute[182935]: 2026-01-22 00:22:13.408 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.488 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.766 182939 DEBUG nova.compute.manager [req-7c73a11e-da40-489c-ab2a-4394342ceb01 req-ef2be8cb-fe74-449a-8214-4bbb57939fc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Received event network-changed-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.766 182939 DEBUG nova.compute.manager [req-7c73a11e-da40-489c-ab2a-4394342ceb01 req-ef2be8cb-fe74-449a-8214-4bbb57939fc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Refreshing instance network info cache due to event network-changed-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.767 182939 DEBUG oslo_concurrency.lockutils [req-7c73a11e-da40-489c-ab2a-4394342ceb01 req-ef2be8cb-fe74-449a-8214-4bbb57939fc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.767 182939 DEBUG oslo_concurrency.lockutils [req-7c73a11e-da40-489c-ab2a-4394342ceb01 req-ef2be8cb-fe74-449a-8214-4bbb57939fc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.767 182939 DEBUG nova.network.neutron [req-7c73a11e-da40-489c-ab2a-4394342ceb01 req-ef2be8cb-fe74-449a-8214-4bbb57939fc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Refreshing network info cache for port 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.863 182939 DEBUG oslo_concurrency.lockutils [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "30efffcc-860c-4c44-a2d2-66d14866e670" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.864 182939 DEBUG oslo_concurrency.lockutils [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.864 182939 DEBUG oslo_concurrency.lockutils [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.864 182939 DEBUG oslo_concurrency.lockutils [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.865 182939 DEBUG oslo_concurrency.lockutils [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.876 182939 INFO nova.compute.manager [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Terminating instance
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.890 182939 DEBUG nova.compute.manager [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:22:16 compute-0 kernel: tap20c1ab7c-2f (unregistering): left promiscuous mode
Jan 22 00:22:16 compute-0 NetworkManager[55139]: <info>  [1769041336.9213] device (tap20c1ab7c-2f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.933 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:16 compute-0 ovn_controller[95047]: 2026-01-22T00:22:16Z|00582|binding|INFO|Releasing lport 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 from this chassis (sb_readonly=0)
Jan 22 00:22:16 compute-0 ovn_controller[95047]: 2026-01-22T00:22:16Z|00583|binding|INFO|Setting lport 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 down in Southbound
Jan 22 00:22:16 compute-0 ovn_controller[95047]: 2026-01-22T00:22:16Z|00584|binding|INFO|Removing iface tap20c1ab7c-2f ovn-installed in OVS
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.935 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:16.950 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:a4:80 10.100.0.5'], port_security=['fa:16:3e:9a:a4:80 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '30efffcc-860c-4c44-a2d2-66d14866e670', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24048504-9ea0-4a65-b29c-34c7f7bbd9c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b138b370-cfb0-4e39-a579-a30ab6369230 d4ef217a-4d54-46eb-8448-7681833ff463', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39658596-4398-41ca-8cad-4415b38fd4a6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=20c1ab7c-2f3b-4beb-b1b5-1291bf2db622) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:22:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:16.951 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 in datapath 24048504-9ea0-4a65-b29c-34c7f7bbd9c4 unbound from our chassis
Jan 22 00:22:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:16.952 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24048504-9ea0-4a65-b29c-34c7f7bbd9c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:22:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:16.953 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac5cf00-6fdc-40b1-8470-a407a0f43241]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:16 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:16.954 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4 namespace which is not needed anymore
Jan 22 00:22:16 compute-0 nova_compute[182935]: 2026-01-22 00:22:16.956 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:16 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d0000009a.scope: Deactivated successfully.
Jan 22 00:22:16 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d0000009a.scope: Consumed 13.625s CPU time.
Jan 22 00:22:16 compute-0 systemd-machined[154182]: Machine qemu-76-instance-0000009a terminated.
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.153 182939 INFO nova.virt.libvirt.driver [-] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Instance destroyed successfully.
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.154 182939 DEBUG nova.objects.instance [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'resources' on Instance uuid 30efffcc-860c-4c44-a2d2-66d14866e670 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.180 182939 DEBUG nova.virt.libvirt.vif [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:21:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-469689177',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-access_point-469689177',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ac',id=154,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGZLqk154UMD/JDj7lJ3CfrfW6KwgJqSFgOc2vTC05c8jL4w4rDA6z5+qvxQD6Ac5j9KM/kE7VPjw8dhlQ124E2SmkNLT1csulZtkH6vWp6CkuwQZxALqaw7n6+otvEy3A==',key_name='tempest-TestSecurityGroupsBasicOps-1024996681',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:21:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-zo139yfy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:21:49Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=30efffcc-860c-4c44-a2d2-66d14866e670,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.181 182939 DEBUG nova.network.os_vif_util [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.181 182939 DEBUG nova.network.os_vif_util [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:a4:80,bridge_name='br-int',has_traffic_filtering=True,id=20c1ab7c-2f3b-4beb-b1b5-1291bf2db622,network=Network(24048504-9ea0-4a65-b29c-34c7f7bbd9c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20c1ab7c-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.182 182939 DEBUG os_vif [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:a4:80,bridge_name='br-int',has_traffic_filtering=True,id=20c1ab7c-2f3b-4beb-b1b5-1291bf2db622,network=Network(24048504-9ea0-4a65-b29c-34c7f7bbd9c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20c1ab7c-2f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.183 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.184 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20c1ab7c-2f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.185 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.188 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.190 182939 INFO os_vif [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:a4:80,bridge_name='br-int',has_traffic_filtering=True,id=20c1ab7c-2f3b-4beb-b1b5-1291bf2db622,network=Network(24048504-9ea0-4a65-b29c-34c7f7bbd9c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20c1ab7c-2f')
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.190 182939 INFO nova.virt.libvirt.driver [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Deleting instance files /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670_del
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.191 182939 INFO nova.virt.libvirt.driver [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Deletion of /var/lib/nova/instances/30efffcc-860c-4c44-a2d2-66d14866e670_del complete
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.221 182939 DEBUG nova.compute.manager [req-f761d230-743b-480d-bce1-f57364e77340 req-72d593d6-cc6b-484f-ba19-49f97ca0c2cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Received event network-vif-unplugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.221 182939 DEBUG oslo_concurrency.lockutils [req-f761d230-743b-480d-bce1-f57364e77340 req-72d593d6-cc6b-484f-ba19-49f97ca0c2cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.221 182939 DEBUG oslo_concurrency.lockutils [req-f761d230-743b-480d-bce1-f57364e77340 req-72d593d6-cc6b-484f-ba19-49f97ca0c2cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.222 182939 DEBUG oslo_concurrency.lockutils [req-f761d230-743b-480d-bce1-f57364e77340 req-72d593d6-cc6b-484f-ba19-49f97ca0c2cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.222 182939 DEBUG nova.compute.manager [req-f761d230-743b-480d-bce1-f57364e77340 req-72d593d6-cc6b-484f-ba19-49f97ca0c2cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] No waiting events found dispatching network-vif-unplugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.222 182939 DEBUG nova.compute.manager [req-f761d230-743b-480d-bce1-f57364e77340 req-72d593d6-cc6b-484f-ba19-49f97ca0c2cd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Received event network-vif-unplugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.286 182939 INFO nova.compute.manager [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.287 182939 DEBUG oslo.service.loopingcall [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.288 182939 DEBUG nova.compute.manager [-] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:22:17 compute-0 nova_compute[182935]: 2026-01-22 00:22:17.288 182939 DEBUG nova.network.neutron [-] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:22:17 compute-0 neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4[236899]: [NOTICE]   (236917) : haproxy version is 2.8.14-c23fe91
Jan 22 00:22:17 compute-0 neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4[236899]: [NOTICE]   (236917) : path to executable is /usr/sbin/haproxy
Jan 22 00:22:17 compute-0 neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4[236899]: [ALERT]    (236917) : Current worker (236926) exited with code 143 (Terminated)
Jan 22 00:22:17 compute-0 neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4[236899]: [WARNING]  (236917) : All workers exited. Exiting... (0)
Jan 22 00:22:17 compute-0 systemd[1]: libpod-1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1.scope: Deactivated successfully.
Jan 22 00:22:17 compute-0 podman[237037]: 2026-01-22 00:22:17.650342239 +0000 UTC m=+0.611953644 container died 1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:22:18 compute-0 nova_compute[182935]: 2026-01-22 00:22:18.154 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1-userdata-shm.mount: Deactivated successfully.
Jan 22 00:22:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-7a746f7c6453ce49efc4bd9012f50d9a350ebcc9c327797d92121081df10e05d-merged.mount: Deactivated successfully.
Jan 22 00:22:18 compute-0 podman[237068]: 2026-01-22 00:22:18.933797662 +0000 UTC m=+1.287067109 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:22:18 compute-0 podman[237037]: 2026-01-22 00:22:18.966781907 +0000 UTC m=+1.928393262 container cleanup 1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:22:18 compute-0 systemd[1]: libpod-conmon-1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1.scope: Deactivated successfully.
Jan 22 00:22:18 compute-0 podman[237067]: 2026-01-22 00:22:18.980639037 +0000 UTC m=+1.345905530 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.039 182939 DEBUG nova.network.neutron [-] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.060 182939 INFO nova.compute.manager [-] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Took 1.77 seconds to deallocate network for instance.
Jan 22 00:22:19 compute-0 podman[237134]: 2026-01-22 00:22:19.068277172 +0000 UTC m=+0.080231480 container remove 1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 22 00:22:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:19.074 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ba786131-f072-4824-9644-436ed91fb930]: (4, ('Thu Jan 22 12:22:17 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4 (1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1)\n1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1\nThu Jan 22 12:22:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4 (1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1)\n1141157e8969aff3ebece9ae8e5d1b9f8b40699939de76529b946d7ef56ab6b1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:19.076 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[38572be7-37b7-4a25-8bd4-02ccd7a51921]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:19.077 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24048504-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.079 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:19 compute-0 kernel: tap24048504-90: left promiscuous mode
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.090 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:19.093 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7fad2e-62eb-494f-9299-91bf150206e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:19.110 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8ecb10e7-5fd0-4d01-8888-80d58b96e52e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:19.112 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b99e18-bf66-47f6-8188-9d8eca36edcd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:19.128 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[defb34c2-1363-4e4b-bcee-d6b387543db3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 581208, 'reachable_time': 28495, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237153, 'error': None, 'target': 'ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d24048504\x2d9ea0\x2d4a65\x2db29c\x2d34c7f7bbd9c4.mount: Deactivated successfully.
Jan 22 00:22:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:19.130 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24048504-9ea0-4a65-b29c-34c7f7bbd9c4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:22:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:19.131 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[b544f9f8-748b-4b87-840a-a874c953ae28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.171 182939 DEBUG oslo_concurrency.lockutils [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.172 182939 DEBUG oslo_concurrency.lockutils [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.178 182939 DEBUG nova.compute.manager [req-0e2ba942-3e25-4462-a1a4-3084bf77b724 req-792018a0-1435-40ba-bec8-3a8d7ac0721d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Received event network-vif-deleted-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.246 182939 DEBUG nova.compute.provider_tree [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.274 182939 DEBUG nova.scheduler.client.report [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.300 182939 DEBUG oslo_concurrency.lockutils [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.329 182939 INFO nova.scheduler.client.report [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Deleted allocations for instance 30efffcc-860c-4c44-a2d2-66d14866e670
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.334 182939 DEBUG nova.compute.manager [req-c9f3dcb2-992f-4129-b076-5d6d70b2391a req-a48e7d84-053e-40af-bc07-e34843d3e5d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Received event network-vif-plugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.335 182939 DEBUG oslo_concurrency.lockutils [req-c9f3dcb2-992f-4129-b076-5d6d70b2391a req-a48e7d84-053e-40af-bc07-e34843d3e5d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.335 182939 DEBUG oslo_concurrency.lockutils [req-c9f3dcb2-992f-4129-b076-5d6d70b2391a req-a48e7d84-053e-40af-bc07-e34843d3e5d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.335 182939 DEBUG oslo_concurrency.lockutils [req-c9f3dcb2-992f-4129-b076-5d6d70b2391a req-a48e7d84-053e-40af-bc07-e34843d3e5d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.336 182939 DEBUG nova.compute.manager [req-c9f3dcb2-992f-4129-b076-5d6d70b2391a req-a48e7d84-053e-40af-bc07-e34843d3e5d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] No waiting events found dispatching network-vif-plugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.336 182939 WARNING nova.compute.manager [req-c9f3dcb2-992f-4129-b076-5d6d70b2391a req-a48e7d84-053e-40af-bc07-e34843d3e5d2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Received unexpected event network-vif-plugged-20c1ab7c-2f3b-4beb-b1b5-1291bf2db622 for instance with vm_state deleted and task_state None.
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.442 182939 DEBUG oslo_concurrency.lockutils [None req-07c8f479-1336-45c1-af6d-3609794557ae a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "30efffcc-860c-4c44-a2d2-66d14866e670" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.643 182939 DEBUG nova.network.neutron [req-7c73a11e-da40-489c-ab2a-4394342ceb01 req-ef2be8cb-fe74-449a-8214-4bbb57939fc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Updated VIF entry in instance network info cache for port 20c1ab7c-2f3b-4beb-b1b5-1291bf2db622. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.644 182939 DEBUG nova.network.neutron [req-7c73a11e-da40-489c-ab2a-4394342ceb01 req-ef2be8cb-fe74-449a-8214-4bbb57939fc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Updating instance_info_cache with network_info: [{"id": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "address": "fa:16:3e:9a:a4:80", "network": {"id": "24048504-9ea0-4a65-b29c-34c7f7bbd9c4", "bridge": "br-int", "label": "tempest-network-smoke--1978334606", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20c1ab7c-2f", "ovs_interfaceid": "20c1ab7c-2f3b-4beb-b1b5-1291bf2db622", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:22:19 compute-0 nova_compute[182935]: 2026-01-22 00:22:19.664 182939 DEBUG oslo_concurrency.lockutils [req-7c73a11e-da40-489c-ab2a-4394342ceb01 req-ef2be8cb-fe74-449a-8214-4bbb57939fc3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-30efffcc-860c-4c44-a2d2-66d14866e670" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:22:20 compute-0 podman[237154]: 2026-01-22 00:22:20.671681329 +0000 UTC m=+0.046241081 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:22:22 compute-0 nova_compute[182935]: 2026-01-22 00:22:22.188 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:23 compute-0 nova_compute[182935]: 2026-01-22 00:22:23.157 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:22:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:22:26 compute-0 nova_compute[182935]: 2026-01-22 00:22:26.932 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:26 compute-0 nova_compute[182935]: 2026-01-22 00:22:26.986 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:27 compute-0 nova_compute[182935]: 2026-01-22 00:22:27.190 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:27 compute-0 podman[237179]: 2026-01-22 00:22:27.667685957 +0000 UTC m=+0.046670441 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 00:22:28 compute-0 nova_compute[182935]: 2026-01-22 00:22:28.159 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:30 compute-0 sshd-session[237199]: Invalid user redis from 188.166.69.60 port 49504
Jan 22 00:22:30 compute-0 sshd-session[237199]: Connection closed by invalid user redis 188.166.69.60 port 49504 [preauth]
Jan 22 00:22:31 compute-0 nova_compute[182935]: 2026-01-22 00:22:31.786 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:31 compute-0 nova_compute[182935]: 2026-01-22 00:22:31.787 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.107 182939 DEBUG nova.compute.manager [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.153 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041337.1517186, 30efffcc-860c-4c44-a2d2-66d14866e670 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.153 182939 INFO nova.compute.manager [-] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] VM Stopped (Lifecycle Event)
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.194 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.401 182939 DEBUG nova.compute.manager [None req-9ad0a8eb-6813-4880-a6f0-4922a5f495c4 - - - - - -] [instance: 30efffcc-860c-4c44-a2d2-66d14866e670] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.641 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.642 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.649 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.649 182939 INFO nova.compute.claims [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.894 182939 DEBUG nova.compute.provider_tree [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.920 182939 DEBUG nova.scheduler.client.report [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.942 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:32 compute-0 nova_compute[182935]: 2026-01-22 00:22:32.942 182939 DEBUG nova.compute.manager [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.004 182939 DEBUG nova.compute.manager [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.005 182939 DEBUG nova.network.neutron [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.032 182939 INFO nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.049 182939 DEBUG nova.compute.manager [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.160 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.204 182939 DEBUG nova.compute.manager [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.205 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.205 182939 INFO nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Creating image(s)
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.206 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "/var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.206 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "/var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.207 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "/var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.219 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.278 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.279 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.280 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.291 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.346 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.348 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.384 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.386 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.386 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.446 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.448 182939 DEBUG nova.virt.disk.api [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Checking if we can resize image /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.450 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.526 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.527 182939 DEBUG nova.virt.disk.api [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Cannot resize image /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.528 182939 DEBUG nova.objects.instance [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lazy-loading 'migration_context' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.548 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.549 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Ensure instance console log exists: /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.550 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.550 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.550 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:33 compute-0 podman[237217]: 2026-01-22 00:22:33.695978635 +0000 UTC m=+0.068009769 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 00:22:33 compute-0 podman[237216]: 2026-01-22 00:22:33.715058309 +0000 UTC m=+0.091718573 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:22:33 compute-0 nova_compute[182935]: 2026-01-22 00:22:33.820 182939 DEBUG nova.policy [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '509108e93a554166b18e91e34ad8ed64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '893df2b0226a4f55801c6f14b12f84d5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:22:34 compute-0 nova_compute[182935]: 2026-01-22 00:22:34.683 182939 DEBUG nova.network.neutron [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Successfully created port: 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:22:35 compute-0 nova_compute[182935]: 2026-01-22 00:22:35.903 182939 DEBUG nova.network.neutron [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Successfully updated port: 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:22:35 compute-0 nova_compute[182935]: 2026-01-22 00:22:35.926 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:22:35 compute-0 nova_compute[182935]: 2026-01-22 00:22:35.926 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquired lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:22:35 compute-0 nova_compute[182935]: 2026-01-22 00:22:35.926 182939 DEBUG nova.network.neutron [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:22:36 compute-0 nova_compute[182935]: 2026-01-22 00:22:36.012 182939 DEBUG nova.compute.manager [req-bcafd4be-00a3-4178-9546-b6b956a9548b req-98e191da-b8dd-4df5-a851-aeb4236bb4b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-changed-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:36 compute-0 nova_compute[182935]: 2026-01-22 00:22:36.012 182939 DEBUG nova.compute.manager [req-bcafd4be-00a3-4178-9546-b6b956a9548b req-98e191da-b8dd-4df5-a851-aeb4236bb4b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Refreshing instance network info cache due to event network-changed-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:22:36 compute-0 nova_compute[182935]: 2026-01-22 00:22:36.013 182939 DEBUG oslo_concurrency.lockutils [req-bcafd4be-00a3-4178-9546-b6b956a9548b req-98e191da-b8dd-4df5-a851-aeb4236bb4b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:22:37 compute-0 nova_compute[182935]: 2026-01-22 00:22:37.033 182939 DEBUG nova.network.neutron [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:22:37 compute-0 nova_compute[182935]: 2026-01-22 00:22:37.199 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:38 compute-0 nova_compute[182935]: 2026-01-22 00:22:38.162 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:38 compute-0 nova_compute[182935]: 2026-01-22 00:22:38.791 182939 DEBUG nova.network.neutron [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Updating instance_info_cache with network_info: [{"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.012 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Releasing lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.012 182939 DEBUG nova.compute.manager [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Instance network_info: |[{"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.014 182939 DEBUG oslo_concurrency.lockutils [req-bcafd4be-00a3-4178-9546-b6b956a9548b req-98e191da-b8dd-4df5-a851-aeb4236bb4b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.014 182939 DEBUG nova.network.neutron [req-bcafd4be-00a3-4178-9546-b6b956a9548b req-98e191da-b8dd-4df5-a851-aeb4236bb4b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Refreshing network info cache for port 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.019 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Start _get_guest_xml network_info=[{"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.026 182939 WARNING nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.032 182939 DEBUG nova.virt.libvirt.host [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.033 182939 DEBUG nova.virt.libvirt.host [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.037 182939 DEBUG nova.virt.libvirt.host [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.039 182939 DEBUG nova.virt.libvirt.host [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.041 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.042 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.042 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.043 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.044 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.044 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.045 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.045 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.046 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.046 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.048 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.048 182939 DEBUG nova.virt.hardware [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.056 182939 DEBUG nova.virt.libvirt.vif [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:22:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1737213545',display_name='tempest-TestServerAdvancedOps-server-1737213545',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1737213545',id=156,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='893df2b0226a4f55801c6f14b12f84d5',ramdisk_id='',reservation_id='r-4u5rh31r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-1371967218',owner_user_name='tempest-TestServerAdvancedOps-1371967218-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:22:33Z,user_data=None,user_id='509108e93a554166b18e91e34ad8ed64',uuid=c9bdc52e-a3e4-4ebb-999e-39628e000115,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.057 182939 DEBUG nova.network.os_vif_util [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Converting VIF {"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.059 182939 DEBUG nova.network.os_vif_util [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.060 182939 DEBUG nova.objects.instance [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lazy-loading 'pci_devices' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.079 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:22:39 compute-0 nova_compute[182935]:   <uuid>c9bdc52e-a3e4-4ebb-999e-39628e000115</uuid>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   <name>instance-0000009c</name>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <nova:name>tempest-TestServerAdvancedOps-server-1737213545</nova:name>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:22:39</nova:creationTime>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:22:39 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:22:39 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:22:39 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:22:39 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:22:39 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:22:39 compute-0 nova_compute[182935]:         <nova:user uuid="509108e93a554166b18e91e34ad8ed64">tempest-TestServerAdvancedOps-1371967218-project-member</nova:user>
Jan 22 00:22:39 compute-0 nova_compute[182935]:         <nova:project uuid="893df2b0226a4f55801c6f14b12f84d5">tempest-TestServerAdvancedOps-1371967218</nova:project>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:22:39 compute-0 nova_compute[182935]:         <nova:port uuid="46e22c15-8eb0-4e38-87a0-52fe7c81a3f2">
Jan 22 00:22:39 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <system>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <entry name="serial">c9bdc52e-a3e4-4ebb-999e-39628e000115</entry>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <entry name="uuid">c9bdc52e-a3e4-4ebb-999e-39628e000115</entry>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     </system>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   <os>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   </os>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   <features>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   </features>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk.config"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:06:7a:8c"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <target dev="tap46e22c15-8e"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/console.log" append="off"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <video>
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     </video>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:22:39 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:22:39 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:22:39 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:22:39 compute-0 nova_compute[182935]: </domain>
Jan 22 00:22:39 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.081 182939 DEBUG nova.compute.manager [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Preparing to wait for external event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.082 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.082 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.082 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.083 182939 DEBUG nova.virt.libvirt.vif [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:22:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1737213545',display_name='tempest-TestServerAdvancedOps-server-1737213545',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1737213545',id=156,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='893df2b0226a4f55801c6f14b12f84d5',ramdisk_id='',reservation_id='r-4u5rh31r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-1371967218',owner_user_name='tempest-TestServerAdvancedOps-1371967218-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:22:33Z,user_data=None,user_id='509108e93a554166b18e91e34ad8ed64',uuid=c9bdc52e-a3e4-4ebb-999e-39628e000115,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.083 182939 DEBUG nova.network.os_vif_util [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Converting VIF {"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.084 182939 DEBUG nova.network.os_vif_util [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.084 182939 DEBUG os_vif [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.085 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.085 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.086 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.089 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.090 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap46e22c15-8e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.090 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap46e22c15-8e, col_values=(('external_ids', {'iface-id': '46e22c15-8eb0-4e38-87a0-52fe7c81a3f2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:7a:8c', 'vm-uuid': 'c9bdc52e-a3e4-4ebb-999e-39628e000115'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.131 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:39 compute-0 NetworkManager[55139]: <info>  [1769041359.1323] manager: (tap46e22c15-8e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.134 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.142 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.143 182939 INFO os_vif [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e')
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.215 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.216 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.216 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] No VIF found with MAC fa:16:3e:06:7a:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:22:39 compute-0 nova_compute[182935]: 2026-01-22 00:22:39.217 182939 INFO nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Using config drive
Jan 22 00:22:40 compute-0 nova_compute[182935]: 2026-01-22 00:22:40.409 182939 INFO nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Creating config drive at /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk.config
Jan 22 00:22:40 compute-0 nova_compute[182935]: 2026-01-22 00:22:40.415 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwlm7syv7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:40 compute-0 nova_compute[182935]: 2026-01-22 00:22:40.541 182939 DEBUG oslo_concurrency.processutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwlm7syv7" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:40 compute-0 kernel: tap46e22c15-8e: entered promiscuous mode
Jan 22 00:22:40 compute-0 NetworkManager[55139]: <info>  [1769041360.5938] manager: (tap46e22c15-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/283)
Jan 22 00:22:40 compute-0 ovn_controller[95047]: 2026-01-22T00:22:40Z|00585|binding|INFO|Claiming lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for this chassis.
Jan 22 00:22:40 compute-0 ovn_controller[95047]: 2026-01-22T00:22:40Z|00586|binding|INFO|46e22c15-8eb0-4e38-87a0-52fe7c81a3f2: Claiming fa:16:3e:06:7a:8c 10.100.0.4
Jan 22 00:22:40 compute-0 nova_compute[182935]: 2026-01-22 00:22:40.595 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:40.607 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:7a:8c 10.100.0.4'], port_security=['fa:16:3e:06:7a:8c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c9bdc52e-a3e4-4ebb-999e-39628e000115', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ca42520-5b08-4a77-acde-54242e5ad5e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '893df2b0226a4f55801c6f14b12f84d5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5232bd56-e083-4dbd-b565-94d1a97f56aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70527286-70a5-42d6-bfea-433a5bf5d611, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:22:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:40.608 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 in datapath 0ca42520-5b08-4a77-acde-54242e5ad5e0 bound to our chassis
Jan 22 00:22:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:40.609 104408 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0ca42520-5b08-4a77-acde-54242e5ad5e0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:22:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:40.610 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ca4df1d1-4609-43ce-8f40-bd9f5cf81bac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:40 compute-0 systemd-udevd[237276]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:22:40 compute-0 nova_compute[182935]: 2026-01-22 00:22:40.629 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:40 compute-0 ovn_controller[95047]: 2026-01-22T00:22:40Z|00587|binding|INFO|Setting lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 ovn-installed in OVS
Jan 22 00:22:40 compute-0 ovn_controller[95047]: 2026-01-22T00:22:40Z|00588|binding|INFO|Setting lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 up in Southbound
Jan 22 00:22:40 compute-0 NetworkManager[55139]: <info>  [1769041360.6349] device (tap46e22c15-8e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:22:40 compute-0 nova_compute[182935]: 2026-01-22 00:22:40.634 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:40 compute-0 NetworkManager[55139]: <info>  [1769041360.6360] device (tap46e22c15-8e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:22:40 compute-0 systemd-machined[154182]: New machine qemu-77-instance-0000009c.
Jan 22 00:22:40 compute-0 systemd[1]: Started Virtual Machine qemu-77-instance-0000009c.
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.185 182939 DEBUG nova.compute.manager [req-3c5fefe8-0794-4562-b224-6f59caba2776 req-aabd5183-a9c7-447e-a992-5ea34dde754d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.186 182939 DEBUG oslo_concurrency.lockutils [req-3c5fefe8-0794-4562-b224-6f59caba2776 req-aabd5183-a9c7-447e-a992-5ea34dde754d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.186 182939 DEBUG oslo_concurrency.lockutils [req-3c5fefe8-0794-4562-b224-6f59caba2776 req-aabd5183-a9c7-447e-a992-5ea34dde754d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.187 182939 DEBUG oslo_concurrency.lockutils [req-3c5fefe8-0794-4562-b224-6f59caba2776 req-aabd5183-a9c7-447e-a992-5ea34dde754d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.187 182939 DEBUG nova.compute.manager [req-3c5fefe8-0794-4562-b224-6f59caba2776 req-aabd5183-a9c7-447e-a992-5ea34dde754d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Processing event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.568 182939 DEBUG nova.compute.manager [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.569 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041361.5677016, c9bdc52e-a3e4-4ebb-999e-39628e000115 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.569 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] VM Started (Lifecycle Event)
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.571 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.574 182939 INFO nova.virt.libvirt.driver [-] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Instance spawned successfully.
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.574 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.596 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.600 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.600 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.601 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.601 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.602 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.602 182939 DEBUG nova.virt.libvirt.driver [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.605 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.612 182939 DEBUG nova.network.neutron [req-bcafd4be-00a3-4178-9546-b6b956a9548b req-98e191da-b8dd-4df5-a851-aeb4236bb4b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Updated VIF entry in instance network info cache for port 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.613 182939 DEBUG nova.network.neutron [req-bcafd4be-00a3-4178-9546-b6b956a9548b req-98e191da-b8dd-4df5-a851-aeb4236bb4b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Updating instance_info_cache with network_info: [{"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.643 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.644 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041361.5689378, c9bdc52e-a3e4-4ebb-999e-39628e000115 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.644 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] VM Paused (Lifecycle Event)
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.650 182939 DEBUG oslo_concurrency.lockutils [req-bcafd4be-00a3-4178-9546-b6b956a9548b req-98e191da-b8dd-4df5-a851-aeb4236bb4b1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.671 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.673 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041361.5708637, c9bdc52e-a3e4-4ebb-999e-39628e000115 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.673 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] VM Resumed (Lifecycle Event)
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.698 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.701 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.709 182939 INFO nova.compute.manager [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Took 8.50 seconds to spawn the instance on the hypervisor.
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.709 182939 DEBUG nova.compute.manager [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.743 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.811 182939 INFO nova.compute.manager [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Took 9.23 seconds to build instance.
Jan 22 00:22:41 compute-0 nova_compute[182935]: 2026-01-22 00:22:41.843 182939 DEBUG oslo_concurrency.lockutils [None req-eaeee8bd-46bd-4f3c-bf35-526d2c2cd803 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.056s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:43 compute-0 nova_compute[182935]: 2026-01-22 00:22:43.164 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:43 compute-0 nova_compute[182935]: 2026-01-22 00:22:43.289 182939 DEBUG nova.compute.manager [req-7ae17303-c70c-4ff2-aeaa-ae4d382fdde8 req-93c20303-00da-4edd-b1e0-fc4fcb4a174f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:43 compute-0 nova_compute[182935]: 2026-01-22 00:22:43.289 182939 DEBUG oslo_concurrency.lockutils [req-7ae17303-c70c-4ff2-aeaa-ae4d382fdde8 req-93c20303-00da-4edd-b1e0-fc4fcb4a174f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:43 compute-0 nova_compute[182935]: 2026-01-22 00:22:43.290 182939 DEBUG oslo_concurrency.lockutils [req-7ae17303-c70c-4ff2-aeaa-ae4d382fdde8 req-93c20303-00da-4edd-b1e0-fc4fcb4a174f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:43 compute-0 nova_compute[182935]: 2026-01-22 00:22:43.290 182939 DEBUG oslo_concurrency.lockutils [req-7ae17303-c70c-4ff2-aeaa-ae4d382fdde8 req-93c20303-00da-4edd-b1e0-fc4fcb4a174f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:43 compute-0 nova_compute[182935]: 2026-01-22 00:22:43.290 182939 DEBUG nova.compute.manager [req-7ae17303-c70c-4ff2-aeaa-ae4d382fdde8 req-93c20303-00da-4edd-b1e0-fc4fcb4a174f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] No waiting events found dispatching network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:43 compute-0 nova_compute[182935]: 2026-01-22 00:22:43.291 182939 WARNING nova.compute.manager [req-7ae17303-c70c-4ff2-aeaa-ae4d382fdde8 req-93c20303-00da-4edd-b1e0-fc4fcb4a174f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received unexpected event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for instance with vm_state active and task_state None.
Jan 22 00:22:44 compute-0 nova_compute[182935]: 2026-01-22 00:22:44.132 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:45 compute-0 nova_compute[182935]: 2026-01-22 00:22:45.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:45 compute-0 nova_compute[182935]: 2026-01-22 00:22:45.883 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:45 compute-0 nova_compute[182935]: 2026-01-22 00:22:45.884 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:45 compute-0 nova_compute[182935]: 2026-01-22 00:22:45.884 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:45 compute-0 nova_compute[182935]: 2026-01-22 00:22:45.884 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:22:45 compute-0 nova_compute[182935]: 2026-01-22 00:22:45.954 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:45 compute-0 nova_compute[182935]: 2026-01-22 00:22:45.981 182939 DEBUG nova.objects.instance [None req-cdb94073-9ca9-4450-b187-393457cc4b5e 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lazy-loading 'pci_devices' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.005 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041366.005065, c9bdc52e-a3e4-4ebb-999e-39628e000115 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.005 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] VM Paused (Lifecycle Event)
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.022 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.024 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.024 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.049 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.069 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.086 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.234 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.236 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5591MB free_disk=73.12651824951172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.236 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.236 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.323 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance c9bdc52e-a3e4-4ebb-999e-39628e000115 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.323 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.323 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.358 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.378 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.403 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:22:46 compute-0 nova_compute[182935]: 2026-01-22 00:22:46.403 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:47 compute-0 kernel: tap46e22c15-8e (unregistering): left promiscuous mode
Jan 22 00:22:47 compute-0 NetworkManager[55139]: <info>  [1769041367.8998] device (tap46e22c15-8e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:22:47 compute-0 nova_compute[182935]: 2026-01-22 00:22:47.906 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:47 compute-0 ovn_controller[95047]: 2026-01-22T00:22:47Z|00589|binding|INFO|Releasing lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 from this chassis (sb_readonly=0)
Jan 22 00:22:47 compute-0 ovn_controller[95047]: 2026-01-22T00:22:47Z|00590|binding|INFO|Setting lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 down in Southbound
Jan 22 00:22:47 compute-0 ovn_controller[95047]: 2026-01-22T00:22:47Z|00591|binding|INFO|Removing iface tap46e22c15-8e ovn-installed in OVS
Jan 22 00:22:47 compute-0 nova_compute[182935]: 2026-01-22 00:22:47.909 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:47.916 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:7a:8c 10.100.0.4'], port_security=['fa:16:3e:06:7a:8c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c9bdc52e-a3e4-4ebb-999e-39628e000115', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ca42520-5b08-4a77-acde-54242e5ad5e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '893df2b0226a4f55801c6f14b12f84d5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5232bd56-e083-4dbd-b565-94d1a97f56aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70527286-70a5-42d6-bfea-433a5bf5d611, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:22:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:47.917 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 in datapath 0ca42520-5b08-4a77-acde-54242e5ad5e0 unbound from our chassis
Jan 22 00:22:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:47.918 104408 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0ca42520-5b08-4a77-acde-54242e5ad5e0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:22:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:47.920 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5387b9a9-2fa5-4496-8534-ce951ddad18e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:47 compute-0 nova_compute[182935]: 2026-01-22 00:22:47.927 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:47 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Jan 22 00:22:47 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d0000009c.scope: Consumed 5.475s CPU time.
Jan 22 00:22:47 compute-0 systemd-machined[154182]: Machine qemu-77-instance-0000009c terminated.
Jan 22 00:22:48 compute-0 nova_compute[182935]: 2026-01-22 00:22:48.098 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:48 compute-0 nova_compute[182935]: 2026-01-22 00:22:48.102 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:48 compute-0 nova_compute[182935]: 2026-01-22 00:22:48.136 182939 DEBUG nova.compute.manager [None req-cdb94073-9ca9-4450-b187-393457cc4b5e 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:48 compute-0 nova_compute[182935]: 2026-01-22 00:22:48.141 182939 DEBUG nova.compute.manager [req-c1ea72af-8ba7-42ec-a6b9-79fde80e58ce req-8dbd4ea1-88ce-4965-aa92-c0c3e8f2bea0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-unplugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:48 compute-0 nova_compute[182935]: 2026-01-22 00:22:48.141 182939 DEBUG oslo_concurrency.lockutils [req-c1ea72af-8ba7-42ec-a6b9-79fde80e58ce req-8dbd4ea1-88ce-4965-aa92-c0c3e8f2bea0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:48 compute-0 nova_compute[182935]: 2026-01-22 00:22:48.141 182939 DEBUG oslo_concurrency.lockutils [req-c1ea72af-8ba7-42ec-a6b9-79fde80e58ce req-8dbd4ea1-88ce-4965-aa92-c0c3e8f2bea0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:48 compute-0 nova_compute[182935]: 2026-01-22 00:22:48.142 182939 DEBUG oslo_concurrency.lockutils [req-c1ea72af-8ba7-42ec-a6b9-79fde80e58ce req-8dbd4ea1-88ce-4965-aa92-c0c3e8f2bea0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:48 compute-0 nova_compute[182935]: 2026-01-22 00:22:48.142 182939 DEBUG nova.compute.manager [req-c1ea72af-8ba7-42ec-a6b9-79fde80e58ce req-8dbd4ea1-88ce-4965-aa92-c0c3e8f2bea0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] No waiting events found dispatching network-vif-unplugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:48 compute-0 nova_compute[182935]: 2026-01-22 00:22:48.142 182939 WARNING nova.compute.manager [req-c1ea72af-8ba7-42ec-a6b9-79fde80e58ce req-8dbd4ea1-88ce-4965-aa92-c0c3e8f2bea0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received unexpected event network-vif-unplugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for instance with vm_state active and task_state suspending.
Jan 22 00:22:48 compute-0 nova_compute[182935]: 2026-01-22 00:22:48.165 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:49 compute-0 nova_compute[182935]: 2026-01-22 00:22:49.136 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:49 compute-0 nova_compute[182935]: 2026-01-22 00:22:49.404 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:49 compute-0 nova_compute[182935]: 2026-01-22 00:22:49.404 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:22:49 compute-0 podman[237329]: 2026-01-22 00:22:49.691851076 +0000 UTC m=+0.060014489 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:22:49 compute-0 podman[237328]: 2026-01-22 00:22:49.724636206 +0000 UTC m=+0.092891901 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:22:49 compute-0 nova_compute[182935]: 2026-01-22 00:22:49.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:49 compute-0 nova_compute[182935]: 2026-01-22 00:22:49.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:22:49 compute-0 nova_compute[182935]: 2026-01-22 00:22:49.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.041 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.041 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.041 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.042 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.269 182939 DEBUG nova.compute.manager [req-cce13d19-c986-4966-b897-13fdbaafcc1c req-09c54ef5-5ec2-4747-b89e-35220ad56137 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.269 182939 DEBUG oslo_concurrency.lockutils [req-cce13d19-c986-4966-b897-13fdbaafcc1c req-09c54ef5-5ec2-4747-b89e-35220ad56137 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.270 182939 DEBUG oslo_concurrency.lockutils [req-cce13d19-c986-4966-b897-13fdbaafcc1c req-09c54ef5-5ec2-4747-b89e-35220ad56137 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.270 182939 DEBUG oslo_concurrency.lockutils [req-cce13d19-c986-4966-b897-13fdbaafcc1c req-09c54ef5-5ec2-4747-b89e-35220ad56137 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.270 182939 DEBUG nova.compute.manager [req-cce13d19-c986-4966-b897-13fdbaafcc1c req-09c54ef5-5ec2-4747-b89e-35220ad56137 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] No waiting events found dispatching network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.270 182939 WARNING nova.compute.manager [req-cce13d19-c986-4966-b897-13fdbaafcc1c req-09c54ef5-5ec2-4747-b89e-35220ad56137 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received unexpected event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for instance with vm_state suspended and task_state None.
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.863 182939 INFO nova.compute.manager [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Resuming
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.864 182939 DEBUG nova.objects.instance [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lazy-loading 'flavor' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:50 compute-0 nova_compute[182935]: 2026-01-22 00:22:50.918 182939 DEBUG oslo_concurrency.lockutils [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:22:51 compute-0 nova_compute[182935]: 2026-01-22 00:22:51.324 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Updating instance_info_cache with network_info: [{"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:22:51 compute-0 nova_compute[182935]: 2026-01-22 00:22:51.347 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:22:51 compute-0 nova_compute[182935]: 2026-01-22 00:22:51.348 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:22:51 compute-0 nova_compute[182935]: 2026-01-22 00:22:51.348 182939 DEBUG oslo_concurrency.lockutils [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquired lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:22:51 compute-0 nova_compute[182935]: 2026-01-22 00:22:51.348 182939 DEBUG nova.network.neutron [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:22:51 compute-0 nova_compute[182935]: 2026-01-22 00:22:51.350 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:51 compute-0 podman[237378]: 2026-01-22 00:22:51.692878366 +0000 UTC m=+0.072168738 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:22:51 compute-0 nova_compute[182935]: 2026-01-22 00:22:51.812 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.779 182939 DEBUG nova.network.neutron [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Updating instance_info_cache with network_info: [{"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.805 182939 DEBUG oslo_concurrency.lockutils [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Releasing lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.811 182939 DEBUG nova.virt.libvirt.vif [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:22:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1737213545',display_name='tempest-TestServerAdvancedOps-server-1737213545',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1737213545',id=156,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:22:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='893df2b0226a4f55801c6f14b12f84d5',ramdisk_id='',reservation_id='r-4u5rh31r',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-1371967218',owner_user_name='tempest-TestServerAdvancedOps-1371967218-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:22:48Z,user_data=None,user_id='509108e93a554166b18e91e34ad8ed64',uuid=c9bdc52e-a3e4-4ebb-999e-39628e000115,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.811 182939 DEBUG nova.network.os_vif_util [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Converting VIF {"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.812 182939 DEBUG nova.network.os_vif_util [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.812 182939 DEBUG os_vif [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.813 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.813 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.814 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.818 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.818 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap46e22c15-8e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.818 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap46e22c15-8e, col_values=(('external_ids', {'iface-id': '46e22c15-8eb0-4e38-87a0-52fe7c81a3f2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:7a:8c', 'vm-uuid': 'c9bdc52e-a3e4-4ebb-999e-39628e000115'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.819 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.819 182939 INFO os_vif [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e')
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.838 182939 DEBUG nova.objects.instance [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lazy-loading 'numa_topology' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:52 compute-0 kernel: tap46e22c15-8e: entered promiscuous mode
Jan 22 00:22:52 compute-0 NetworkManager[55139]: <info>  [1769041372.9224] manager: (tap46e22c15-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/284)
Jan 22 00:22:52 compute-0 ovn_controller[95047]: 2026-01-22T00:22:52Z|00592|binding|INFO|Claiming lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for this chassis.
Jan 22 00:22:52 compute-0 ovn_controller[95047]: 2026-01-22T00:22:52Z|00593|binding|INFO|46e22c15-8eb0-4e38-87a0-52fe7c81a3f2: Claiming fa:16:3e:06:7a:8c 10.100.0.4
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.969 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:52.978 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:7a:8c 10.100.0.4'], port_security=['fa:16:3e:06:7a:8c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c9bdc52e-a3e4-4ebb-999e-39628e000115', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ca42520-5b08-4a77-acde-54242e5ad5e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '893df2b0226a4f55801c6f14b12f84d5', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5232bd56-e083-4dbd-b565-94d1a97f56aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70527286-70a5-42d6-bfea-433a5bf5d611, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:22:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:52.979 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 in datapath 0ca42520-5b08-4a77-acde-54242e5ad5e0 bound to our chassis
Jan 22 00:22:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:52.980 104408 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0ca42520-5b08-4a77-acde-54242e5ad5e0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:22:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:52.981 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5617e8a7-0713-4c06-8f6b-428e11928015]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:52 compute-0 ovn_controller[95047]: 2026-01-22T00:22:52Z|00594|binding|INFO|Setting lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 up in Southbound
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.984 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:52 compute-0 ovn_controller[95047]: 2026-01-22T00:22:52Z|00595|binding|INFO|Setting lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 ovn-installed in OVS
Jan 22 00:22:52 compute-0 nova_compute[182935]: 2026-01-22 00:22:52.986 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:52 compute-0 systemd-machined[154182]: New machine qemu-78-instance-0000009c.
Jan 22 00:22:52 compute-0 systemd-udevd[237418]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:22:53 compute-0 NetworkManager[55139]: <info>  [1769041373.0094] device (tap46e22c15-8e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:22:53 compute-0 NetworkManager[55139]: <info>  [1769041373.0104] device (tap46e22c15-8e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:22:53 compute-0 systemd[1]: Started Virtual Machine qemu-78-instance-0000009c.
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.168 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.222 182939 DEBUG nova.compute.manager [req-7020c7f5-bca8-4e8b-ba89-27f9384270b1 req-a07c111c-e227-4f63-9c1d-a093ef5ac2f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.223 182939 DEBUG oslo_concurrency.lockutils [req-7020c7f5-bca8-4e8b-ba89-27f9384270b1 req-a07c111c-e227-4f63-9c1d-a093ef5ac2f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.223 182939 DEBUG oslo_concurrency.lockutils [req-7020c7f5-bca8-4e8b-ba89-27f9384270b1 req-a07c111c-e227-4f63-9c1d-a093ef5ac2f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.223 182939 DEBUG oslo_concurrency.lockutils [req-7020c7f5-bca8-4e8b-ba89-27f9384270b1 req-a07c111c-e227-4f63-9c1d-a093ef5ac2f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.224 182939 DEBUG nova.compute.manager [req-7020c7f5-bca8-4e8b-ba89-27f9384270b1 req-a07c111c-e227-4f63-9c1d-a093ef5ac2f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] No waiting events found dispatching network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.224 182939 WARNING nova.compute.manager [req-7020c7f5-bca8-4e8b-ba89-27f9384270b1 req-a07c111c-e227-4f63-9c1d-a093ef5ac2f1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received unexpected event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for instance with vm_state suspended and task_state resuming.
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.448 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for c9bdc52e-a3e4-4ebb-999e-39628e000115 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.449 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041373.4471705, c9bdc52e-a3e4-4ebb-999e-39628e000115 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.449 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] VM Started (Lifecycle Event)
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.465 182939 DEBUG nova.compute.manager [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.466 182939 DEBUG nova.objects.instance [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lazy-loading 'pci_devices' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.485 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.492 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.496 182939 INFO nova.virt.libvirt.driver [-] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Instance running successfully.
Jan 22 00:22:53 compute-0 virtqemud[182477]: argument unsupported: QEMU guest agent is not configured
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.499 182939 DEBUG nova.virt.libvirt.guest [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.499 182939 DEBUG nova.compute.manager [None req-508d869f-e275-4e2c-bb38-4a07b8bd46cf 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.627 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.627 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041373.4531548, c9bdc52e-a3e4-4ebb-999e-39628e000115 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.628 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] VM Resumed (Lifecycle Event)
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.696 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:53 compute-0 nova_compute[182935]: 2026-01-22 00:22:53.699 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:54 compute-0 nova_compute[182935]: 2026-01-22 00:22:54.188 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:54 compute-0 nova_compute[182935]: 2026-01-22 00:22:54.763 182939 DEBUG nova.objects.instance [None req-4980e4de-75d9-4df4-bb3b-6c33232a472e 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lazy-loading 'pci_devices' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:54 compute-0 nova_compute[182935]: 2026-01-22 00:22:54.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:54 compute-0 nova_compute[182935]: 2026-01-22 00:22:54.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:54 compute-0 nova_compute[182935]: 2026-01-22 00:22:54.798 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041374.7978945, c9bdc52e-a3e4-4ebb-999e-39628e000115 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:54 compute-0 nova_compute[182935]: 2026-01-22 00:22:54.798 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] VM Paused (Lifecycle Event)
Jan 22 00:22:54 compute-0 nova_compute[182935]: 2026-01-22 00:22:54.820 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:54 compute-0 nova_compute[182935]: 2026-01-22 00:22:54.824 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:54 compute-0 nova_compute[182935]: 2026-01-22 00:22:54.844 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.380 182939 DEBUG nova.compute.manager [req-26433ffd-8e2b-4c2d-992e-aaec09f83258 req-cbbb4062-06a4-49c1-9288-73278b1cdbf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.381 182939 DEBUG oslo_concurrency.lockutils [req-26433ffd-8e2b-4c2d-992e-aaec09f83258 req-cbbb4062-06a4-49c1-9288-73278b1cdbf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.381 182939 DEBUG oslo_concurrency.lockutils [req-26433ffd-8e2b-4c2d-992e-aaec09f83258 req-cbbb4062-06a4-49c1-9288-73278b1cdbf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.381 182939 DEBUG oslo_concurrency.lockutils [req-26433ffd-8e2b-4c2d-992e-aaec09f83258 req-cbbb4062-06a4-49c1-9288-73278b1cdbf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.382 182939 DEBUG nova.compute.manager [req-26433ffd-8e2b-4c2d-992e-aaec09f83258 req-cbbb4062-06a4-49c1-9288-73278b1cdbf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] No waiting events found dispatching network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.382 182939 WARNING nova.compute.manager [req-26433ffd-8e2b-4c2d-992e-aaec09f83258 req-cbbb4062-06a4-49c1-9288-73278b1cdbf0 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received unexpected event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for instance with vm_state active and task_state suspending.
Jan 22 00:22:55 compute-0 kernel: tap46e22c15-8e (unregistering): left promiscuous mode
Jan 22 00:22:55 compute-0 NetworkManager[55139]: <info>  [1769041375.5410] device (tap46e22c15-8e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.545 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:55 compute-0 ovn_controller[95047]: 2026-01-22T00:22:55Z|00596|binding|INFO|Releasing lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 from this chassis (sb_readonly=0)
Jan 22 00:22:55 compute-0 ovn_controller[95047]: 2026-01-22T00:22:55Z|00597|binding|INFO|Setting lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 down in Southbound
Jan 22 00:22:55 compute-0 ovn_controller[95047]: 2026-01-22T00:22:55Z|00598|binding|INFO|Removing iface tap46e22c15-8e ovn-installed in OVS
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.546 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:55.555 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:7a:8c 10.100.0.4'], port_security=['fa:16:3e:06:7a:8c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c9bdc52e-a3e4-4ebb-999e-39628e000115', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ca42520-5b08-4a77-acde-54242e5ad5e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '893df2b0226a4f55801c6f14b12f84d5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5232bd56-e083-4dbd-b565-94d1a97f56aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70527286-70a5-42d6-bfea-433a5bf5d611, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:22:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:55.557 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 in datapath 0ca42520-5b08-4a77-acde-54242e5ad5e0 unbound from our chassis
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.557 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:55.559 104408 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0ca42520-5b08-4a77-acde-54242e5ad5e0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:22:55 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:55.560 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9f60505d-2c01-474a-929d-1d19e51d987d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:55 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Jan 22 00:22:55 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d0000009c.scope: Consumed 1.781s CPU time.
Jan 22 00:22:55 compute-0 systemd-machined[154182]: Machine qemu-78-instance-0000009c terminated.
Jan 22 00:22:55 compute-0 NetworkManager[55139]: <info>  [1769041375.7448] manager: (tap46e22c15-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/285)
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.747 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.754 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:55 compute-0 nova_compute[182935]: 2026-01-22 00:22:55.789 182939 DEBUG nova.compute.manager [None req-4980e4de-75d9-4df4-bb3b-6c33232a472e 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:56.709 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:22:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:56.710 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:22:56 compute-0 nova_compute[182935]: 2026-01-22 00:22:56.745 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:56 compute-0 nova_compute[182935]: 2026-01-22 00:22:56.873 182939 INFO nova.compute.manager [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Resuming
Jan 22 00:22:56 compute-0 nova_compute[182935]: 2026-01-22 00:22:56.875 182939 DEBUG nova.objects.instance [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lazy-loading 'flavor' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:56 compute-0 nova_compute[182935]: 2026-01-22 00:22:56.941 182939 DEBUG oslo_concurrency.lockutils [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:22:56 compute-0 nova_compute[182935]: 2026-01-22 00:22:56.942 182939 DEBUG oslo_concurrency.lockutils [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquired lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:22:56 compute-0 nova_compute[182935]: 2026-01-22 00:22:56.942 182939 DEBUG nova.network.neutron [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.506 182939 DEBUG nova.compute.manager [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-unplugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.507 182939 DEBUG oslo_concurrency.lockutils [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.507 182939 DEBUG oslo_concurrency.lockutils [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.508 182939 DEBUG oslo_concurrency.lockutils [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.508 182939 DEBUG nova.compute.manager [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] No waiting events found dispatching network-vif-unplugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.509 182939 WARNING nova.compute.manager [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received unexpected event network-vif-unplugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for instance with vm_state suspended and task_state resuming.
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.509 182939 DEBUG nova.compute.manager [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.510 182939 DEBUG oslo_concurrency.lockutils [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.510 182939 DEBUG oslo_concurrency.lockutils [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.511 182939 DEBUG oslo_concurrency.lockutils [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.511 182939 DEBUG nova.compute.manager [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] No waiting events found dispatching network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.512 182939 WARNING nova.compute.manager [req-c29b0cda-1664-4399-b9d5-ce1518c3f681 req-8844fd3e-ecff-473f-80aa-e77148a0926c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received unexpected event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for instance with vm_state suspended and task_state resuming.
Jan 22 00:22:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:57.711 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:57 compute-0 nova_compute[182935]: 2026-01-22 00:22:57.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.170 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.387 182939 DEBUG nova.network.neutron [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Updating instance_info_cache with network_info: [{"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.407 182939 DEBUG oslo_concurrency.lockutils [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Releasing lock "refresh_cache-c9bdc52e-a3e4-4ebb-999e-39628e000115" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.416 182939 DEBUG nova.virt.libvirt.vif [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:22:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1737213545',display_name='tempest-TestServerAdvancedOps-server-1737213545',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1737213545',id=156,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:22:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='893df2b0226a4f55801c6f14b12f84d5',ramdisk_id='',reservation_id='r-4u5rh31r',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-1371967218',owner_user_name='tempest-TestServerAdvancedOps-1371967218-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:22:55Z,user_data=None,user_id='509108e93a554166b18e91e34ad8ed64',uuid=c9bdc52e-a3e4-4ebb-999e-39628e000115,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.416 182939 DEBUG nova.network.os_vif_util [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Converting VIF {"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.417 182939 DEBUG nova.network.os_vif_util [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.418 182939 DEBUG os_vif [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.419 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.419 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.420 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.424 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.424 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap46e22c15-8e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.425 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap46e22c15-8e, col_values=(('external_ids', {'iface-id': '46e22c15-8eb0-4e38-87a0-52fe7c81a3f2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:7a:8c', 'vm-uuid': 'c9bdc52e-a3e4-4ebb-999e-39628e000115'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.426 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.426 182939 INFO os_vif [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e')
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.448 182939 DEBUG nova.objects.instance [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lazy-loading 'numa_topology' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:58 compute-0 podman[237461]: 2026-01-22 00:22:58.530755142 +0000 UTC m=+0.061197788 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 00:22:58 compute-0 kernel: tap46e22c15-8e: entered promiscuous mode
Jan 22 00:22:58 compute-0 NetworkManager[55139]: <info>  [1769041378.5413] manager: (tap46e22c15-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/286)
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.543 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:58 compute-0 ovn_controller[95047]: 2026-01-22T00:22:58Z|00599|binding|INFO|Claiming lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for this chassis.
Jan 22 00:22:58 compute-0 ovn_controller[95047]: 2026-01-22T00:22:58Z|00600|binding|INFO|46e22c15-8eb0-4e38-87a0-52fe7c81a3f2: Claiming fa:16:3e:06:7a:8c 10.100.0.4
Jan 22 00:22:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:58.554 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:7a:8c 10.100.0.4'], port_security=['fa:16:3e:06:7a:8c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c9bdc52e-a3e4-4ebb-999e-39628e000115', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ca42520-5b08-4a77-acde-54242e5ad5e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '893df2b0226a4f55801c6f14b12f84d5', 'neutron:revision_number': '7', 'neutron:security_group_ids': '5232bd56-e083-4dbd-b565-94d1a97f56aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70527286-70a5-42d6-bfea-433a5bf5d611, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:22:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:58.555 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 in datapath 0ca42520-5b08-4a77-acde-54242e5ad5e0 bound to our chassis
Jan 22 00:22:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:58.556 104408 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0ca42520-5b08-4a77-acde-54242e5ad5e0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:22:58 compute-0 ovn_controller[95047]: 2026-01-22T00:22:58Z|00601|binding|INFO|Setting lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 up in Southbound
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.556 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:58 compute-0 ovn_controller[95047]: 2026-01-22T00:22:58Z|00602|binding|INFO|Setting lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 ovn-installed in OVS
Jan 22 00:22:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:22:58.557 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c62bdb-6928-4bb3-83c2-e641fce66e4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.558 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:58 compute-0 systemd-udevd[237491]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:22:58 compute-0 NetworkManager[55139]: <info>  [1769041378.5865] device (tap46e22c15-8e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:22:58 compute-0 NetworkManager[55139]: <info>  [1769041378.5879] device (tap46e22c15-8e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:22:58 compute-0 sshd-session[237458]: Received disconnect from 45.148.10.157 port 11106:11:  [preauth]
Jan 22 00:22:58 compute-0 sshd-session[237458]: Disconnected from authenticating user root 45.148.10.157 port 11106 [preauth]
Jan 22 00:22:58 compute-0 systemd-machined[154182]: New machine qemu-79-instance-0000009c.
Jan 22 00:22:58 compute-0 systemd[1]: Started Virtual Machine qemu-79-instance-0000009c.
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.980 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for c9bdc52e-a3e4-4ebb-999e-39628e000115 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.981 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041378.979573, c9bdc52e-a3e4-4ebb-999e-39628e000115 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:58 compute-0 nova_compute[182935]: 2026-01-22 00:22:58.981 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] VM Started (Lifecycle Event)
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.003 182939 DEBUG nova.compute.manager [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.003 182939 DEBUG nova.objects.instance [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lazy-loading 'pci_devices' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.006 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.010 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.026 182939 INFO nova.virt.libvirt.driver [-] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Instance running successfully.
Jan 22 00:22:59 compute-0 virtqemud[182477]: argument unsupported: QEMU guest agent is not configured
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.029 182939 DEBUG nova.virt.libvirt.guest [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.029 182939 DEBUG nova.compute.manager [None req-c7164eb6-1421-4805-8b9c-2f1009133e92 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.036 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.043 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041378.98725, c9bdc52e-a3e4-4ebb-999e-39628e000115 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.044 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] VM Resumed (Lifecycle Event)
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.080 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.084 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.230 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.601 182939 DEBUG nova.compute.manager [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.602 182939 DEBUG oslo_concurrency.lockutils [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.603 182939 DEBUG oslo_concurrency.lockutils [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.603 182939 DEBUG oslo_concurrency.lockutils [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.604 182939 DEBUG nova.compute.manager [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] No waiting events found dispatching network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.604 182939 WARNING nova.compute.manager [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received unexpected event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for instance with vm_state active and task_state None.
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.605 182939 DEBUG nova.compute.manager [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.605 182939 DEBUG oslo_concurrency.lockutils [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.606 182939 DEBUG oslo_concurrency.lockutils [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.606 182939 DEBUG oslo_concurrency.lockutils [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.606 182939 DEBUG nova.compute.manager [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] No waiting events found dispatching network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:22:59 compute-0 nova_compute[182935]: 2026-01-22 00:22:59.607 182939 WARNING nova.compute.manager [req-304215f1-8617-4805-b984-fac86bff6a96 req-7895ac34-fa05-47e5-9d3b-a69fffed47ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received unexpected event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for instance with vm_state active and task_state None.
Jan 22 00:23:00 compute-0 nova_compute[182935]: 2026-01-22 00:23:00.874 182939 DEBUG oslo_concurrency.lockutils [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:00 compute-0 nova_compute[182935]: 2026-01-22 00:23:00.874 182939 DEBUG oslo_concurrency.lockutils [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:00 compute-0 nova_compute[182935]: 2026-01-22 00:23:00.875 182939 DEBUG oslo_concurrency.lockutils [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:00 compute-0 nova_compute[182935]: 2026-01-22 00:23:00.875 182939 DEBUG oslo_concurrency.lockutils [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:00 compute-0 nova_compute[182935]: 2026-01-22 00:23:00.875 182939 DEBUG oslo_concurrency.lockutils [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:00 compute-0 nova_compute[182935]: 2026-01-22 00:23:00.890 182939 INFO nova.compute.manager [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Terminating instance
Jan 22 00:23:00 compute-0 nova_compute[182935]: 2026-01-22 00:23:00.901 182939 DEBUG nova.compute.manager [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:23:00 compute-0 kernel: tap46e22c15-8e (unregistering): left promiscuous mode
Jan 22 00:23:00 compute-0 NetworkManager[55139]: <info>  [1769041380.9218] device (tap46e22c15-8e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:23:00 compute-0 nova_compute[182935]: 2026-01-22 00:23:00.929 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:00 compute-0 ovn_controller[95047]: 2026-01-22T00:23:00Z|00603|binding|INFO|Releasing lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 from this chassis (sb_readonly=0)
Jan 22 00:23:00 compute-0 ovn_controller[95047]: 2026-01-22T00:23:00Z|00604|binding|INFO|Setting lport 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 down in Southbound
Jan 22 00:23:00 compute-0 ovn_controller[95047]: 2026-01-22T00:23:00Z|00605|binding|INFO|Removing iface tap46e22c15-8e ovn-installed in OVS
Jan 22 00:23:00 compute-0 nova_compute[182935]: 2026-01-22 00:23:00.931 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:00.938 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:7a:8c 10.100.0.4'], port_security=['fa:16:3e:06:7a:8c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c9bdc52e-a3e4-4ebb-999e-39628e000115', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ca42520-5b08-4a77-acde-54242e5ad5e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '893df2b0226a4f55801c6f14b12f84d5', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5232bd56-e083-4dbd-b565-94d1a97f56aa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70527286-70a5-42d6-bfea-433a5bf5d611, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:23:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:00.940 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 in datapath 0ca42520-5b08-4a77-acde-54242e5ad5e0 unbound from our chassis
Jan 22 00:23:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:00.941 104408 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0ca42520-5b08-4a77-acde-54242e5ad5e0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:23:00 compute-0 nova_compute[182935]: 2026-01-22 00:23:00.943 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:00.943 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a957a198-93f3-4424-932a-dabf868c48b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:00 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Jan 22 00:23:00 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d0000009c.scope: Consumed 2.260s CPU time.
Jan 22 00:23:00 compute-0 systemd-machined[154182]: Machine qemu-79-instance-0000009c terminated.
Jan 22 00:23:01 compute-0 NetworkManager[55139]: <info>  [1769041381.1197] manager: (tap46e22c15-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/287)
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.120 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.123 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.163 182939 INFO nova.virt.libvirt.driver [-] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Instance destroyed successfully.
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.163 182939 DEBUG nova.objects.instance [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lazy-loading 'resources' on Instance uuid c9bdc52e-a3e4-4ebb-999e-39628e000115 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.250 182939 DEBUG nova.virt.libvirt.vif [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:22:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1737213545',display_name='tempest-TestServerAdvancedOps-server-1737213545',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1737213545',id=156,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:22:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='893df2b0226a4f55801c6f14b12f84d5',ramdisk_id='',reservation_id='r-4u5rh31r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerAdvancedOps-1371967218',owner_user_name='tempest-TestServerAdvancedOps-1371967218-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:22:59Z,user_data=None,user_id='509108e93a554166b18e91e34ad8ed64',uuid=c9bdc52e-a3e4-4ebb-999e-39628e000115,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.250 182939 DEBUG nova.network.os_vif_util [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Converting VIF {"id": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "address": "fa:16:3e:06:7a:8c", "network": {"id": "0ca42520-5b08-4a77-acde-54242e5ad5e0", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-363847095-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "893df2b0226a4f55801c6f14b12f84d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap46e22c15-8e", "ovs_interfaceid": "46e22c15-8eb0-4e38-87a0-52fe7c81a3f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.251 182939 DEBUG nova.network.os_vif_util [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.251 182939 DEBUG os_vif [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.253 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.253 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap46e22c15-8e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.255 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.256 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.258 182939 INFO os_vif [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:7a:8c,bridge_name='br-int',has_traffic_filtering=True,id=46e22c15-8eb0-4e38-87a0-52fe7c81a3f2,network=Network(0ca42520-5b08-4a77-acde-54242e5ad5e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap46e22c15-8e')
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.259 182939 INFO nova.virt.libvirt.driver [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Deleting instance files /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115_del
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.259 182939 INFO nova.virt.libvirt.driver [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Deletion of /var/lib/nova/instances/c9bdc52e-a3e4-4ebb-999e-39628e000115_del complete
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.347 182939 INFO nova.compute.manager [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.348 182939 DEBUG oslo.service.loopingcall [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.348 182939 DEBUG nova.compute.manager [-] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:23:01 compute-0 nova_compute[182935]: 2026-01-22 00:23:01.348 182939 DEBUG nova.network.neutron [-] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:23:02 compute-0 nova_compute[182935]: 2026-01-22 00:23:02.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:02 compute-0 nova_compute[182935]: 2026-01-22 00:23:02.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:23:03 compute-0 nova_compute[182935]: 2026-01-22 00:23:03.172 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:03.220 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:03.220 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:03.220 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:03 compute-0 nova_compute[182935]: 2026-01-22 00:23:03.790 182939 DEBUG nova.compute.manager [req-ac97b3c0-c611-4d0e-857f-e57e39e300f5 req-1cb75bf3-6ce1-4ad3-b7a9-d1a0ef693ae3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-unplugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:03 compute-0 nova_compute[182935]: 2026-01-22 00:23:03.791 182939 DEBUG oslo_concurrency.lockutils [req-ac97b3c0-c611-4d0e-857f-e57e39e300f5 req-1cb75bf3-6ce1-4ad3-b7a9-d1a0ef693ae3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:03 compute-0 nova_compute[182935]: 2026-01-22 00:23:03.791 182939 DEBUG oslo_concurrency.lockutils [req-ac97b3c0-c611-4d0e-857f-e57e39e300f5 req-1cb75bf3-6ce1-4ad3-b7a9-d1a0ef693ae3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:03 compute-0 nova_compute[182935]: 2026-01-22 00:23:03.791 182939 DEBUG oslo_concurrency.lockutils [req-ac97b3c0-c611-4d0e-857f-e57e39e300f5 req-1cb75bf3-6ce1-4ad3-b7a9-d1a0ef693ae3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:03 compute-0 nova_compute[182935]: 2026-01-22 00:23:03.792 182939 DEBUG nova.compute.manager [req-ac97b3c0-c611-4d0e-857f-e57e39e300f5 req-1cb75bf3-6ce1-4ad3-b7a9-d1a0ef693ae3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] No waiting events found dispatching network-vif-unplugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:23:03 compute-0 nova_compute[182935]: 2026-01-22 00:23:03.792 182939 DEBUG nova.compute.manager [req-ac97b3c0-c611-4d0e-857f-e57e39e300f5 req-1cb75bf3-6ce1-4ad3-b7a9-d1a0ef693ae3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-unplugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:23:03 compute-0 nova_compute[182935]: 2026-01-22 00:23:03.824 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:04 compute-0 nova_compute[182935]: 2026-01-22 00:23:04.652 182939 DEBUG nova.network.neutron [-] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:23:04 compute-0 nova_compute[182935]: 2026-01-22 00:23:04.690 182939 INFO nova.compute.manager [-] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Took 3.34 seconds to deallocate network for instance.
Jan 22 00:23:04 compute-0 podman[237533]: 2026-01-22 00:23:04.693546081 +0000 UTC m=+0.063141973 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true)
Jan 22 00:23:04 compute-0 podman[237532]: 2026-01-22 00:23:04.721767563 +0000 UTC m=+0.091363015 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 00:23:04 compute-0 nova_compute[182935]: 2026-01-22 00:23:04.765 182939 DEBUG oslo_concurrency.lockutils [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:04 compute-0 nova_compute[182935]: 2026-01-22 00:23:04.766 182939 DEBUG oslo_concurrency.lockutils [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:04 compute-0 nova_compute[182935]: 2026-01-22 00:23:04.816 182939 DEBUG nova.compute.provider_tree [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:23:04 compute-0 nova_compute[182935]: 2026-01-22 00:23:04.831 182939 DEBUG nova.scheduler.client.report [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:23:04 compute-0 nova_compute[182935]: 2026-01-22 00:23:04.857 182939 DEBUG oslo_concurrency.lockutils [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:04 compute-0 nova_compute[182935]: 2026-01-22 00:23:04.905 182939 INFO nova.scheduler.client.report [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Deleted allocations for instance c9bdc52e-a3e4-4ebb-999e-39628e000115
Jan 22 00:23:04 compute-0 nova_compute[182935]: 2026-01-22 00:23:04.980 182939 DEBUG oslo_concurrency.lockutils [None req-d732f02e-e11b-425a-8e75-5e49bb3cf45b 509108e93a554166b18e91e34ad8ed64 893df2b0226a4f55801c6f14b12f84d5 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:05 compute-0 nova_compute[182935]: 2026-01-22 00:23:05.926 182939 DEBUG nova.compute.manager [req-38e0d60f-3b48-42cb-9725-9c87ae72b9a3 req-8ac7cc75-7d3b-4c09-9513-b87167104082 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:05 compute-0 nova_compute[182935]: 2026-01-22 00:23:05.926 182939 DEBUG oslo_concurrency.lockutils [req-38e0d60f-3b48-42cb-9725-9c87ae72b9a3 req-8ac7cc75-7d3b-4c09-9513-b87167104082 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:05 compute-0 nova_compute[182935]: 2026-01-22 00:23:05.927 182939 DEBUG oslo_concurrency.lockutils [req-38e0d60f-3b48-42cb-9725-9c87ae72b9a3 req-8ac7cc75-7d3b-4c09-9513-b87167104082 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:05 compute-0 nova_compute[182935]: 2026-01-22 00:23:05.927 182939 DEBUG oslo_concurrency.lockutils [req-38e0d60f-3b48-42cb-9725-9c87ae72b9a3 req-8ac7cc75-7d3b-4c09-9513-b87167104082 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c9bdc52e-a3e4-4ebb-999e-39628e000115-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:05 compute-0 nova_compute[182935]: 2026-01-22 00:23:05.927 182939 DEBUG nova.compute.manager [req-38e0d60f-3b48-42cb-9725-9c87ae72b9a3 req-8ac7cc75-7d3b-4c09-9513-b87167104082 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] No waiting events found dispatching network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:23:05 compute-0 nova_compute[182935]: 2026-01-22 00:23:05.928 182939 WARNING nova.compute.manager [req-38e0d60f-3b48-42cb-9725-9c87ae72b9a3 req-8ac7cc75-7d3b-4c09-9513-b87167104082 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received unexpected event network-vif-plugged-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 for instance with vm_state deleted and task_state None.
Jan 22 00:23:06 compute-0 nova_compute[182935]: 2026-01-22 00:23:06.257 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:07 compute-0 nova_compute[182935]: 2026-01-22 00:23:07.369 182939 DEBUG nova.compute.manager [req-c4d90e6f-55f3-4e84-b3e8-4a0d73476521 req-192ab8e7-62ff-4bf1-a74f-dd8ab9089e27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Received event network-vif-deleted-46e22c15-8eb0-4e38-87a0-52fe7c81a3f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:08 compute-0 nova_compute[182935]: 2026-01-22 00:23:08.174 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:08 compute-0 nova_compute[182935]: 2026-01-22 00:23:08.407 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:10.891 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:d5:ab'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e45a905-ef69-47b8-b157-96af9472b990, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7c217807-262b-45e7-a62c-ca33e3f039ed) old=Port_Binding(mac=['fa:16:3e:95:d5:ab 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cc568949-a996-45b6-b055-c1780ec7685a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:23:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:10.893 104408 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7c217807-262b-45e7-a62c-ca33e3f039ed in datapath cc568949-a996-45b6-b055-c1780ec7685a updated
Jan 22 00:23:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:10.894 104408 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cc568949-a996-45b6-b055-c1780ec7685a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 22 00:23:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:10.895 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[aba782ac-0751-44bf-bcf1-697c967bae5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:11 compute-0 nova_compute[182935]: 2026-01-22 00:23:11.260 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:12 compute-0 sshd-session[237571]: Invalid user redis from 188.166.69.60 port 51352
Jan 22 00:23:12 compute-0 sshd-session[237571]: Connection closed by invalid user redis 188.166.69.60 port 51352 [preauth]
Jan 22 00:23:13 compute-0 nova_compute[182935]: 2026-01-22 00:23:13.175 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:16 compute-0 nova_compute[182935]: 2026-01-22 00:23:16.162 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041381.160319, c9bdc52e-a3e4-4ebb-999e-39628e000115 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:23:16 compute-0 nova_compute[182935]: 2026-01-22 00:23:16.162 182939 INFO nova.compute.manager [-] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] VM Stopped (Lifecycle Event)
Jan 22 00:23:16 compute-0 nova_compute[182935]: 2026-01-22 00:23:16.263 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:16 compute-0 nova_compute[182935]: 2026-01-22 00:23:16.305 182939 DEBUG nova.compute.manager [None req-bd714191-f2c0-4280-a85d-96c6dc77fb6b - - - - - -] [instance: c9bdc52e-a3e4-4ebb-999e-39628e000115] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:23:18 compute-0 nova_compute[182935]: 2026-01-22 00:23:18.189 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:20 compute-0 podman[237574]: 2026-01-22 00:23:20.709453943 +0000 UTC m=+0.077844123 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:23:20 compute-0 podman[237573]: 2026-01-22 00:23:20.724065681 +0000 UTC m=+0.092011791 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:23:20 compute-0 nova_compute[182935]: 2026-01-22 00:23:20.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:20 compute-0 nova_compute[182935]: 2026-01-22 00:23:20.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:23:20 compute-0 nova_compute[182935]: 2026-01-22 00:23:20.824 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:23:21 compute-0 nova_compute[182935]: 2026-01-22 00:23:21.265 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:21 compute-0 nova_compute[182935]: 2026-01-22 00:23:21.712 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "1f32d3c6-0780-4715-9f42-e713cec6363f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:21 compute-0 nova_compute[182935]: 2026-01-22 00:23:21.712 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:21 compute-0 nova_compute[182935]: 2026-01-22 00:23:21.732 182939 DEBUG nova.compute.manager [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:23:21 compute-0 nova_compute[182935]: 2026-01-22 00:23:21.896 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:21 compute-0 nova_compute[182935]: 2026-01-22 00:23:21.897 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:21 compute-0 nova_compute[182935]: 2026-01-22 00:23:21.908 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:23:21 compute-0 nova_compute[182935]: 2026-01-22 00:23:21.908 182939 INFO nova.compute.claims [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:23:22 compute-0 nova_compute[182935]: 2026-01-22 00:23:22.044 182939 DEBUG nova.compute.provider_tree [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:23:22 compute-0 nova_compute[182935]: 2026-01-22 00:23:22.062 182939 DEBUG nova.scheduler.client.report [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:23:22 compute-0 nova_compute[182935]: 2026-01-22 00:23:22.087 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:22 compute-0 nova_compute[182935]: 2026-01-22 00:23:22.088 182939 DEBUG nova.compute.manager [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:23:22 compute-0 nova_compute[182935]: 2026-01-22 00:23:22.636 182939 DEBUG nova.compute.manager [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:23:22 compute-0 nova_compute[182935]: 2026-01-22 00:23:22.637 182939 DEBUG nova.network.neutron [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:23:22 compute-0 podman[237622]: 2026-01-22 00:23:22.703211969 +0000 UTC m=+0.077600977 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:23:22 compute-0 nova_compute[182935]: 2026-01-22 00:23:22.985 182939 DEBUG nova.policy [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a60ce2b7b7ae47b484de12add551b287', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:23:23 compute-0 nova_compute[182935]: 2026-01-22 00:23:23.134 182939 INFO nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:23:23 compute-0 nova_compute[182935]: 2026-01-22 00:23:23.191 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:23 compute-0 nova_compute[182935]: 2026-01-22 00:23:23.479 182939 DEBUG nova.compute.manager [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:23:26 compute-0 nova_compute[182935]: 2026-01-22 00:23:26.269 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.585 182939 DEBUG nova.compute.manager [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.586 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.587 182939 INFO nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Creating image(s)
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.588 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "/var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.588 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.589 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "/var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.606 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.670 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.671 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.672 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.689 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.761 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.763 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.922 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk 1073741824" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.923 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.924 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.984 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.985 182939 DEBUG nova.virt.disk.api [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Checking if we can resize image /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:23:27 compute-0 nova_compute[182935]: 2026-01-22 00:23:27.985 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:28 compute-0 nova_compute[182935]: 2026-01-22 00:23:28.045 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:28 compute-0 nova_compute[182935]: 2026-01-22 00:23:28.046 182939 DEBUG nova.virt.disk.api [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Cannot resize image /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:23:28 compute-0 nova_compute[182935]: 2026-01-22 00:23:28.046 182939 DEBUG nova.objects.instance [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'migration_context' on Instance uuid 1f32d3c6-0780-4715-9f42-e713cec6363f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:23:28 compute-0 nova_compute[182935]: 2026-01-22 00:23:28.168 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:23:28 compute-0 nova_compute[182935]: 2026-01-22 00:23:28.169 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Ensure instance console log exists: /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:23:28 compute-0 nova_compute[182935]: 2026-01-22 00:23:28.170 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:28 compute-0 nova_compute[182935]: 2026-01-22 00:23:28.170 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:28 compute-0 nova_compute[182935]: 2026-01-22 00:23:28.170 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:28 compute-0 nova_compute[182935]: 2026-01-22 00:23:28.194 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:28 compute-0 nova_compute[182935]: 2026-01-22 00:23:28.442 182939 DEBUG nova.network.neutron [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Successfully created port: 71328ccb-58ba-48ba-a908-44d72201d853 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:23:28 compute-0 podman[237662]: 2026-01-22 00:23:28.710178399 +0000 UTC m=+0.079392930 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 00:23:31 compute-0 nova_compute[182935]: 2026-01-22 00:23:31.080 182939 DEBUG nova.network.neutron [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Successfully updated port: 71328ccb-58ba-48ba-a908-44d72201d853 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:23:31 compute-0 nova_compute[182935]: 2026-01-22 00:23:31.135 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:23:31 compute-0 nova_compute[182935]: 2026-01-22 00:23:31.136 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquired lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:23:31 compute-0 nova_compute[182935]: 2026-01-22 00:23:31.136 182939 DEBUG nova.network.neutron [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:23:31 compute-0 nova_compute[182935]: 2026-01-22 00:23:31.272 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:31 compute-0 nova_compute[182935]: 2026-01-22 00:23:31.362 182939 DEBUG nova.compute.manager [req-6fd8b0e4-1090-418b-8b41-7b3e13e34e30 req-1dc0f01d-8d8c-4c2a-85df-ee3ad42e1aa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Received event network-changed-71328ccb-58ba-48ba-a908-44d72201d853 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:31 compute-0 nova_compute[182935]: 2026-01-22 00:23:31.363 182939 DEBUG nova.compute.manager [req-6fd8b0e4-1090-418b-8b41-7b3e13e34e30 req-1dc0f01d-8d8c-4c2a-85df-ee3ad42e1aa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Refreshing instance network info cache due to event network-changed-71328ccb-58ba-48ba-a908-44d72201d853. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:23:31 compute-0 nova_compute[182935]: 2026-01-22 00:23:31.363 182939 DEBUG oslo_concurrency.lockutils [req-6fd8b0e4-1090-418b-8b41-7b3e13e34e30 req-1dc0f01d-8d8c-4c2a-85df-ee3ad42e1aa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:23:31 compute-0 nova_compute[182935]: 2026-01-22 00:23:31.554 182939 DEBUG nova.network.neutron [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.220 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.280 182939 DEBUG nova.network.neutron [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Updating instance_info_cache with network_info: [{"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.303 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Releasing lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.304 182939 DEBUG nova.compute.manager [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Instance network_info: |[{"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.305 182939 DEBUG oslo_concurrency.lockutils [req-6fd8b0e4-1090-418b-8b41-7b3e13e34e30 req-1dc0f01d-8d8c-4c2a-85df-ee3ad42e1aa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.305 182939 DEBUG nova.network.neutron [req-6fd8b0e4-1090-418b-8b41-7b3e13e34e30 req-1dc0f01d-8d8c-4c2a-85df-ee3ad42e1aa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Refreshing network info cache for port 71328ccb-58ba-48ba-a908-44d72201d853 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.310 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Start _get_guest_xml network_info=[{"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.315 182939 WARNING nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.324 182939 DEBUG nova.virt.libvirt.host [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.325 182939 DEBUG nova.virt.libvirt.host [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.328 182939 DEBUG nova.virt.libvirt.host [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.328 182939 DEBUG nova.virt.libvirt.host [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.329 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.330 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.330 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.330 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.330 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.331 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.331 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.331 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.331 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.332 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.332 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.332 182939 DEBUG nova.virt.hardware [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.336 182939 DEBUG nova.virt.libvirt.vif [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:23:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1958614551',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1958614551',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ge',id=158,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzNbeWQqM+R5mCLYdVsdPyQXYDqnkkhhC73mxN5fX0QRC+i5pxaSAc7LRsQKs9V1np8BzitSAx9O4U37xdH3m6MF7eYp2Ff07iBZVcoSIsB4CpGyP/xz08PAIvxm/KFgA==',key_name='tempest-TestSecurityGroupsBasicOps-135115301',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-o40j35ts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:23:24Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=1f32d3c6-0780-4715-9f42-e713cec6363f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.336 182939 DEBUG nova.network.os_vif_util [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.337 182939 DEBUG nova.network.os_vif_util [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:85:4c,bridge_name='br-int',has_traffic_filtering=True,id=71328ccb-58ba-48ba-a908-44d72201d853,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71328ccb-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.338 182939 DEBUG nova.objects.instance [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f32d3c6-0780-4715-9f42-e713cec6363f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.354 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:23:33 compute-0 nova_compute[182935]:   <uuid>1f32d3c6-0780-4715-9f42-e713cec6363f</uuid>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   <name>instance-0000009e</name>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1958614551</nova:name>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:23:33</nova:creationTime>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:23:33 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:23:33 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:23:33 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:23:33 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:23:33 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:23:33 compute-0 nova_compute[182935]:         <nova:user uuid="a60ce2b7b7ae47b484de12add551b287">tempest-TestSecurityGroupsBasicOps-1492736128-project-member</nova:user>
Jan 22 00:23:33 compute-0 nova_compute[182935]:         <nova:project uuid="02bcfc5f1f1044a3856e73a5938ff011">tempest-TestSecurityGroupsBasicOps-1492736128</nova:project>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:23:33 compute-0 nova_compute[182935]:         <nova:port uuid="71328ccb-58ba-48ba-a908-44d72201d853">
Jan 22 00:23:33 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <system>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <entry name="serial">1f32d3c6-0780-4715-9f42-e713cec6363f</entry>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <entry name="uuid">1f32d3c6-0780-4715-9f42-e713cec6363f</entry>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     </system>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   <os>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   </os>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   <features>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   </features>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk.config"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:1b:85:4c"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <target dev="tap71328ccb-58"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/console.log" append="off"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <video>
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     </video>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:23:33 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:23:33 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:23:33 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:23:33 compute-0 nova_compute[182935]: </domain>
Jan 22 00:23:33 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.355 182939 DEBUG nova.compute.manager [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Preparing to wait for external event network-vif-plugged-71328ccb-58ba-48ba-a908-44d72201d853 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.356 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.356 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.356 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.357 182939 DEBUG nova.virt.libvirt.vif [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:23:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1958614551',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1958614551',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ge',id=158,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzNbeWQqM+R5mCLYdVsdPyQXYDqnkkhhC73mxN5fX0QRC+i5pxaSAc7LRsQKs9V1np8BzitSAx9O4U37xdH3m6MF7eYp2Ff07iBZVcoSIsB4CpGyP/xz08PAIvxm/KFgA==',key_name='tempest-TestSecurityGroupsBasicOps-135115301',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-o40j35ts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:23:24Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=1f32d3c6-0780-4715-9f42-e713cec6363f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.357 182939 DEBUG nova.network.os_vif_util [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.358 182939 DEBUG nova.network.os_vif_util [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:85:4c,bridge_name='br-int',has_traffic_filtering=True,id=71328ccb-58ba-48ba-a908-44d72201d853,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71328ccb-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.358 182939 DEBUG os_vif [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:85:4c,bridge_name='br-int',has_traffic_filtering=True,id=71328ccb-58ba-48ba-a908-44d72201d853,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71328ccb-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.358 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.359 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.359 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.362 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.362 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71328ccb-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.362 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap71328ccb-58, col_values=(('external_ids', {'iface-id': '71328ccb-58ba-48ba-a908-44d72201d853', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:85:4c', 'vm-uuid': '1f32d3c6-0780-4715-9f42-e713cec6363f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:33 compute-0 NetworkManager[55139]: <info>  [1769041413.3652] manager: (tap71328ccb-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.364 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.366 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.371 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.372 182939 INFO os_vif [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:85:4c,bridge_name='br-int',has_traffic_filtering=True,id=71328ccb-58ba-48ba-a908-44d72201d853,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71328ccb-58')
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.650 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.651 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.652 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] No VIF found with MAC fa:16:3e:1b:85:4c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:23:33 compute-0 nova_compute[182935]: 2026-01-22 00:23:33.652 182939 INFO nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Using config drive
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.208 182939 INFO nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Creating config drive at /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk.config
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.212 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc0ytrnld execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.335 182939 DEBUG oslo_concurrency.processutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc0ytrnld" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:34 compute-0 kernel: tap71328ccb-58: entered promiscuous mode
Jan 22 00:23:34 compute-0 NetworkManager[55139]: <info>  [1769041414.4398] manager: (tap71328ccb-58): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Jan 22 00:23:34 compute-0 ovn_controller[95047]: 2026-01-22T00:23:34Z|00606|binding|INFO|Claiming lport 71328ccb-58ba-48ba-a908-44d72201d853 for this chassis.
Jan 22 00:23:34 compute-0 ovn_controller[95047]: 2026-01-22T00:23:34Z|00607|binding|INFO|71328ccb-58ba-48ba-a908-44d72201d853: Claiming fa:16:3e:1b:85:4c 10.100.0.6
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.442 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.449 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.454 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.460 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:34 compute-0 systemd-udevd[237700]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:23:34 compute-0 NetworkManager[55139]: <info>  [1769041414.4654] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/290)
Jan 22 00:23:34 compute-0 NetworkManager[55139]: <info>  [1769041414.4660] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.466 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:85:4c 10.100.0.6'], port_security=['fa:16:3e:1b:85:4c 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1f32d3c6-0780-4715-9f42-e713cec6363f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ac7ef113-70ce-4186-8d61-02a4d7406050', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f297238-9b07-4e59-b73b-faeb28c51c5f, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=71328ccb-58ba-48ba-a908-44d72201d853) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.467 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 71328ccb-58ba-48ba-a908-44d72201d853 in datapath 92f99623-b3a9-41d7-ab3e-5bc19b701c77 bound to our chassis
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.469 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 92f99623-b3a9-41d7-ab3e-5bc19b701c77
Jan 22 00:23:34 compute-0 systemd-machined[154182]: New machine qemu-80-instance-0000009e.
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.479 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4ca55a-c9dd-4bda-8c1e-ec4079cd622a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.480 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap92f99623-b1 in ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:23:34 compute-0 NetworkManager[55139]: <info>  [1769041414.4834] device (tap71328ccb-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:23:34 compute-0 NetworkManager[55139]: <info>  [1769041414.4838] device (tap71328ccb-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.483 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap92f99623-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.483 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[32b57fac-b6bd-48ca-a282-0edb7c23df54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.484 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ec2415d0-a052-4da2-bd8e-88abff6da3f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.499 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c76c0d-ea86-4bc2-8e15-bfea0608e22f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.532 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[11aa8594-20ac-4954-bd31-1b0399de5fcd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.563 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b0bf8f5d-3d36-49b8-8e01-1319c1591a96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 systemd-udevd[237704]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.582 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[aac27df8-cb8c-41df-ae1a-08b7c6d865f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 systemd[1]: Started Virtual Machine qemu-80-instance-0000009e.
Jan 22 00:23:34 compute-0 NetworkManager[55139]: <info>  [1769041414.5844] manager: (tap92f99623-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/292)
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.594 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.609 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:34 compute-0 ovn_controller[95047]: 2026-01-22T00:23:34Z|00608|binding|INFO|Setting lport 71328ccb-58ba-48ba-a908-44d72201d853 ovn-installed in OVS
Jan 22 00:23:34 compute-0 ovn_controller[95047]: 2026-01-22T00:23:34Z|00609|binding|INFO|Setting lport 71328ccb-58ba-48ba-a908-44d72201d853 up in Southbound
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.618 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.618 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[4396feec-13dd-4a46-a5e4-cc35a72a73ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.624 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[cd1af3f2-5a34-401b-9ab0-9ca89d984c3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 NetworkManager[55139]: <info>  [1769041414.6448] device (tap92f99623-b0): carrier: link connected
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.649 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[dadd846c-5248-47b0-8ca3-06df3852aa6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.668 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[04107ced-9231-448b-9f4f-075a793101e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92f99623-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:be:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591738, 'reachable_time': 33578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237733, 'error': None, 'target': 'ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.686 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b6148b-13a3-42f6-8780-3e7d5fb9c107]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:bee7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 591738, 'tstamp': 591738}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237734, 'error': None, 'target': 'ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.704 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ccc75f2e-82f5-42d0-80b4-8eeb540c235a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92f99623-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:be:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591738, 'reachable_time': 33578, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237736, 'error': None, 'target': 'ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.733 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7bfd9dcd-b4e9-4660-9b89-76bdcf928c32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.798 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f33cdaee-6aa5-478d-856f-42519471440c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.799 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92f99623-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.800 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.800 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92f99623-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:34 compute-0 NetworkManager[55139]: <info>  [1769041414.8027] manager: (tap92f99623-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/293)
Jan 22 00:23:34 compute-0 kernel: tap92f99623-b0: entered promiscuous mode
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.802 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.805 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.805 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap92f99623-b0, col_values=(('external_ids', {'iface-id': '66ff7839-fd4b-434d-8f68-86c957e9ef5e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.806 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:34 compute-0 ovn_controller[95047]: 2026-01-22T00:23:34Z|00610|binding|INFO|Releasing lport 66ff7839-fd4b-434d-8f68-86c957e9ef5e from this chassis (sb_readonly=0)
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.818 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.818 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/92f99623-b3a9-41d7-ab3e-5bc19b701c77.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/92f99623-b3a9-41d7-ab3e-5bc19b701c77.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.819 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fe253477-097e-4334-96eb-cf9c893698c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.820 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-92f99623-b3a9-41d7-ab3e-5bc19b701c77
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/92f99623-b3a9-41d7-ab3e-5bc19b701c77.pid.haproxy
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 92f99623-b3a9-41d7-ab3e-5bc19b701c77
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:23:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:34.822 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'env', 'PROCESS_TAG=haproxy-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/92f99623-b3a9-41d7-ab3e-5bc19b701c77.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.857 182939 DEBUG nova.network.neutron [req-6fd8b0e4-1090-418b-8b41-7b3e13e34e30 req-1dc0f01d-8d8c-4c2a-85df-ee3ad42e1aa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Updated VIF entry in instance network info cache for port 71328ccb-58ba-48ba-a908-44d72201d853. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.859 182939 DEBUG nova.network.neutron [req-6fd8b0e4-1090-418b-8b41-7b3e13e34e30 req-1dc0f01d-8d8c-4c2a-85df-ee3ad42e1aa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Updating instance_info_cache with network_info: [{"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:23:34 compute-0 nova_compute[182935]: 2026-01-22 00:23:34.975 182939 DEBUG oslo_concurrency.lockutils [req-6fd8b0e4-1090-418b-8b41-7b3e13e34e30 req-1dc0f01d-8d8c-4c2a-85df-ee3ad42e1aa1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:23:35 compute-0 nova_compute[182935]: 2026-01-22 00:23:35.056 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041415.055708, 1f32d3c6-0780-4715-9f42-e713cec6363f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:23:35 compute-0 nova_compute[182935]: 2026-01-22 00:23:35.057 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] VM Started (Lifecycle Event)
Jan 22 00:23:35 compute-0 podman[237773]: 2026-01-22 00:23:35.181211105 +0000 UTC m=+0.060540812 container create f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:23:35 compute-0 nova_compute[182935]: 2026-01-22 00:23:35.200 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:23:35 compute-0 nova_compute[182935]: 2026-01-22 00:23:35.208 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041415.0559328, 1f32d3c6-0780-4715-9f42-e713cec6363f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:23:35 compute-0 nova_compute[182935]: 2026-01-22 00:23:35.208 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] VM Paused (Lifecycle Event)
Jan 22 00:23:35 compute-0 systemd[1]: Started libpod-conmon-f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2.scope.
Jan 22 00:23:35 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:23:35 compute-0 podman[237773]: 2026-01-22 00:23:35.144344268 +0000 UTC m=+0.023674005 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:23:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/414f764c59da3f19752983271e6831e471fec702ec6651f1fcd828ef69379b25/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:23:35 compute-0 podman[237773]: 2026-01-22 00:23:35.256022695 +0000 UTC m=+0.135352402 container init f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 22 00:23:35 compute-0 podman[237773]: 2026-01-22 00:23:35.262052049 +0000 UTC m=+0.141381756 container start f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:23:35 compute-0 podman[237789]: 2026-01-22 00:23:35.276756228 +0000 UTC m=+0.057472938 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 00:23:35 compute-0 podman[237786]: 2026-01-22 00:23:35.281187393 +0000 UTC m=+0.065357635 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Jan 22 00:23:35 compute-0 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[237790]: [NOTICE]   (237824) : New worker (237832) forked
Jan 22 00:23:35 compute-0 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[237790]: [NOTICE]   (237824) : Loading success.
Jan 22 00:23:35 compute-0 nova_compute[182935]: 2026-01-22 00:23:35.628 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:23:35 compute-0 nova_compute[182935]: 2026-01-22 00:23:35.633 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:23:35 compute-0 nova_compute[182935]: 2026-01-22 00:23:35.703 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.252 182939 DEBUG nova.compute.manager [req-4b77130c-f738-4671-a98b-48cbbdd80b78 req-11bf9678-56cb-4568-b30f-1eaf1165c081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Received event network-vif-plugged-71328ccb-58ba-48ba-a908-44d72201d853 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.253 182939 DEBUG oslo_concurrency.lockutils [req-4b77130c-f738-4671-a98b-48cbbdd80b78 req-11bf9678-56cb-4568-b30f-1eaf1165c081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.253 182939 DEBUG oslo_concurrency.lockutils [req-4b77130c-f738-4671-a98b-48cbbdd80b78 req-11bf9678-56cb-4568-b30f-1eaf1165c081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.253 182939 DEBUG oslo_concurrency.lockutils [req-4b77130c-f738-4671-a98b-48cbbdd80b78 req-11bf9678-56cb-4568-b30f-1eaf1165c081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.253 182939 DEBUG nova.compute.manager [req-4b77130c-f738-4671-a98b-48cbbdd80b78 req-11bf9678-56cb-4568-b30f-1eaf1165c081 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Processing event network-vif-plugged-71328ccb-58ba-48ba-a908-44d72201d853 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.254 182939 DEBUG nova.compute.manager [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.259 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041417.2590544, 1f32d3c6-0780-4715-9f42-e713cec6363f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.259 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] VM Resumed (Lifecycle Event)
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.261 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.264 182939 INFO nova.virt.libvirt.driver [-] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Instance spawned successfully.
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.265 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.621 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.628 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.629 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.629 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.630 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.630 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.630 182939 DEBUG nova.virt.libvirt.driver [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.633 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:23:37 compute-0 nova_compute[182935]: 2026-01-22 00:23:37.681 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:23:38 compute-0 nova_compute[182935]: 2026-01-22 00:23:38.265 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:38 compute-0 nova_compute[182935]: 2026-01-22 00:23:38.279 182939 INFO nova.compute.manager [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Took 10.69 seconds to spawn the instance on the hypervisor.
Jan 22 00:23:38 compute-0 nova_compute[182935]: 2026-01-22 00:23:38.279 182939 DEBUG nova.compute.manager [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:23:38 compute-0 nova_compute[182935]: 2026-01-22 00:23:38.364 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:38 compute-0 nova_compute[182935]: 2026-01-22 00:23:38.784 182939 INFO nova.compute.manager [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Took 16.95 seconds to build instance.
Jan 22 00:23:38 compute-0 nova_compute[182935]: 2026-01-22 00:23:38.807 182939 DEBUG oslo_concurrency.lockutils [None req-37188a7b-6d07-41b2-9f8e-36e84e12cbb0 a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:39 compute-0 nova_compute[182935]: 2026-01-22 00:23:39.396 182939 DEBUG nova.compute.manager [req-e3b5741b-c437-4788-ae00-6c12dec20d0e req-85c41b65-91dd-4a6d-8686-e0b54ab2b8e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Received event network-vif-plugged-71328ccb-58ba-48ba-a908-44d72201d853 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:39 compute-0 nova_compute[182935]: 2026-01-22 00:23:39.396 182939 DEBUG oslo_concurrency.lockutils [req-e3b5741b-c437-4788-ae00-6c12dec20d0e req-85c41b65-91dd-4a6d-8686-e0b54ab2b8e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:39 compute-0 nova_compute[182935]: 2026-01-22 00:23:39.397 182939 DEBUG oslo_concurrency.lockutils [req-e3b5741b-c437-4788-ae00-6c12dec20d0e req-85c41b65-91dd-4a6d-8686-e0b54ab2b8e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:39 compute-0 nova_compute[182935]: 2026-01-22 00:23:39.397 182939 DEBUG oslo_concurrency.lockutils [req-e3b5741b-c437-4788-ae00-6c12dec20d0e req-85c41b65-91dd-4a6d-8686-e0b54ab2b8e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:39 compute-0 nova_compute[182935]: 2026-01-22 00:23:39.397 182939 DEBUG nova.compute.manager [req-e3b5741b-c437-4788-ae00-6c12dec20d0e req-85c41b65-91dd-4a6d-8686-e0b54ab2b8e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] No waiting events found dispatching network-vif-plugged-71328ccb-58ba-48ba-a908-44d72201d853 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:23:39 compute-0 nova_compute[182935]: 2026-01-22 00:23:39.397 182939 WARNING nova.compute.manager [req-e3b5741b-c437-4788-ae00-6c12dec20d0e req-85c41b65-91dd-4a6d-8686-e0b54ab2b8e8 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Received unexpected event network-vif-plugged-71328ccb-58ba-48ba-a908-44d72201d853 for instance with vm_state active and task_state None.
Jan 22 00:23:39 compute-0 nova_compute[182935]: 2026-01-22 00:23:39.910 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:43 compute-0 nova_compute[182935]: 2026-01-22 00:23:43.302 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:43 compute-0 nova_compute[182935]: 2026-01-22 00:23:43.366 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:45 compute-0 nova_compute[182935]: 2026-01-22 00:23:45.824 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.054 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.054 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.054 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.054 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.097 182939 DEBUG nova.compute.manager [req-57f34c48-24ee-422e-b81e-b7c03c41bd64 req-0574983e-b3c0-4d02-8063-0d5b2791f23e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Received event network-changed-71328ccb-58ba-48ba-a908-44d72201d853 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.097 182939 DEBUG nova.compute.manager [req-57f34c48-24ee-422e-b81e-b7c03c41bd64 req-0574983e-b3c0-4d02-8063-0d5b2791f23e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Refreshing instance network info cache due to event network-changed-71328ccb-58ba-48ba-a908-44d72201d853. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.098 182939 DEBUG oslo_concurrency.lockutils [req-57f34c48-24ee-422e-b81e-b7c03c41bd64 req-0574983e-b3c0-4d02-8063-0d5b2791f23e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.098 182939 DEBUG oslo_concurrency.lockutils [req-57f34c48-24ee-422e-b81e-b7c03c41bd64 req-0574983e-b3c0-4d02-8063-0d5b2791f23e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.099 182939 DEBUG nova.network.neutron [req-57f34c48-24ee-422e-b81e-b7c03c41bd64 req-0574983e-b3c0-4d02-8063-0d5b2791f23e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Refreshing network info cache for port 71328ccb-58ba-48ba-a908-44d72201d853 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.158 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.225 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.227 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.287 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.440 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.443 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5565MB free_disk=73.12641906738281GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.443 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.444 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.532 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 1f32d3c6-0780-4715-9f42-e713cec6363f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.532 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.533 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.589 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.614 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.645 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:23:46 compute-0 nova_compute[182935]: 2026-01-22 00:23:46.650 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:48 compute-0 nova_compute[182935]: 2026-01-22 00:23:48.175 182939 DEBUG nova.network.neutron [req-57f34c48-24ee-422e-b81e-b7c03c41bd64 req-0574983e-b3c0-4d02-8063-0d5b2791f23e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Updated VIF entry in instance network info cache for port 71328ccb-58ba-48ba-a908-44d72201d853. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:23:48 compute-0 nova_compute[182935]: 2026-01-22 00:23:48.176 182939 DEBUG nova.network.neutron [req-57f34c48-24ee-422e-b81e-b7c03c41bd64 req-0574983e-b3c0-4d02-8063-0d5b2791f23e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Updating instance_info_cache with network_info: [{"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:23:48 compute-0 nova_compute[182935]: 2026-01-22 00:23:48.203 182939 DEBUG oslo_concurrency.lockutils [req-57f34c48-24ee-422e-b81e-b7c03c41bd64 req-0574983e-b3c0-4d02-8063-0d5b2791f23e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:23:48 compute-0 nova_compute[182935]: 2026-01-22 00:23:48.240 182939 DEBUG nova.compute.manager [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Received event network-changed-71328ccb-58ba-48ba-a908-44d72201d853 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:48 compute-0 nova_compute[182935]: 2026-01-22 00:23:48.240 182939 DEBUG nova.compute.manager [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Refreshing instance network info cache due to event network-changed-71328ccb-58ba-48ba-a908-44d72201d853. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:23:48 compute-0 nova_compute[182935]: 2026-01-22 00:23:48.241 182939 DEBUG oslo_concurrency.lockutils [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:23:48 compute-0 nova_compute[182935]: 2026-01-22 00:23:48.241 182939 DEBUG oslo_concurrency.lockutils [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:23:48 compute-0 nova_compute[182935]: 2026-01-22 00:23:48.241 182939 DEBUG nova.network.neutron [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Refreshing network info cache for port 71328ccb-58ba-48ba-a908-44d72201d853 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:23:48 compute-0 nova_compute[182935]: 2026-01-22 00:23:48.335 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:48 compute-0 nova_compute[182935]: 2026-01-22 00:23:48.367 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:48 compute-0 nova_compute[182935]: 2026-01-22 00:23:48.873 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:49 compute-0 nova_compute[182935]: 2026-01-22 00:23:49.731 182939 DEBUG nova.network.neutron [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Updated VIF entry in instance network info cache for port 71328ccb-58ba-48ba-a908-44d72201d853. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:23:49 compute-0 nova_compute[182935]: 2026-01-22 00:23:49.732 182939 DEBUG nova.network.neutron [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Updating instance_info_cache with network_info: [{"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:23:49 compute-0 nova_compute[182935]: 2026-01-22 00:23:49.750 182939 DEBUG oslo_concurrency.lockutils [req-6bff11ab-48df-47b0-909e-0ac0e9f85b1c req-ccc02547-177d-46dd-8197-951212244605 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:23:50 compute-0 ovn_controller[95047]: 2026-01-22T00:23:50Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1b:85:4c 10.100.0.6
Jan 22 00:23:50 compute-0 ovn_controller[95047]: 2026-01-22T00:23:50Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1b:85:4c 10.100.0.6
Jan 22 00:23:51 compute-0 nova_compute[182935]: 2026-01-22 00:23:51.619 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:51 compute-0 nova_compute[182935]: 2026-01-22 00:23:51.620 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:23:51 compute-0 podman[237869]: 2026-01-22 00:23:51.718969605 +0000 UTC m=+0.080643391 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:23:51 compute-0 podman[237868]: 2026-01-22 00:23:51.724224679 +0000 UTC m=+0.086175471 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:23:51 compute-0 nova_compute[182935]: 2026-01-22 00:23:51.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:51 compute-0 nova_compute[182935]: 2026-01-22 00:23:51.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:23:51 compute-0 nova_compute[182935]: 2026-01-22 00:23:51.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:23:51 compute-0 nova_compute[182935]: 2026-01-22 00:23:51.980 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:23:51 compute-0 nova_compute[182935]: 2026-01-22 00:23:51.980 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:23:51 compute-0 nova_compute[182935]: 2026-01-22 00:23:51.981 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:23:51 compute-0 nova_compute[182935]: 2026-01-22 00:23:51.982 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1f32d3c6-0780-4715-9f42-e713cec6363f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:23:53 compute-0 nova_compute[182935]: 2026-01-22 00:23:53.380 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:53 compute-0 nova_compute[182935]: 2026-01-22 00:23:53.561 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Updating instance_info_cache with network_info: [{"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:23:53 compute-0 nova_compute[182935]: 2026-01-22 00:23:53.665 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-1f32d3c6-0780-4715-9f42-e713cec6363f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:23:53 compute-0 nova_compute[182935]: 2026-01-22 00:23:53.665 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:23:53 compute-0 nova_compute[182935]: 2026-01-22 00:23:53.666 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:53 compute-0 podman[237914]: 2026-01-22 00:23:53.71774214 +0000 UTC m=+0.080783044 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:23:54 compute-0 nova_compute[182935]: 2026-01-22 00:23:54.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:55 compute-0 sshd-session[237939]: Invalid user mongodb from 188.166.69.60 port 33222
Jan 22 00:23:55 compute-0 nova_compute[182935]: 2026-01-22 00:23:55.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:55 compute-0 sshd-session[237939]: Connection closed by invalid user mongodb 188.166.69.60 port 33222 [preauth]
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.289 182939 DEBUG oslo_concurrency.lockutils [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "1f32d3c6-0780-4715-9f42-e713cec6363f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.290 182939 DEBUG oslo_concurrency.lockutils [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.290 182939 DEBUG oslo_concurrency.lockutils [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.290 182939 DEBUG oslo_concurrency.lockutils [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.290 182939 DEBUG oslo_concurrency.lockutils [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.301 182939 INFO nova.compute.manager [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Terminating instance
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.310 182939 DEBUG nova.compute.manager [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:23:56 compute-0 kernel: tap71328ccb-58 (unregistering): left promiscuous mode
Jan 22 00:23:56 compute-0 NetworkManager[55139]: <info>  [1769041436.3333] device (tap71328ccb-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.342 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:56 compute-0 ovn_controller[95047]: 2026-01-22T00:23:56Z|00611|binding|INFO|Releasing lport 71328ccb-58ba-48ba-a908-44d72201d853 from this chassis (sb_readonly=0)
Jan 22 00:23:56 compute-0 ovn_controller[95047]: 2026-01-22T00:23:56Z|00612|binding|INFO|Setting lport 71328ccb-58ba-48ba-a908-44d72201d853 down in Southbound
Jan 22 00:23:56 compute-0 ovn_controller[95047]: 2026-01-22T00:23:56Z|00613|binding|INFO|Removing iface tap71328ccb-58 ovn-installed in OVS
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.355 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:56.357 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:85:4c 10.100.0.6', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1f32d3c6-0780-4715-9f42-e713cec6363f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02bcfc5f1f1044a3856e73a5938ff011', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f297238-9b07-4e59-b73b-faeb28c51c5f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=71328ccb-58ba-48ba-a908-44d72201d853) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:23:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:56.358 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 71328ccb-58ba-48ba-a908-44d72201d853 in datapath 92f99623-b3a9-41d7-ab3e-5bc19b701c77 unbound from our chassis
Jan 22 00:23:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:56.360 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 92f99623-b3a9-41d7-ab3e-5bc19b701c77, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:23:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:56.361 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0a28e7-d50d-45b0-8cd4-19e39ec7b1b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:56 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:56.362 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77 namespace which is not needed anymore
Jan 22 00:23:56 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d0000009e.scope: Deactivated successfully.
Jan 22 00:23:56 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d0000009e.scope: Consumed 13.166s CPU time.
Jan 22 00:23:56 compute-0 systemd-machined[154182]: Machine qemu-80-instance-0000009e terminated.
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.539 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.545 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.581 182939 INFO nova.virt.libvirt.driver [-] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Instance destroyed successfully.
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.582 182939 DEBUG nova.objects.instance [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lazy-loading 'resources' on Instance uuid 1f32d3c6-0780-4715-9f42-e713cec6363f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.607 182939 DEBUG nova.virt.libvirt.vif [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:23:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1958614551',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1492736128-gen-1-1958614551',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1492736128-ge',id=158,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFzNbeWQqM+R5mCLYdVsdPyQXYDqnkkhhC73mxN5fX0QRC+i5pxaSAc7LRsQKs9V1np8BzitSAx9O4U37xdH3m6MF7eYp2Ff07iBZVcoSIsB4CpGyP/xz08PAIvxm/KFgA==',key_name='tempest-TestSecurityGroupsBasicOps-135115301',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:23:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='02bcfc5f1f1044a3856e73a5938ff011',ramdisk_id='',reservation_id='r-o40j35ts',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1492736128',owner_user_name='tempest-TestSecurityGroupsBasicOps-1492736128-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:23:38Z,user_data=None,user_id='a60ce2b7b7ae47b484de12add551b287',uuid=1f32d3c6-0780-4715-9f42-e713cec6363f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.608 182939 DEBUG nova.network.os_vif_util [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converting VIF {"id": "71328ccb-58ba-48ba-a908-44d72201d853", "address": "fa:16:3e:1b:85:4c", "network": {"id": "92f99623-b3a9-41d7-ab3e-5bc19b701c77", "bridge": "br-int", "label": "tempest-network-smoke--863395071", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02bcfc5f1f1044a3856e73a5938ff011", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71328ccb-58", "ovs_interfaceid": "71328ccb-58ba-48ba-a908-44d72201d853", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.609 182939 DEBUG nova.network.os_vif_util [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1b:85:4c,bridge_name='br-int',has_traffic_filtering=True,id=71328ccb-58ba-48ba-a908-44d72201d853,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71328ccb-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.609 182939 DEBUG os_vif [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:85:4c,bridge_name='br-int',has_traffic_filtering=True,id=71328ccb-58ba-48ba-a908-44d72201d853,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71328ccb-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.611 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.611 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71328ccb-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.613 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.614 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.616 182939 INFO os_vif [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:85:4c,bridge_name='br-int',has_traffic_filtering=True,id=71328ccb-58ba-48ba-a908-44d72201d853,network=Network(92f99623-b3a9-41d7-ab3e-5bc19b701c77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71328ccb-58')
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.617 182939 INFO nova.virt.libvirt.driver [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Deleting instance files /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f_del
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.618 182939 INFO nova.virt.libvirt.driver [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Deletion of /var/lib/nova/instances/1f32d3c6-0780-4715-9f42-e713cec6363f_del complete
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.703 182939 DEBUG nova.compute.manager [req-61a1aab4-0fe6-46b3-a51e-140a9158bbd5 req-fa61acb0-e71f-42f9-ab75-b44b75fe1f65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Received event network-vif-unplugged-71328ccb-58ba-48ba-a908-44d72201d853 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.704 182939 DEBUG oslo_concurrency.lockutils [req-61a1aab4-0fe6-46b3-a51e-140a9158bbd5 req-fa61acb0-e71f-42f9-ab75-b44b75fe1f65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.704 182939 DEBUG oslo_concurrency.lockutils [req-61a1aab4-0fe6-46b3-a51e-140a9158bbd5 req-fa61acb0-e71f-42f9-ab75-b44b75fe1f65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.704 182939 DEBUG oslo_concurrency.lockutils [req-61a1aab4-0fe6-46b3-a51e-140a9158bbd5 req-fa61acb0-e71f-42f9-ab75-b44b75fe1f65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.704 182939 DEBUG nova.compute.manager [req-61a1aab4-0fe6-46b3-a51e-140a9158bbd5 req-fa61acb0-e71f-42f9-ab75-b44b75fe1f65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] No waiting events found dispatching network-vif-unplugged-71328ccb-58ba-48ba-a908-44d72201d853 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.704 182939 DEBUG nova.compute.manager [req-61a1aab4-0fe6-46b3-a51e-140a9158bbd5 req-fa61acb0-e71f-42f9-ab75-b44b75fe1f65 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Received event network-vif-unplugged-71328ccb-58ba-48ba-a908-44d72201d853 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.739 182939 INFO nova.compute.manager [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.740 182939 DEBUG oslo.service.loopingcall [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.740 182939 DEBUG nova.compute.manager [-] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:23:56 compute-0 nova_compute[182935]: 2026-01-22 00:23:56.740 182939 DEBUG nova.network.neutron [-] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:23:56 compute-0 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[237790]: [NOTICE]   (237824) : haproxy version is 2.8.14-c23fe91
Jan 22 00:23:56 compute-0 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[237790]: [NOTICE]   (237824) : path to executable is /usr/sbin/haproxy
Jan 22 00:23:56 compute-0 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[237790]: [WARNING]  (237824) : Exiting Master process...
Jan 22 00:23:56 compute-0 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[237790]: [WARNING]  (237824) : Exiting Master process...
Jan 22 00:23:56 compute-0 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[237790]: [ALERT]    (237824) : Current worker (237832) exited with code 143 (Terminated)
Jan 22 00:23:56 compute-0 neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77[237790]: [WARNING]  (237824) : All workers exited. Exiting... (0)
Jan 22 00:23:56 compute-0 systemd[1]: libpod-f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2.scope: Deactivated successfully.
Jan 22 00:23:56 compute-0 podman[237966]: 2026-01-22 00:23:56.827855983 +0000 UTC m=+0.383828526 container died f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:23:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2-userdata-shm.mount: Deactivated successfully.
Jan 22 00:23:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-414f764c59da3f19752983271e6831e471fec702ec6651f1fcd828ef69379b25-merged.mount: Deactivated successfully.
Jan 22 00:23:56 compute-0 podman[237966]: 2026-01-22 00:23:56.982675768 +0000 UTC m=+0.538648261 container cleanup f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:23:56 compute-0 systemd[1]: libpod-conmon-f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2.scope: Deactivated successfully.
Jan 22 00:23:57 compute-0 podman[238010]: 2026-01-22 00:23:57.151560747 +0000 UTC m=+0.142805789 container remove f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:23:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:57.157 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[eb488aff-153f-4cee-9adb-22da79d18126]: (4, ('Thu Jan 22 12:23:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77 (f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2)\nf943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2\nThu Jan 22 12:23:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77 (f943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2)\nf943799913615bc88c415e1daf0c3be9455a87ca66dd882f2083dbcefc0b37f2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:57.159 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c05df0e0-cf97-48ee-9b4c-2fde8dee16bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:57.160 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92f99623-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:23:57 compute-0 nova_compute[182935]: 2026-01-22 00:23:57.161 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:57 compute-0 kernel: tap92f99623-b0: left promiscuous mode
Jan 22 00:23:57 compute-0 nova_compute[182935]: 2026-01-22 00:23:57.174 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:57 compute-0 nova_compute[182935]: 2026-01-22 00:23:57.175 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:57.177 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[836495b1-3a3d-47f1-b0ea-5c6a1f1bcc1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:57.192 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7d8b2ac2-165b-4f18-8aa1-14bdc02d5e0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:57.193 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1564029a-163a-450b-b863-a042ae1a1bec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:57.210 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5961c503-7749-48b8-acc4-4ca3d256079f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 591729, 'reachable_time': 22675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238025, 'error': None, 'target': 'ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:57.213 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-92f99623-b3a9-41d7-ab3e-5bc19b701c77 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:23:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:57.213 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[e067de43-7f29-42ae-9f5c-90025eb8bcce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:23:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d92f99623\x2db3a9\x2d41d7\x2dab3e\x2d5bc19b701c77.mount: Deactivated successfully.
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.225 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:58.226 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:23:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:23:58.226 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.382 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.487 182939 DEBUG nova.network.neutron [-] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.509 182939 INFO nova.compute.manager [-] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Took 1.77 seconds to deallocate network for instance.
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.592 182939 DEBUG oslo_concurrency.lockutils [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.593 182939 DEBUG oslo_concurrency.lockutils [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.654 182939 DEBUG nova.compute.provider_tree [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.676 182939 DEBUG nova.scheduler.client.report [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.712 182939 DEBUG oslo_concurrency.lockutils [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.739 182939 INFO nova.scheduler.client.report [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Deleted allocations for instance 1f32d3c6-0780-4715-9f42-e713cec6363f
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.837 182939 DEBUG oslo_concurrency.lockutils [None req-3841cbc9-0194-4cd1-8dd0-444672f48d4c a60ce2b7b7ae47b484de12add551b287 02bcfc5f1f1044a3856e73a5938ff011 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.860 182939 DEBUG nova.compute.manager [req-4758b5bd-c493-46b2-b932-10361d0f6bdd req-f99b8e76-ff61-40e4-9d00-25e5dc65bab2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Received event network-vif-plugged-71328ccb-58ba-48ba-a908-44d72201d853 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.861 182939 DEBUG oslo_concurrency.lockutils [req-4758b5bd-c493-46b2-b932-10361d0f6bdd req-f99b8e76-ff61-40e4-9d00-25e5dc65bab2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.861 182939 DEBUG oslo_concurrency.lockutils [req-4758b5bd-c493-46b2-b932-10361d0f6bdd req-f99b8e76-ff61-40e4-9d00-25e5dc65bab2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.861 182939 DEBUG oslo_concurrency.lockutils [req-4758b5bd-c493-46b2-b932-10361d0f6bdd req-f99b8e76-ff61-40e4-9d00-25e5dc65bab2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "1f32d3c6-0780-4715-9f42-e713cec6363f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.861 182939 DEBUG nova.compute.manager [req-4758b5bd-c493-46b2-b932-10361d0f6bdd req-f99b8e76-ff61-40e4-9d00-25e5dc65bab2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] No waiting events found dispatching network-vif-plugged-71328ccb-58ba-48ba-a908-44d72201d853 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:23:58 compute-0 nova_compute[182935]: 2026-01-22 00:23:58.861 182939 WARNING nova.compute.manager [req-4758b5bd-c493-46b2-b932-10361d0f6bdd req-f99b8e76-ff61-40e4-9d00-25e5dc65bab2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Received unexpected event network-vif-plugged-71328ccb-58ba-48ba-a908-44d72201d853 for instance with vm_state deleted and task_state None.
Jan 22 00:23:59 compute-0 podman[238027]: 2026-01-22 00:23:59.677608501 +0000 UTC m=+0.050659717 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 00:23:59 compute-0 nova_compute[182935]: 2026-01-22 00:23:59.787 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:01.228 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:01 compute-0 nova_compute[182935]: 2026-01-22 00:24:01.435 182939 DEBUG nova.compute.manager [req-7363590c-3bff-4169-a758-609fe82b47ab req-c31bcc24-1ef3-4bff-9379-d2716f411052 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Received event network-vif-deleted-71328ccb-58ba-48ba-a908-44d72201d853 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:01 compute-0 nova_compute[182935]: 2026-01-22 00:24:01.616 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:03.220 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:03.221 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:03.221 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:03 compute-0 nova_compute[182935]: 2026-01-22 00:24:03.384 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:03 compute-0 nova_compute[182935]: 2026-01-22 00:24:03.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:03 compute-0 nova_compute[182935]: 2026-01-22 00:24:03.812 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.375 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Acquiring lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.376 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.398 182939 DEBUG nova.compute.manager [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.515 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.515 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.523 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.523 182939 INFO nova.compute.claims [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.651 182939 DEBUG nova.compute.provider_tree [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.670 182939 DEBUG nova.scheduler.client.report [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:24:05 compute-0 podman[238046]: 2026-01-22 00:24:05.686738803 +0000 UTC m=+0.059694762 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 00:24:05 compute-0 podman[238047]: 2026-01-22 00:24:05.692841528 +0000 UTC m=+0.061753611 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.692 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.693 182939 DEBUG nova.compute.manager [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.785 182939 DEBUG nova.compute.manager [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.785 182939 DEBUG nova.network.neutron [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.854 182939 INFO nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:24:05 compute-0 nova_compute[182935]: 2026-01-22 00:24:05.877 182939 DEBUG nova.compute.manager [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.029 182939 DEBUG nova.compute.manager [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.031 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.031 182939 INFO nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Creating image(s)
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.032 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Acquiring lock "/var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.032 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "/var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.033 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "/var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.045 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.067 182939 DEBUG nova.policy [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.100 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.101 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.101 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.117 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.173 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.174 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.216 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.218 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.218 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.277 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.278 182939 DEBUG nova.virt.disk.api [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Checking if we can resize image /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.279 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.339 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.340 182939 DEBUG nova.virt.disk.api [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Cannot resize image /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.341 182939 DEBUG nova.objects.instance [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lazy-loading 'migration_context' on Instance uuid 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.359 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.359 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Ensure instance console log exists: /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.360 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.360 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.360 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:06 compute-0 nova_compute[182935]: 2026-01-22 00:24:06.620 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:07 compute-0 nova_compute[182935]: 2026-01-22 00:24:07.616 182939 DEBUG nova.network.neutron [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Successfully created port: d5de0eb8-a6c6-4750-9935-ed1b8b196167 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:24:08 compute-0 nova_compute[182935]: 2026-01-22 00:24:08.388 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:09 compute-0 nova_compute[182935]: 2026-01-22 00:24:09.048 182939 DEBUG nova.network.neutron [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Successfully updated port: d5de0eb8-a6c6-4750-9935-ed1b8b196167 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:24:09 compute-0 nova_compute[182935]: 2026-01-22 00:24:09.076 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Acquiring lock "refresh_cache-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:24:09 compute-0 nova_compute[182935]: 2026-01-22 00:24:09.076 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Acquired lock "refresh_cache-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:24:09 compute-0 nova_compute[182935]: 2026-01-22 00:24:09.076 182939 DEBUG nova.network.neutron [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:24:09 compute-0 nova_compute[182935]: 2026-01-22 00:24:09.196 182939 DEBUG nova.compute.manager [req-1c121d66-3dc1-4df9-be74-bb2c6e7ec561 req-89e7ca32-7eaf-47cd-b148-6f4bd1fe79ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Received event network-changed-d5de0eb8-a6c6-4750-9935-ed1b8b196167 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:09 compute-0 nova_compute[182935]: 2026-01-22 00:24:09.196 182939 DEBUG nova.compute.manager [req-1c121d66-3dc1-4df9-be74-bb2c6e7ec561 req-89e7ca32-7eaf-47cd-b148-6f4bd1fe79ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Refreshing instance network info cache due to event network-changed-d5de0eb8-a6c6-4750-9935-ed1b8b196167. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:24:09 compute-0 nova_compute[182935]: 2026-01-22 00:24:09.197 182939 DEBUG oslo_concurrency.lockutils [req-1c121d66-3dc1-4df9-be74-bb2c6e7ec561 req-89e7ca32-7eaf-47cd-b148-6f4bd1fe79ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:24:09 compute-0 nova_compute[182935]: 2026-01-22 00:24:09.417 182939 DEBUG nova.network.neutron [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.087 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.266 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.271 182939 DEBUG nova.network.neutron [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Updating instance_info_cache with network_info: [{"id": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "address": "fa:16:3e:19:c8:90", "network": {"id": "d3347b7a-627d-46d6-af62-a195dbfcdbf5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-601813314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc7e8332e4644c4c80a30d240e3c9983", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5de0eb8-a6", "ovs_interfaceid": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.292 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Releasing lock "refresh_cache-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.292 182939 DEBUG nova.compute.manager [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Instance network_info: |[{"id": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "address": "fa:16:3e:19:c8:90", "network": {"id": "d3347b7a-627d-46d6-af62-a195dbfcdbf5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-601813314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc7e8332e4644c4c80a30d240e3c9983", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5de0eb8-a6", "ovs_interfaceid": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.293 182939 DEBUG oslo_concurrency.lockutils [req-1c121d66-3dc1-4df9-be74-bb2c6e7ec561 req-89e7ca32-7eaf-47cd-b148-6f4bd1fe79ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.293 182939 DEBUG nova.network.neutron [req-1c121d66-3dc1-4df9-be74-bb2c6e7ec561 req-89e7ca32-7eaf-47cd-b148-6f4bd1fe79ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Refreshing network info cache for port d5de0eb8-a6c6-4750-9935-ed1b8b196167 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.296 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Start _get_guest_xml network_info=[{"id": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "address": "fa:16:3e:19:c8:90", "network": {"id": "d3347b7a-627d-46d6-af62-a195dbfcdbf5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-601813314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc7e8332e4644c4c80a30d240e3c9983", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5de0eb8-a6", "ovs_interfaceid": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.301 182939 WARNING nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.306 182939 DEBUG nova.virt.libvirt.host [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.307 182939 DEBUG nova.virt.libvirt.host [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.313 182939 DEBUG nova.virt.libvirt.host [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.314 182939 DEBUG nova.virt.libvirt.host [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.315 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.315 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.315 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.316 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.316 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.316 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.316 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.316 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.317 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.317 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.317 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.317 182939 DEBUG nova.virt.hardware [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.321 182939 DEBUG nova.virt.libvirt.vif [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:24:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1766047182',display_name='tempest-TestServerBasicOps-server-1766047182',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1766047182',id=160,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOO5ldbG11Tw+Z74Y8iyk6UTZuDzjl8OgPy/pJ88DnRvZL39JX6C103aSCrYFMG7NIMN1+jx+s1HKlQ3ZvE3Rj9eLzl+CDcW+2nSZTWR3dXOdpDWfTn6CZdJVHGZDKpCBw==',key_name='tempest-TestServerBasicOps-99730790',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dc7e8332e4644c4c80a30d240e3c9983',ramdisk_id='',reservation_id='r-x2gjkal2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1427962289',owner_user_name='tempest-TestServerBasicOps-1427962289-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:24:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2d732de79224490e900df2fc0d2fcc37',uuid=13e4fbdd-2a25-4ff8-9b96-6af0c16c3484,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "address": "fa:16:3e:19:c8:90", "network": {"id": "d3347b7a-627d-46d6-af62-a195dbfcdbf5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-601813314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc7e8332e4644c4c80a30d240e3c9983", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5de0eb8-a6", "ovs_interfaceid": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.321 182939 DEBUG nova.network.os_vif_util [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Converting VIF {"id": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "address": "fa:16:3e:19:c8:90", "network": {"id": "d3347b7a-627d-46d6-af62-a195dbfcdbf5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-601813314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc7e8332e4644c4c80a30d240e3c9983", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5de0eb8-a6", "ovs_interfaceid": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.322 182939 DEBUG nova.network.os_vif_util [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:c8:90,bridge_name='br-int',has_traffic_filtering=True,id=d5de0eb8-a6c6-4750-9935-ed1b8b196167,network=Network(d3347b7a-627d-46d6-af62-a195dbfcdbf5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5de0eb8-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.323 182939 DEBUG nova.objects.instance [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lazy-loading 'pci_devices' on Instance uuid 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.340 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:24:11 compute-0 nova_compute[182935]:   <uuid>13e4fbdd-2a25-4ff8-9b96-6af0c16c3484</uuid>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   <name>instance-000000a0</name>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <nova:name>tempest-TestServerBasicOps-server-1766047182</nova:name>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:24:11</nova:creationTime>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:24:11 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:24:11 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:24:11 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:24:11 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:24:11 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:24:11 compute-0 nova_compute[182935]:         <nova:user uuid="2d732de79224490e900df2fc0d2fcc37">tempest-TestServerBasicOps-1427962289-project-member</nova:user>
Jan 22 00:24:11 compute-0 nova_compute[182935]:         <nova:project uuid="dc7e8332e4644c4c80a30d240e3c9983">tempest-TestServerBasicOps-1427962289</nova:project>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:24:11 compute-0 nova_compute[182935]:         <nova:port uuid="d5de0eb8-a6c6-4750-9935-ed1b8b196167">
Jan 22 00:24:11 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <system>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <entry name="serial">13e4fbdd-2a25-4ff8-9b96-6af0c16c3484</entry>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <entry name="uuid">13e4fbdd-2a25-4ff8-9b96-6af0c16c3484</entry>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     </system>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   <os>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   </os>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   <features>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   </features>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.config"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:19:c8:90"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <target dev="tapd5de0eb8-a6"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/console.log" append="off"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <video>
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     </video>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:24:11 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:24:11 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:24:11 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:24:11 compute-0 nova_compute[182935]: </domain>
Jan 22 00:24:11 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.341 182939 DEBUG nova.compute.manager [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Preparing to wait for external event network-vif-plugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.342 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Acquiring lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.343 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.343 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.344 182939 DEBUG nova.virt.libvirt.vif [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:24:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1766047182',display_name='tempest-TestServerBasicOps-server-1766047182',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1766047182',id=160,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOO5ldbG11Tw+Z74Y8iyk6UTZuDzjl8OgPy/pJ88DnRvZL39JX6C103aSCrYFMG7NIMN1+jx+s1HKlQ3ZvE3Rj9eLzl+CDcW+2nSZTWR3dXOdpDWfTn6CZdJVHGZDKpCBw==',key_name='tempest-TestServerBasicOps-99730790',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dc7e8332e4644c4c80a30d240e3c9983',ramdisk_id='',reservation_id='r-x2gjkal2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1427962289',owner_user_name='tempest-TestServerBasicOps-1427962289-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:24:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2d732de79224490e900df2fc0d2fcc37',uuid=13e4fbdd-2a25-4ff8-9b96-6af0c16c3484,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "address": "fa:16:3e:19:c8:90", "network": {"id": "d3347b7a-627d-46d6-af62-a195dbfcdbf5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-601813314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc7e8332e4644c4c80a30d240e3c9983", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5de0eb8-a6", "ovs_interfaceid": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.345 182939 DEBUG nova.network.os_vif_util [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Converting VIF {"id": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "address": "fa:16:3e:19:c8:90", "network": {"id": "d3347b7a-627d-46d6-af62-a195dbfcdbf5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-601813314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc7e8332e4644c4c80a30d240e3c9983", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5de0eb8-a6", "ovs_interfaceid": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.346 182939 DEBUG nova.network.os_vif_util [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:c8:90,bridge_name='br-int',has_traffic_filtering=True,id=d5de0eb8-a6c6-4750-9935-ed1b8b196167,network=Network(d3347b7a-627d-46d6-af62-a195dbfcdbf5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5de0eb8-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.347 182939 DEBUG os_vif [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:c8:90,bridge_name='br-int',has_traffic_filtering=True,id=d5de0eb8-a6c6-4750-9935-ed1b8b196167,network=Network(d3347b7a-627d-46d6-af62-a195dbfcdbf5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5de0eb8-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.348 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.349 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.349 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.354 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.355 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5de0eb8-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.355 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd5de0eb8-a6, col_values=(('external_ids', {'iface-id': 'd5de0eb8-a6c6-4750-9935-ed1b8b196167', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:c8:90', 'vm-uuid': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:11 compute-0 NetworkManager[55139]: <info>  [1769041451.3590] manager: (tapd5de0eb8-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/294)
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.358 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.362 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.364 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.365 182939 INFO os_vif [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:c8:90,bridge_name='br-int',has_traffic_filtering=True,id=d5de0eb8-a6c6-4750-9935-ed1b8b196167,network=Network(d3347b7a-627d-46d6-af62-a195dbfcdbf5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5de0eb8-a6')
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.579 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041436.577769, 1f32d3c6-0780-4715-9f42-e713cec6363f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.579 182939 INFO nova.compute.manager [-] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] VM Stopped (Lifecycle Event)
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.621 182939 DEBUG nova.compute.manager [None req-a83c5445-31d3-44a1-8176-de0ae3afe7bb - - - - - -] [instance: 1f32d3c6-0780-4715-9f42-e713cec6363f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.737 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.738 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.738 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] No VIF found with MAC fa:16:3e:19:c8:90, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:24:11 compute-0 nova_compute[182935]: 2026-01-22 00:24:11.739 182939 INFO nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Using config drive
Jan 22 00:24:12 compute-0 nova_compute[182935]: 2026-01-22 00:24:12.659 182939 INFO nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Creating config drive at /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.config
Jan 22 00:24:12 compute-0 nova_compute[182935]: 2026-01-22 00:24:12.673 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp09eycr8o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:24:12 compute-0 nova_compute[182935]: 2026-01-22 00:24:12.823 182939 DEBUG oslo_concurrency.processutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp09eycr8o" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:24:12 compute-0 kernel: tapd5de0eb8-a6: entered promiscuous mode
Jan 22 00:24:12 compute-0 NetworkManager[55139]: <info>  [1769041452.8870] manager: (tapd5de0eb8-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/295)
Jan 22 00:24:12 compute-0 ovn_controller[95047]: 2026-01-22T00:24:12Z|00614|binding|INFO|Claiming lport d5de0eb8-a6c6-4750-9935-ed1b8b196167 for this chassis.
Jan 22 00:24:12 compute-0 ovn_controller[95047]: 2026-01-22T00:24:12Z|00615|binding|INFO|d5de0eb8-a6c6-4750-9935-ed1b8b196167: Claiming fa:16:3e:19:c8:90 10.100.0.10
Jan 22 00:24:12 compute-0 nova_compute[182935]: 2026-01-22 00:24:12.888 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.901 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:c8:90 10.100.0.10'], port_security=['fa:16:3e:19:c8:90 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3347b7a-627d-46d6-af62-a195dbfcdbf5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'neutron:revision_number': '2', 'neutron:security_group_ids': '411cdcab-709f-4c30-b2d5-ce3bfa44052f 8e6d546b-76c1-4d38-bd4c-decc18831385', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f41957c3-31da-441c-adcb-bb0f8d2c8137, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=d5de0eb8-a6c6-4750-9935-ed1b8b196167) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.902 104408 INFO neutron.agent.ovn.metadata.agent [-] Port d5de0eb8-a6c6-4750-9935-ed1b8b196167 in datapath d3347b7a-627d-46d6-af62-a195dbfcdbf5 bound to our chassis
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.903 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d3347b7a-627d-46d6-af62-a195dbfcdbf5
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.915 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ebcbe25c-5dc3-4a44-bbdb-aaf43362c1ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:12 compute-0 systemd-udevd[238122]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.918 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd3347b7a-61 in ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.920 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd3347b7a-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.920 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[232cb855-fcba-4688-93e3-d87dcd16bfa1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.921 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b176e7-79f5-48c2-b0ef-162d088a3dba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:12 compute-0 NetworkManager[55139]: <info>  [1769041452.9299] device (tapd5de0eb8-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:24:12 compute-0 NetworkManager[55139]: <info>  [1769041452.9305] device (tapd5de0eb8-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.932 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[807f571d-fdc7-48cc-9514-485207539de0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:12 compute-0 systemd-machined[154182]: New machine qemu-81-instance-000000a0.
Jan 22 00:24:12 compute-0 nova_compute[182935]: 2026-01-22 00:24:12.941 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:12 compute-0 ovn_controller[95047]: 2026-01-22T00:24:12Z|00616|binding|INFO|Setting lport d5de0eb8-a6c6-4750-9935-ed1b8b196167 ovn-installed in OVS
Jan 22 00:24:12 compute-0 ovn_controller[95047]: 2026-01-22T00:24:12Z|00617|binding|INFO|Setting lport d5de0eb8-a6c6-4750-9935-ed1b8b196167 up in Southbound
Jan 22 00:24:12 compute-0 nova_compute[182935]: 2026-01-22 00:24:12.948 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:12 compute-0 systemd[1]: Started Virtual Machine qemu-81-instance-000000a0.
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.957 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e142b2-5945-47c5-a5bb-0a42a428c6de]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.989 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a1090b-d992-4cae-92cf-b92ee5dba6f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:12.995 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d44444d8-cf48-46e0-af3a-376ccfd9ceb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:12 compute-0 NetworkManager[55139]: <info>  [1769041452.9976] manager: (tapd3347b7a-60): new Veth device (/org/freedesktop/NetworkManager/Devices/296)
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.035 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[3cea1208-3679-4d0b-ba39-1aa22fb7976a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.039 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[cb5bfa51-90ca-4cb9-9250-3c95d0fa9dc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:13 compute-0 NetworkManager[55139]: <info>  [1769041453.0684] device (tapd3347b7a-60): carrier: link connected
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.073 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7468f68a-bbbd-4fe7-8742-163aadf230fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.089 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3fc0cb-f1bd-4a70-af1d-c9c0b37cec3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd3347b7a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:b4:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595580, 'reachable_time': 15749, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238155, 'error': None, 'target': 'ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.109 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[74c3f8ef-aeda-408b-8407-220ea02e46b0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:b4a3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 595580, 'tstamp': 595580}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238156, 'error': None, 'target': 'ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.124 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[594da29e-be16-4ccf-b758-2f9242feaa3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd3347b7a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:b4:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595580, 'reachable_time': 15749, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238157, 'error': None, 'target': 'ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.161 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[760e39ec-04c5-4d39-941d-9d3b5cc2546d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.219 182939 DEBUG nova.compute.manager [req-dd25540c-fe08-4630-9d47-c5fb9cc9dca3 req-bd598f93-7d2e-4da1-a5f7-028c328f5dcf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Received event network-vif-plugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.219 182939 DEBUG oslo_concurrency.lockutils [req-dd25540c-fe08-4630-9d47-c5fb9cc9dca3 req-bd598f93-7d2e-4da1-a5f7-028c328f5dcf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.220 182939 DEBUG oslo_concurrency.lockutils [req-dd25540c-fe08-4630-9d47-c5fb9cc9dca3 req-bd598f93-7d2e-4da1-a5f7-028c328f5dcf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.220 182939 DEBUG oslo_concurrency.lockutils [req-dd25540c-fe08-4630-9d47-c5fb9cc9dca3 req-bd598f93-7d2e-4da1-a5f7-028c328f5dcf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.220 182939 DEBUG nova.compute.manager [req-dd25540c-fe08-4630-9d47-c5fb9cc9dca3 req-bd598f93-7d2e-4da1-a5f7-028c328f5dcf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Processing event network-vif-plugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.222 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a1399ed9-70b4-4adb-879d-e97d56a2ff0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.223 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3347b7a-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.224 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.224 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3347b7a-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.225 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:13 compute-0 NetworkManager[55139]: <info>  [1769041453.2266] manager: (tapd3347b7a-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Jan 22 00:24:13 compute-0 kernel: tapd3347b7a-60: entered promiscuous mode
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.227 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.229 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd3347b7a-60, col_values=(('external_ids', {'iface-id': 'cf54a886-26fb-4f3c-955d-98b27232c204'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.230 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:13 compute-0 ovn_controller[95047]: 2026-01-22T00:24:13Z|00618|binding|INFO|Releasing lport cf54a886-26fb-4f3c-955d-98b27232c204 from this chassis (sb_readonly=0)
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.237 182939 DEBUG nova.network.neutron [req-1c121d66-3dc1-4df9-be74-bb2c6e7ec561 req-89e7ca32-7eaf-47cd-b148-6f4bd1fe79ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Updated VIF entry in instance network info cache for port d5de0eb8-a6c6-4750-9935-ed1b8b196167. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.237 182939 DEBUG nova.network.neutron [req-1c121d66-3dc1-4df9-be74-bb2c6e7ec561 req-89e7ca32-7eaf-47cd-b148-6f4bd1fe79ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Updating instance_info_cache with network_info: [{"id": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "address": "fa:16:3e:19:c8:90", "network": {"id": "d3347b7a-627d-46d6-af62-a195dbfcdbf5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-601813314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc7e8332e4644c4c80a30d240e3c9983", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5de0eb8-a6", "ovs_interfaceid": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.242 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.243 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d3347b7a-627d-46d6-af62-a195dbfcdbf5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d3347b7a-627d-46d6-af62-a195dbfcdbf5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.245 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[eb93c091-b33b-4ef2-8555-95725902df62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.246 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-d3347b7a-627d-46d6-af62-a195dbfcdbf5
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/d3347b7a-627d-46d6-af62-a195dbfcdbf5.pid.haproxy
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID d3347b7a-627d-46d6-af62-a195dbfcdbf5
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:24:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:13.248 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5', 'env', 'PROCESS_TAG=haproxy-d3347b7a-627d-46d6-af62-a195dbfcdbf5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d3347b7a-627d-46d6-af62-a195dbfcdbf5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.257 182939 DEBUG oslo_concurrency.lockutils [req-1c121d66-3dc1-4df9-be74-bb2c6e7ec561 req-89e7ca32-7eaf-47cd-b148-6f4bd1fe79ce 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.427 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.589 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041453.5893853, 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.590 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] VM Started (Lifecycle Event)
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.592 182939 DEBUG nova.compute.manager [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.596 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.600 182939 INFO nova.virt.libvirt.driver [-] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Instance spawned successfully.
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.600 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.614 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.620 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:24:13 compute-0 podman[238194]: 2026-01-22 00:24:13.6206594 +0000 UTC m=+0.071043972 container create e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.624 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.625 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.626 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.626 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.627 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.627 182939 DEBUG nova.virt.libvirt.driver [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.638 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.639 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041453.5904896, 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.639 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] VM Paused (Lifecycle Event)
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.663 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.667 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041453.595373, 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:24:13 compute-0 systemd[1]: Started libpod-conmon-e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108.scope.
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.668 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] VM Resumed (Lifecycle Event)
Jan 22 00:24:13 compute-0 podman[238194]: 2026-01-22 00:24:13.580778142 +0000 UTC m=+0.031162764 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.688 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:24:13 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.692 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:24:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2bd1647ff519bf23da683d9966f384f271066e732643895a03266f9df2f2d9c1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.714 182939 INFO nova.compute.manager [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Took 7.68 seconds to spawn the instance on the hypervisor.
Jan 22 00:24:13 compute-0 podman[238194]: 2026-01-22 00:24:13.715232921 +0000 UTC m=+0.165617513 container init e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.715 182939 DEBUG nova.compute.manager [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.719 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:24:13 compute-0 podman[238194]: 2026-01-22 00:24:13.722102875 +0000 UTC m=+0.172487447 container start e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 00:24:13 compute-0 neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5[238210]: [NOTICE]   (238214) : New worker (238216) forked
Jan 22 00:24:13 compute-0 neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5[238210]: [NOTICE]   (238214) : Loading success.
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.851 182939 INFO nova.compute.manager [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Took 8.37 seconds to build instance.
Jan 22 00:24:13 compute-0 nova_compute[182935]: 2026-01-22 00:24:13.879 182939 DEBUG oslo_concurrency.lockutils [None req-dab214f5-9cca-43dc-a521-6206500681c2 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.503s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:15 compute-0 nova_compute[182935]: 2026-01-22 00:24:15.328 182939 DEBUG nova.compute.manager [req-0e273d06-c4d8-4438-85c6-cf77e3e498bf req-087dd49a-ec0c-4816-83ab-e41c58c1b811 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Received event network-vif-plugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:15 compute-0 nova_compute[182935]: 2026-01-22 00:24:15.329 182939 DEBUG oslo_concurrency.lockutils [req-0e273d06-c4d8-4438-85c6-cf77e3e498bf req-087dd49a-ec0c-4816-83ab-e41c58c1b811 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:15 compute-0 nova_compute[182935]: 2026-01-22 00:24:15.330 182939 DEBUG oslo_concurrency.lockutils [req-0e273d06-c4d8-4438-85c6-cf77e3e498bf req-087dd49a-ec0c-4816-83ab-e41c58c1b811 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:15 compute-0 nova_compute[182935]: 2026-01-22 00:24:15.330 182939 DEBUG oslo_concurrency.lockutils [req-0e273d06-c4d8-4438-85c6-cf77e3e498bf req-087dd49a-ec0c-4816-83ab-e41c58c1b811 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:15 compute-0 nova_compute[182935]: 2026-01-22 00:24:15.330 182939 DEBUG nova.compute.manager [req-0e273d06-c4d8-4438-85c6-cf77e3e498bf req-087dd49a-ec0c-4816-83ab-e41c58c1b811 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] No waiting events found dispatching network-vif-plugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:24:15 compute-0 nova_compute[182935]: 2026-01-22 00:24:15.331 182939 WARNING nova.compute.manager [req-0e273d06-c4d8-4438-85c6-cf77e3e498bf req-087dd49a-ec0c-4816-83ab-e41c58c1b811 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Received unexpected event network-vif-plugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 for instance with vm_state active and task_state None.
Jan 22 00:24:16 compute-0 nova_compute[182935]: 2026-01-22 00:24:16.359 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:17 compute-0 NetworkManager[55139]: <info>  [1769041457.2836] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/298)
Jan 22 00:24:17 compute-0 NetworkManager[55139]: <info>  [1769041457.2851] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Jan 22 00:24:17 compute-0 nova_compute[182935]: 2026-01-22 00:24:17.282 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:17 compute-0 nova_compute[182935]: 2026-01-22 00:24:17.409 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:17 compute-0 ovn_controller[95047]: 2026-01-22T00:24:17Z|00619|binding|INFO|Releasing lport cf54a886-26fb-4f3c-955d-98b27232c204 from this chassis (sb_readonly=0)
Jan 22 00:24:17 compute-0 nova_compute[182935]: 2026-01-22 00:24:17.424 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:17 compute-0 nova_compute[182935]: 2026-01-22 00:24:17.680 182939 DEBUG nova.compute.manager [req-11f387b8-6c0b-470a-9173-4e2b99c54d98 req-efc280d9-7730-4f8b-bbca-4a79c24b5323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Received event network-changed-d5de0eb8-a6c6-4750-9935-ed1b8b196167 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:17 compute-0 nova_compute[182935]: 2026-01-22 00:24:17.681 182939 DEBUG nova.compute.manager [req-11f387b8-6c0b-470a-9173-4e2b99c54d98 req-efc280d9-7730-4f8b-bbca-4a79c24b5323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Refreshing instance network info cache due to event network-changed-d5de0eb8-a6c6-4750-9935-ed1b8b196167. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:24:17 compute-0 nova_compute[182935]: 2026-01-22 00:24:17.681 182939 DEBUG oslo_concurrency.lockutils [req-11f387b8-6c0b-470a-9173-4e2b99c54d98 req-efc280d9-7730-4f8b-bbca-4a79c24b5323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:24:17 compute-0 nova_compute[182935]: 2026-01-22 00:24:17.681 182939 DEBUG oslo_concurrency.lockutils [req-11f387b8-6c0b-470a-9173-4e2b99c54d98 req-efc280d9-7730-4f8b-bbca-4a79c24b5323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:24:17 compute-0 nova_compute[182935]: 2026-01-22 00:24:17.681 182939 DEBUG nova.network.neutron [req-11f387b8-6c0b-470a-9173-4e2b99c54d98 req-efc280d9-7730-4f8b-bbca-4a79c24b5323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Refreshing network info cache for port d5de0eb8-a6c6-4750-9935-ed1b8b196167 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:24:18 compute-0 nova_compute[182935]: 2026-01-22 00:24:18.430 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:20 compute-0 nova_compute[182935]: 2026-01-22 00:24:20.355 182939 DEBUG nova.network.neutron [req-11f387b8-6c0b-470a-9173-4e2b99c54d98 req-efc280d9-7730-4f8b-bbca-4a79c24b5323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Updated VIF entry in instance network info cache for port d5de0eb8-a6c6-4750-9935-ed1b8b196167. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:24:20 compute-0 nova_compute[182935]: 2026-01-22 00:24:20.355 182939 DEBUG nova.network.neutron [req-11f387b8-6c0b-470a-9173-4e2b99c54d98 req-efc280d9-7730-4f8b-bbca-4a79c24b5323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Updating instance_info_cache with network_info: [{"id": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "address": "fa:16:3e:19:c8:90", "network": {"id": "d3347b7a-627d-46d6-af62-a195dbfcdbf5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-601813314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc7e8332e4644c4c80a30d240e3c9983", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5de0eb8-a6", "ovs_interfaceid": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:24:20 compute-0 nova_compute[182935]: 2026-01-22 00:24:20.374 182939 DEBUG oslo_concurrency.lockutils [req-11f387b8-6c0b-470a-9173-4e2b99c54d98 req-efc280d9-7730-4f8b-bbca-4a79c24b5323 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:24:21 compute-0 nova_compute[182935]: 2026-01-22 00:24:21.362 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:22 compute-0 podman[238227]: 2026-01-22 00:24:22.680160174 +0000 UTC m=+0.050041901 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:24:22 compute-0 podman[238226]: 2026-01-22 00:24:22.733676628 +0000 UTC m=+0.104247782 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 22 00:24:22 compute-0 nova_compute[182935]: 2026-01-22 00:24:22.751 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.320 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'name': 'tempest-TestServerBasicOps-server-1766047182', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a0', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'hostId': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.321 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.322 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1766047182>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1766047182>]
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.325 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484 / tapd5de0eb8-a6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.325 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0985e4d-cc78-406c-8344-260c2fad16ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': 'instance-000000a0-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-tapd5de0eb8-a6', 'timestamp': '2026-01-22T00:24:23.322783', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'tapd5de0eb8-a6', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:c8:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd5de0eb8-a6'}, 'message_id': 'b3a65f2a-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.117064022, 'message_signature': '4bb8343ec15fb76e27ffc17a971086bdc3be82420f6b2deb926984c281974543'}]}, 'timestamp': '2026-01-22 00:24:23.326602', '_unique_id': 'fb7f07d5c551407da54ad40ccd24922c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.328 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.330 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.330 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.330 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1766047182>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1766047182>]
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.330 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.330 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89a54a40-1763-4514-91fc-b707484cd779', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': 'instance-000000a0-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-tapd5de0eb8-a6', 'timestamp': '2026-01-22T00:24:23.330728', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'tapd5de0eb8-a6', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:c8:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd5de0eb8-a6'}, 'message_id': 'b3a7149c-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.117064022, 'message_signature': 'b48eb3f9d9680c410b3984119b243b8ba592381549be819365b112b5eafc3b0c'}]}, 'timestamp': '2026-01-22 00:24:23.331072', '_unique_id': '444cd2220d32457cb053f958c54a9741'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.331 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.343 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.344 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd757f74-1bd6-43c5-8cf7-ac9465a2e2e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-vda', 'timestamp': '2026-01-22T00:24:23.332720', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b3a9225a-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.127019589, 'message_signature': '09e3564ebc01538a7d35ea142d284eae10875d8f1728288c0285d0d3d0bd704d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-sda', 'timestamp': '2026-01-22T00:24:23.332720', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3a92fac-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.127019589, 'message_signature': '6257a2075cc20e25fc762f2bf3de69df2a2c31e0bb0cc2ce010b9ab70fefa795'}]}, 'timestamp': '2026-01-22 00:24:23.344876', '_unique_id': '87a8289234cf4701839b395cf832c687'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.345 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.373 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.374 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5cae257-cb7c-4c02-8e93-77ecddff62c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-vda', 'timestamp': '2026-01-22T00:24:23.347111', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b3adb5fe-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': '4674f2376db962b65a44507d5ed81ded674ab2ef9426d76036074bbbf461fd26'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-sda', 'timestamp': '2026-01-22T00:24:23.347111', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3adc77e-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': 'ec16414f13813a1a6f7f72eb9eee9152029680f8be614232b8cb8f2f0ae8d779'}]}, 'timestamp': '2026-01-22 00:24:23.374991', '_unique_id': '233cb734c2ec4bb8a52f2dfc00de3c7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.376 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.377 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.377 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.377 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1766047182>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1766047182>]
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.377 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.378 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c20cc900-e9a6-4aa3-b95e-46dc793b3250', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': 'instance-000000a0-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-tapd5de0eb8-a6', 'timestamp': '2026-01-22T00:24:23.378007', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'tapd5de0eb8-a6', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:c8:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd5de0eb8-a6'}, 'message_id': 'b3ae4c44-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.117064022, 'message_signature': '019829ca4a3bd2ae766eef35e37e3cac5450518d62656c3d4f0f6eb4ca5ef055'}]}, 'timestamp': '2026-01-22 00:24:23.378393', '_unique_id': 'b9ae7676bbf146eab980add8a0330936'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.379 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.380 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.380 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1766047182>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1766047182>]
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.380 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.380 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.380 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '995fc134-525a-4562-bf96-b583f90859fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-vda', 'timestamp': '2026-01-22T00:24:23.380440', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b3aea860-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.127019589, 'message_signature': 'd6972d7d09ba6bc335cbe3a73834a47a2d92752b23a276ae84da73f7687b9165'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-sda', 'timestamp': '2026-01-22T00:24:23.380440', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3aeb26a-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.127019589, 'message_signature': '106c85272eabfaa29502076d159c5aa448b711a8ad346f4e71548e21e688d6c2'}]}, 'timestamp': '2026-01-22 00:24:23.380948', '_unique_id': 'dd72c74fec624843a48d7339109e6d9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.382 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.382 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e9bf7f1-9d21-42ad-b023-62d34070d823', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': 'instance-000000a0-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-tapd5de0eb8-a6', 'timestamp': '2026-01-22T00:24:23.382480', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'tapd5de0eb8-a6', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:c8:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd5de0eb8-a6'}, 'message_id': 'b3aefb58-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.117064022, 'message_signature': '3dca8dc1368e5a7be0c25f1a297c172e6ca15475f1b917c3eddf54f62e345538'}]}, 'timestamp': '2026-01-22 00:24:23.382896', '_unique_id': '7ccfaea6e0194a3b9db70dd6d56b9876'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.385 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.410 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/cpu volume: 9480000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f57d2ae0-8c1d-4f36-b273-e5f897efbbfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9480000000, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'timestamp': '2026-01-22T00:24:23.385199', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b3b33ad8-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.204212315, 'message_signature': '39b073074e0f3693728c484f8807a14785adaa8615a6f525bf32681fe1986771'}]}, 'timestamp': '2026-01-22 00:24:23.410785', '_unique_id': '525e5a8f88a240d9b79fda8441543723'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '31427c93-49e9-4ea0-8033-1036015a4cf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': 'instance-000000a0-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-tapd5de0eb8-a6', 'timestamp': '2026-01-22T00:24:23.413021', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'tapd5de0eb8-a6', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:c8:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd5de0eb8-a6'}, 'message_id': 'b3b3a266-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.117064022, 'message_signature': '66834a0e2d30f9f73fc1c5a491222994218615d252596d9357106c627f0da867'}]}, 'timestamp': '2026-01-22 00:24:23.413373', '_unique_id': 'b003f9e185ab4b93981c49709be62057'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.414 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.415 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.415 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '282e402a-019e-4649-98b5-f9a12eec01d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-vda', 'timestamp': '2026-01-22T00:24:23.414975', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b3b3ef50-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': '1bc7fbdf1125365d1fd94eabe91f8b3277e6c84f4fd9cc5c170dc105ca66e442'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-sda', 'timestamp': '2026-01-22T00:24:23.414975', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3b3f8ec-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': '7b421476013713324db70a4183c951d5532295821bcab1a3a6cce41e7b26245e'}]}, 'timestamp': '2026-01-22 00:24:23.415521', '_unique_id': '5afb315d6e3f4b8c983c4569f2f0cb62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.416 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'caf53cf1-4299-4ec5-99f6-8377d5bd1ccf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': 'instance-000000a0-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-tapd5de0eb8-a6', 'timestamp': '2026-01-22T00:24:23.417104', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'tapd5de0eb8-a6', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:c8:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd5de0eb8-a6'}, 'message_id': 'b3b441b2-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.117064022, 'message_signature': '750db74750455d85973586c7f1614191a510429946746366ebd436cf9202c4ab'}]}, 'timestamp': '2026-01-22 00:24:23.417402', '_unique_id': '7175013641e44f0fa2a966b9ec681117'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.418 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.418 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.419 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4623808d-0f7f-4e8b-b8a4-dd0072a1b0d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-vda', 'timestamp': '2026-01-22T00:24:23.418935', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b3b48b90-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': 'debe3e838761660896c6e470e938b67e2df52065d1c050c4996bf04189138b7f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 
'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-sda', 'timestamp': '2026-01-22T00:24:23.418935', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3b49518-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': 'a5e0dd8f978aed2ec0dea5c92c2e445a631f0894111320887359ad0a29ed86d0'}]}, 'timestamp': '2026-01-22 00:24:23.419588', '_unique_id': '75bb316077ff46529aec39b697034a41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.420 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.421 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.421 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7bd9cd6-bb51-4db3-a227-0cfc59fc40ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': 'instance-000000a0-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-tapd5de0eb8-a6', 'timestamp': '2026-01-22T00:24:23.421232', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'tapd5de0eb8-a6', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:c8:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd5de0eb8-a6'}, 'message_id': 'b3b4e3ec-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.117064022, 'message_signature': '3fa18ff79734348e75dc15a76f3c1e45abe61c3afbf421477e4fee72f9cd96d2'}]}, 'timestamp': '2026-01-22 00:24:23.421593', '_unique_id': 'fcab4700c38f4cebba4425a2b63b3a63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.423 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.424 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.424 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484: ceilometer.compute.pollsters.NoVolumeException
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.424 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.424 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c5a8405-d8cd-40f6-9e6a-de0f752923f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': 'instance-000000a0-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-tapd5de0eb8-a6', 'timestamp': '2026-01-22T00:24:23.424445', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'tapd5de0eb8-a6', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:c8:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd5de0eb8-a6'}, 'message_id': 'b3b55f70-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.117064022, 'message_signature': 'c1ebd69118312c092f9077dd9307badbcf10dacad7971e72f05121246e6221a7'}]}, 'timestamp': '2026-01-22 00:24:23.424715', '_unique_id': '871bd180ed5a4e878cb942be65e6f368'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.426 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.426 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/network.incoming.packets.drop volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5aa0307a-8636-4844-84e6-a84db115ba46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': 'instance-000000a0-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-tapd5de0eb8-a6', 'timestamp': '2026-01-22T00:24:23.426387', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'tapd5de0eb8-a6', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:c8:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd5de0eb8-a6'}, 'message_id': 'b3b5aba6-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.117064022, 'message_signature': 'db0d88654806104a7e6524bbed10117af4a5168e8f88fe6f225719ed34ad509a'}]}, 'timestamp': '2026-01-22 00:24:23.426706', '_unique_id': '72375371ee144f09b61ef31db3b6a60d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.427 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.428 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.428 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.428 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4133c636-2f5f-48e5-a07f-4f5f24e2b6a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-vda', 'timestamp': '2026-01-22T00:24:23.428385', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b3b5fb10-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.127019589, 'message_signature': 'd764b55725dd1a019d78698e962c7531187b54b567433859c4cd4fd9c87484c7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-sda', 'timestamp': '2026-01-22T00:24:23.428385', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3b6083a-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.127019589, 'message_signature': '6a617b59b63fb74c6f7166dbbc8138d9ea13536bb78de9f657d0549e20868d1d'}]}, 'timestamp': '2026-01-22 00:24:23.429057', '_unique_id': '5a5c2abbf4d54409b0ed61b87f3f358c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.429 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.430 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.430 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3786a4b2-2f8d-4b72-b084-88a212b6028f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-vda', 'timestamp': '2026-01-22T00:24:23.430721', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b3b65682-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': '3407a4a4bdaec47efce94a8cde0e810291c87adb0c119db683056c3ad5ffd2b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-sda', 'timestamp': '2026-01-22T00:24:23.430721', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3b6600a-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': '8289f375d2fa1412cf665aa6b7af6408357a9dc2a1ec9023522e17bb8ce2d413'}]}, 'timestamp': '2026-01-22 00:24:23.431265', '_unique_id': '31dd18c01af24e80b7c7e01645d74fe4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.431 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.432 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.432 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db482469-c0fa-41ee-aa77-1582f079d8ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-vda', 'timestamp': '2026-01-22T00:24:23.432868', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b3b6a8f8-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': '690b56a264e248dedea14235c81c7a483156f84888fef0b65e9c37cb247a27a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 
'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-sda', 'timestamp': '2026-01-22T00:24:23.432868', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3b6b280-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': '414af92934eb0a35b5ef303c493f126d2bebf1227b31993e01a35cd23d8114a1'}]}, 'timestamp': '2026-01-22 00:24:23.433378', '_unique_id': 'c2c30c812db6431d9afa6069491763fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.433 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.435 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.435 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.read.latency volume: 77590717 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.436 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/disk.device.read.latency volume: 532243 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b034c6bf-d0ae-4b70-97b4-06a49b2070c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 77590717, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-vda', 'timestamp': '2026-01-22T00:24:23.435721', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b3b71ba8-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': 'f9f7393b454a8ed00a12d68c736e8cdaaae9dc9531cf2510eece7291996380d1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 532243, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 
'resource_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-sda', 'timestamp': '2026-01-22T00:24:23.435721', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'instance-000000a0', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3b728be-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.14139247, 'message_signature': '5de1e2d9c87579cc876363c93ac5dc00831340361603eb8de59e56bea815579c'}]}, 'timestamp': '2026-01-22 00:24:23.436430', '_unique_id': '71af21dc04d8403baf52c6983797bb48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.437 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.438 12 DEBUG ceilometer.compute.pollsters [-] 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd76a86c-5e6c-4e4e-882c-e3bc06addbd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '2d732de79224490e900df2fc0d2fcc37', 'user_name': None, 'project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'project_name': None, 'resource_id': 'instance-000000a0-13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-tapd5de0eb8-a6', 'timestamp': '2026-01-22T00:24:23.438026', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1766047182', 'name': 'tapd5de0eb8-a6', 'instance_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'instance_type': 'm1.nano', 'host': '57924f8c100960a3f87a53e88258690ae634ee123423ee447f9aebed', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:c8:90', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd5de0eb8-a6'}, 'message_id': 'b3b770f8-f728-11f0-9743-fa163e6b0dfb', 'monotonic_time': 5966.117064022, 'message_signature': 'c71afa7e158948c62b7bb5181af8dbcf4782678f07bf371fea50c60776e323a0'}]}, 'timestamp': '2026-01-22 00:24:23.438267', '_unique_id': 'dd86671d0862444a87dc53c9cb9e01b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:24:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:24:23.439 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:24:23 compute-0 nova_compute[182935]: 2026-01-22 00:24:23.467 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:24 compute-0 podman[238274]: 2026-01-22 00:24:24.67468866 +0000 UTC m=+0.046738973 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:24:25 compute-0 ovn_controller[95047]: 2026-01-22T00:24:25Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:c8:90 10.100.0.10
Jan 22 00:24:25 compute-0 ovn_controller[95047]: 2026-01-22T00:24:25Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:c8:90 10.100.0.10
Jan 22 00:24:26 compute-0 nova_compute[182935]: 2026-01-22 00:24:26.365 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:28 compute-0 nova_compute[182935]: 2026-01-22 00:24:28.470 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:30 compute-0 podman[238313]: 2026-01-22 00:24:30.673551229 +0000 UTC m=+0.049777875 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 22 00:24:31 compute-0 nova_compute[182935]: 2026-01-22 00:24:31.369 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:33 compute-0 nova_compute[182935]: 2026-01-22 00:24:33.472 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:36 compute-0 nova_compute[182935]: 2026-01-22 00:24:36.372 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:36 compute-0 podman[238334]: 2026-01-22 00:24:36.689575766 +0000 UTC m=+0.060043530 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:24:36 compute-0 podman[238333]: 2026-01-22 00:24:36.702539064 +0000 UTC m=+0.076544732 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, architecture=x86_64)
Jan 22 00:24:38 compute-0 nova_compute[182935]: 2026-01-22 00:24:38.534 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:39 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:39.710 104736 DEBUG eventlet.wsgi.server [-] (104736) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 00:24:39 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:39.712 104736 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0
Jan 22 00:24:39 compute-0 ovn_metadata_agent[104403]: Accept: */*
Jan 22 00:24:39 compute-0 ovn_metadata_agent[104403]: Connection: close
Jan 22 00:24:39 compute-0 ovn_metadata_agent[104403]: Content-Type: text/plain
Jan 22 00:24:39 compute-0 ovn_metadata_agent[104403]: Host: 169.254.169.254
Jan 22 00:24:39 compute-0 ovn_metadata_agent[104403]: User-Agent: curl/7.84.0
Jan 22 00:24:39 compute-0 ovn_metadata_agent[104403]: X-Forwarded-For: 10.100.0.10
Jan 22 00:24:39 compute-0 ovn_metadata_agent[104403]: X-Ovn-Network-Id: d3347b7a-627d-46d6-af62-a195dbfcdbf5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 00:24:39 compute-0 sshd-session[238373]: Invalid user mongodb from 188.166.69.60 port 43096
Jan 22 00:24:39 compute-0 sshd-session[238373]: Connection closed by invalid user mongodb 188.166.69.60 port 43096 [preauth]
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:40.757 104736 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:40.759 104736 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 1.0465763
Jan 22 00:24:40 compute-0 haproxy-metadata-proxy-d3347b7a-627d-46d6-af62-a195dbfcdbf5[238216]: 10.100.0.10:41664 [22/Jan/2026:00:24:39.708] listener listener/metadata 0/0/0/1050/1050 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:40.878 104736 DEBUG eventlet.wsgi.server [-] (104736) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:40.879 104736 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: Accept: */*
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: Connection: close
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: Content-Length: 100
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: Content-Type: application/x-www-form-urlencoded
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: Host: 169.254.169.254
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: User-Agent: curl/7.84.0
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: X-Forwarded-For: 10.100.0.10
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: X-Ovn-Network-Id: d3347b7a-627d-46d6-af62-a195dbfcdbf5
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:24:40 compute-0 ovn_metadata_agent[104403]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Jan 22 00:24:41 compute-0 nova_compute[182935]: 2026-01-22 00:24:41.375 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:41.853 104736 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Jan 22 00:24:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:41.853 104736 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.9739652
Jan 22 00:24:41 compute-0 haproxy-metadata-proxy-d3347b7a-627d-46d6-af62-a195dbfcdbf5[238216]: 10.100.0.10:41676 [22/Jan/2026:00:24:40.878] listener listener/metadata 0/0/0/975/975 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Jan 22 00:24:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:42.370 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:24:42 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:42.371 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:24:42 compute-0 nova_compute[182935]: 2026-01-22 00:24:42.411 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:43 compute-0 nova_compute[182935]: 2026-01-22 00:24:43.584 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:44 compute-0 nova_compute[182935]: 2026-01-22 00:24:44.738 182939 DEBUG oslo_concurrency.lockutils [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Acquiring lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:44 compute-0 nova_compute[182935]: 2026-01-22 00:24:44.739 182939 DEBUG oslo_concurrency.lockutils [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:44 compute-0 nova_compute[182935]: 2026-01-22 00:24:44.739 182939 DEBUG oslo_concurrency.lockutils [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Acquiring lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:44 compute-0 nova_compute[182935]: 2026-01-22 00:24:44.739 182939 DEBUG oslo_concurrency.lockutils [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:44 compute-0 nova_compute[182935]: 2026-01-22 00:24:44.740 182939 DEBUG oslo_concurrency.lockutils [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:44 compute-0 nova_compute[182935]: 2026-01-22 00:24:44.881 182939 INFO nova.compute.manager [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Terminating instance
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.073 182939 DEBUG nova.compute.manager [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:24:45 compute-0 kernel: tapd5de0eb8-a6 (unregistering): left promiscuous mode
Jan 22 00:24:45 compute-0 NetworkManager[55139]: <info>  [1769041485.1018] device (tapd5de0eb8-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.107 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:45 compute-0 ovn_controller[95047]: 2026-01-22T00:24:45Z|00620|binding|INFO|Releasing lport d5de0eb8-a6c6-4750-9935-ed1b8b196167 from this chassis (sb_readonly=0)
Jan 22 00:24:45 compute-0 ovn_controller[95047]: 2026-01-22T00:24:45Z|00621|binding|INFO|Setting lport d5de0eb8-a6c6-4750-9935-ed1b8b196167 down in Southbound
Jan 22 00:24:45 compute-0 ovn_controller[95047]: 2026-01-22T00:24:45Z|00622|binding|INFO|Removing iface tapd5de0eb8-a6 ovn-installed in OVS
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.109 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.128 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:45 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a0.scope: Deactivated successfully.
Jan 22 00:24:45 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a0.scope: Consumed 14.780s CPU time.
Jan 22 00:24:45 compute-0 systemd-machined[154182]: Machine qemu-81-instance-000000a0 terminated.
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.286 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:c8:90 10.100.0.10'], port_security=['fa:16:3e:19:c8:90 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '13e4fbdd-2a25-4ff8-9b96-6af0c16c3484', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d3347b7a-627d-46d6-af62-a195dbfcdbf5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc7e8332e4644c4c80a30d240e3c9983', 'neutron:revision_number': '4', 'neutron:security_group_ids': '411cdcab-709f-4c30-b2d5-ce3bfa44052f 8e6d546b-76c1-4d38-bd4c-decc18831385', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f41957c3-31da-441c-adcb-bb0f8d2c8137, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=d5de0eb8-a6c6-4750-9935-ed1b8b196167) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.287 104408 INFO neutron.agent.ovn.metadata.agent [-] Port d5de0eb8-a6c6-4750-9935-ed1b8b196167 in datapath d3347b7a-627d-46d6-af62-a195dbfcdbf5 unbound from our chassis
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.288 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d3347b7a-627d-46d6-af62-a195dbfcdbf5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.290 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[594396c8-d5ba-40f4-9f36-4b0251ccea1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.290 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5 namespace which is not needed anymore
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.338 182939 INFO nova.virt.libvirt.driver [-] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Instance destroyed successfully.
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.339 182939 DEBUG nova.objects.instance [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lazy-loading 'resources' on Instance uuid 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.372 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.373 182939 DEBUG nova.virt.libvirt.vif [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:24:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1766047182',display_name='tempest-TestServerBasicOps-server-1766047182',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1766047182',id=160,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOO5ldbG11Tw+Z74Y8iyk6UTZuDzjl8OgPy/pJ88DnRvZL39JX6C103aSCrYFMG7NIMN1+jx+s1HKlQ3ZvE3Rj9eLzl+CDcW+2nSZTWR3dXOdpDWfTn6CZdJVHGZDKpCBw==',key_name='tempest-TestServerBasicOps-99730790',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:24:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dc7e8332e4644c4c80a30d240e3c9983',ramdisk_id='',reservation_id='r-x2gjkal2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-1427962289',owner_user_name='tempest-TestServerBasicOps-1427962289-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:24:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='2d732de79224490e900df2fc0d2fcc37',uuid=13e4fbdd-2a25-4ff8-9b96-6af0c16c3484,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "address": 
"fa:16:3e:19:c8:90", "network": {"id": "d3347b7a-627d-46d6-af62-a195dbfcdbf5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-601813314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc7e8332e4644c4c80a30d240e3c9983", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5de0eb8-a6", "ovs_interfaceid": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.374 182939 DEBUG nova.network.os_vif_util [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Converting VIF {"id": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "address": "fa:16:3e:19:c8:90", "network": {"id": "d3347b7a-627d-46d6-af62-a195dbfcdbf5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-601813314-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc7e8332e4644c4c80a30d240e3c9983", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5de0eb8-a6", "ovs_interfaceid": "d5de0eb8-a6c6-4750-9935-ed1b8b196167", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.375 182939 DEBUG nova.network.os_vif_util [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:19:c8:90,bridge_name='br-int',has_traffic_filtering=True,id=d5de0eb8-a6c6-4750-9935-ed1b8b196167,network=Network(d3347b7a-627d-46d6-af62-a195dbfcdbf5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5de0eb8-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.375 182939 DEBUG os_vif [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:c8:90,bridge_name='br-int',has_traffic_filtering=True,id=d5de0eb8-a6c6-4750-9935-ed1b8b196167,network=Network(d3347b7a-627d-46d6-af62-a195dbfcdbf5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5de0eb8-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.377 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.378 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5de0eb8-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.381 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.384 182939 INFO os_vif [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:c8:90,bridge_name='br-int',has_traffic_filtering=True,id=d5de0eb8-a6c6-4750-9935-ed1b8b196167,network=Network(d3347b7a-627d-46d6-af62-a195dbfcdbf5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5de0eb8-a6')
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.385 182939 INFO nova.virt.libvirt.driver [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Deleting instance files /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484_del
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.386 182939 INFO nova.virt.libvirt.driver [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Deletion of /var/lib/nova/instances/13e4fbdd-2a25-4ff8-9b96-6af0c16c3484_del complete
Jan 22 00:24:45 compute-0 neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5[238210]: [NOTICE]   (238214) : haproxy version is 2.8.14-c23fe91
Jan 22 00:24:45 compute-0 neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5[238210]: [NOTICE]   (238214) : path to executable is /usr/sbin/haproxy
Jan 22 00:24:45 compute-0 neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5[238210]: [WARNING]  (238214) : Exiting Master process...
Jan 22 00:24:45 compute-0 neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5[238210]: [ALERT]    (238214) : Current worker (238216) exited with code 143 (Terminated)
Jan 22 00:24:45 compute-0 neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5[238210]: [WARNING]  (238214) : All workers exited. Exiting... (0)
Jan 22 00:24:45 compute-0 systemd[1]: libpod-e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108.scope: Deactivated successfully.
Jan 22 00:24:45 compute-0 podman[238416]: 2026-01-22 00:24:45.431101524 +0000 UTC m=+0.046232262 container died e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:24:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108-userdata-shm.mount: Deactivated successfully.
Jan 22 00:24:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-2bd1647ff519bf23da683d9966f384f271066e732643895a03266f9df2f2d9c1-merged.mount: Deactivated successfully.
Jan 22 00:24:45 compute-0 podman[238416]: 2026-01-22 00:24:45.471841963 +0000 UTC m=+0.086972721 container cleanup e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 00:24:45 compute-0 systemd[1]: libpod-conmon-e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108.scope: Deactivated successfully.
Jan 22 00:24:45 compute-0 podman[238445]: 2026-01-22 00:24:45.531022911 +0000 UTC m=+0.038387545 container remove e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.539 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6f2c93fb-2e66-4295-ba5f-7c06ead4827b]: (4, ('Thu Jan 22 12:24:45 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5 (e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108)\ne8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108\nThu Jan 22 12:24:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5 (e8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108)\ne8467b0de94a41fe11f141bb36155f614ec628d0d14e897e00119a995f686108\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.541 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[adb97b18-f48b-492d-96e3-6be168bb352a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.542 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3347b7a-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:24:45 compute-0 kernel: tapd3347b7a-60: left promiscuous mode
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.545 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.550 182939 INFO nova.compute.manager [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Took 0.48 seconds to destroy the instance on the hypervisor.
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.550 182939 DEBUG oslo.service.loopingcall [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.551 182939 DEBUG nova.compute.manager [-] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.551 182939 DEBUG nova.network.neutron [-] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.551 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[75895db1-634a-4a3b-aabf-04f57a8e3dec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.562 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.575 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9f5cd6f6-d72f-46e6-9b7f-a750bc749556]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.577 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d10acb2d-6858-44f1-ae64-9ce6cf656e45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.592 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[32dba7df-88e9-4190-80a0-cd3c180ee033]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 595572, 'reachable_time': 16951, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238460, 'error': None, 'target': 'ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.596 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d3347b7a-627d-46d6-af62-a195dbfcdbf5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:24:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:24:45.596 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[20a27d4d-3e5d-406d-bbdc-2564fdfeb418]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:24:45 compute-0 systemd[1]: run-netns-ovnmeta\x2dd3347b7a\x2d627d\x2d46d6\x2daf62\x2da195dbfcdbf5.mount: Deactivated successfully.
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.713 182939 DEBUG nova.compute.manager [req-a826623a-aec1-46b5-9da4-a1e201bde51a req-670a7f2b-1baf-4971-9c92-297b27a2662a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Received event network-vif-unplugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.714 182939 DEBUG oslo_concurrency.lockutils [req-a826623a-aec1-46b5-9da4-a1e201bde51a req-670a7f2b-1baf-4971-9c92-297b27a2662a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.714 182939 DEBUG oslo_concurrency.lockutils [req-a826623a-aec1-46b5-9da4-a1e201bde51a req-670a7f2b-1baf-4971-9c92-297b27a2662a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.714 182939 DEBUG oslo_concurrency.lockutils [req-a826623a-aec1-46b5-9da4-a1e201bde51a req-670a7f2b-1baf-4971-9c92-297b27a2662a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.714 182939 DEBUG nova.compute.manager [req-a826623a-aec1-46b5-9da4-a1e201bde51a req-670a7f2b-1baf-4971-9c92-297b27a2662a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] No waiting events found dispatching network-vif-unplugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:24:45 compute-0 nova_compute[182935]: 2026-01-22 00:24:45.715 182939 DEBUG nova.compute.manager [req-a826623a-aec1-46b5-9da4-a1e201bde51a req-670a7f2b-1baf-4971-9c92-297b27a2662a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Received event network-vif-unplugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.350 182939 DEBUG nova.network.neutron [-] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.376 182939 INFO nova.compute.manager [-] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Took 1.83 seconds to deallocate network for instance.
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.502 182939 DEBUG oslo_concurrency.lockutils [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.503 182939 DEBUG oslo_concurrency.lockutils [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.510 182939 DEBUG nova.compute.manager [req-d5bf9876-a39f-413b-ac9d-9f52736af525 req-10a5bc5e-cb63-4c9e-9cc1-076387fa87bf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Received event network-vif-deleted-d5de0eb8-a6c6-4750-9935-ed1b8b196167 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.848 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.922 182939 DEBUG nova.compute.manager [req-c0c56b3c-44bb-4b64-9083-9f17d57cd513 req-b3e92e84-08ba-4f3b-b02a-2f19a22665d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Received event network-vif-plugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.923 182939 DEBUG oslo_concurrency.lockutils [req-c0c56b3c-44bb-4b64-9083-9f17d57cd513 req-b3e92e84-08ba-4f3b-b02a-2f19a22665d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.923 182939 DEBUG oslo_concurrency.lockutils [req-c0c56b3c-44bb-4b64-9083-9f17d57cd513 req-b3e92e84-08ba-4f3b-b02a-2f19a22665d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.923 182939 DEBUG oslo_concurrency.lockutils [req-c0c56b3c-44bb-4b64-9083-9f17d57cd513 req-b3e92e84-08ba-4f3b-b02a-2f19a22665d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.923 182939 DEBUG nova.compute.manager [req-c0c56b3c-44bb-4b64-9083-9f17d57cd513 req-b3e92e84-08ba-4f3b-b02a-2f19a22665d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] No waiting events found dispatching network-vif-plugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.924 182939 WARNING nova.compute.manager [req-c0c56b3c-44bb-4b64-9083-9f17d57cd513 req-b3e92e84-08ba-4f3b-b02a-2f19a22665d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Received unexpected event network-vif-plugged-d5de0eb8-a6c6-4750-9935-ed1b8b196167 for instance with vm_state deleted and task_state None.
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.926 182939 DEBUG nova.compute.provider_tree [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.947 182939 DEBUG nova.scheduler.client.report [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.989 182939 DEBUG oslo_concurrency.lockutils [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.486s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.992 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.992 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:47 compute-0 nova_compute[182935]: 2026-01-22 00:24:47.992 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.027 182939 INFO nova.scheduler.client.report [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Deleted allocations for instance 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.168 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.169 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5686MB free_disk=73.12722396850586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.169 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.169 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.172 182939 DEBUG oslo_concurrency.lockutils [None req-52296d56-5b9a-41c2-998e-28f9050fc52b 2d732de79224490e900df2fc0d2fcc37 dc7e8332e4644c4c80a30d240e3c9983 - - default default] Lock "13e4fbdd-2a25-4ff8-9b96-6af0c16c3484" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.433s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.244 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.245 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.267 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.285 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.316 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.316 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:24:48 compute-0 nova_compute[182935]: 2026-01-22 00:24:48.586 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:50 compute-0 nova_compute[182935]: 2026-01-22 00:24:50.432 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:51 compute-0 nova_compute[182935]: 2026-01-22 00:24:51.317 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:51 compute-0 nova_compute[182935]: 2026-01-22 00:24:51.318 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:24:52 compute-0 nova_compute[182935]: 2026-01-22 00:24:52.584 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:52 compute-0 nova_compute[182935]: 2026-01-22 00:24:52.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:52 compute-0 nova_compute[182935]: 2026-01-22 00:24:52.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:24:52 compute-0 nova_compute[182935]: 2026-01-22 00:24:52.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:24:52 compute-0 nova_compute[182935]: 2026-01-22 00:24:52.808 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:52 compute-0 nova_compute[182935]: 2026-01-22 00:24:52.843 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:24:53 compute-0 nova_compute[182935]: 2026-01-22 00:24:53.587 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:53 compute-0 podman[238464]: 2026-01-22 00:24:53.703847754 +0000 UTC m=+0.064517346 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:24:53 compute-0 podman[238463]: 2026-01-22 00:24:53.77341432 +0000 UTC m=+0.134951783 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:24:53 compute-0 nova_compute[182935]: 2026-01-22 00:24:53.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:55 compute-0 nova_compute[182935]: 2026-01-22 00:24:55.434 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:24:55 compute-0 podman[238514]: 2026-01-22 00:24:55.708347677 +0000 UTC m=+0.076161894 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:24:56 compute-0 nova_compute[182935]: 2026-01-22 00:24:56.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:57 compute-0 nova_compute[182935]: 2026-01-22 00:24:57.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:24:58 compute-0 nova_compute[182935]: 2026-01-22 00:24:58.626 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:00 compute-0 nova_compute[182935]: 2026-01-22 00:25:00.337 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041485.3365855, 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:25:00 compute-0 nova_compute[182935]: 2026-01-22 00:25:00.338 182939 INFO nova.compute.manager [-] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] VM Stopped (Lifecycle Event)
Jan 22 00:25:00 compute-0 nova_compute[182935]: 2026-01-22 00:25:00.377 182939 DEBUG nova.compute.manager [None req-23daa7e4-a595-47a3-8771-0ff79d151d69 - - - - - -] [instance: 13e4fbdd-2a25-4ff8-9b96-6af0c16c3484] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:00 compute-0 nova_compute[182935]: 2026-01-22 00:25:00.437 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:00 compute-0 nova_compute[182935]: 2026-01-22 00:25:00.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:00 compute-0 nova_compute[182935]: 2026-01-22 00:25:00.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:01 compute-0 podman[238539]: 2026-01-22 00:25:01.685049388 +0000 UTC m=+0.052728686 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:25:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:03.221 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:03.222 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:03.222 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:03 compute-0 nova_compute[182935]: 2026-01-22 00:25:03.628 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:05 compute-0 nova_compute[182935]: 2026-01-22 00:25:05.439 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:05 compute-0 nova_compute[182935]: 2026-01-22 00:25:05.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:07 compute-0 podman[238559]: 2026-01-22 00:25:07.684006259 +0000 UTC m=+0.055292377 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 22 00:25:07 compute-0 podman[238558]: 2026-01-22 00:25:07.705363577 +0000 UTC m=+0.079813550 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, architecture=x86_64, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Jan 22 00:25:08 compute-0 nova_compute[182935]: 2026-01-22 00:25:08.630 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:10 compute-0 nova_compute[182935]: 2026-01-22 00:25:10.441 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:13 compute-0 nova_compute[182935]: 2026-01-22 00:25:13.632 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.170 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.171 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.200 182939 DEBUG nova.compute.manager [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.361 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.362 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.383 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.384 182939 INFO nova.compute.claims [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.443 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.529 182939 DEBUG nova.scheduler.client.report [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.784 182939 DEBUG nova.scheduler.client.report [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.785 182939 DEBUG nova.compute.provider_tree [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.816 182939 DEBUG nova.scheduler.client.report [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:25:15 compute-0 nova_compute[182935]: 2026-01-22 00:25:15.843 182939 DEBUG nova.scheduler.client.report [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.194 182939 DEBUG nova.compute.provider_tree [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.217 182939 DEBUG nova.scheduler.client.report [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.290 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.292 182939 DEBUG nova.compute.manager [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.368 182939 DEBUG nova.compute.manager [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.368 182939 DEBUG nova.network.neutron [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.415 182939 INFO nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.442 182939 DEBUG nova.compute.manager [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.617 182939 DEBUG nova.compute.manager [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.619 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.620 182939 INFO nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Creating image(s)
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.621 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.621 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.622 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.639 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.735 182939 DEBUG nova.policy [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.738 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.739 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.739 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.750 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.813 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.815 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.865 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.867 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.867 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.927 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.929 182939 DEBUG nova.virt.disk.api [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:25:16 compute-0 nova_compute[182935]: 2026-01-22 00:25:16.930 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:17 compute-0 nova_compute[182935]: 2026-01-22 00:25:17.003 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:17 compute-0 nova_compute[182935]: 2026-01-22 00:25:17.004 182939 DEBUG nova.virt.disk.api [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:25:17 compute-0 nova_compute[182935]: 2026-01-22 00:25:17.005 182939 DEBUG nova.objects.instance [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid 063fddce-2634-4aa5-a9c5-17ca977ea05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:17 compute-0 nova_compute[182935]: 2026-01-22 00:25:17.051 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:25:17 compute-0 nova_compute[182935]: 2026-01-22 00:25:17.052 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Ensure instance console log exists: /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:25:17 compute-0 nova_compute[182935]: 2026-01-22 00:25:17.053 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:17 compute-0 nova_compute[182935]: 2026-01-22 00:25:17.053 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:17 compute-0 nova_compute[182935]: 2026-01-22 00:25:17.053 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:18 compute-0 nova_compute[182935]: 2026-01-22 00:25:18.681 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:19 compute-0 nova_compute[182935]: 2026-01-22 00:25:19.496 182939 DEBUG nova.network.neutron [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Successfully created port: 21fc60b1-3906-4de7-95e7-4113c45ab3d6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:25:20 compute-0 nova_compute[182935]: 2026-01-22 00:25:20.445 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:20 compute-0 nova_compute[182935]: 2026-01-22 00:25:20.834 182939 DEBUG nova.network.neutron [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Successfully updated port: 21fc60b1-3906-4de7-95e7-4113c45ab3d6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:25:20 compute-0 nova_compute[182935]: 2026-01-22 00:25:20.869 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:25:20 compute-0 nova_compute[182935]: 2026-01-22 00:25:20.870 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:25:20 compute-0 nova_compute[182935]: 2026-01-22 00:25:20.870 182939 DEBUG nova.network.neutron [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:25:21 compute-0 nova_compute[182935]: 2026-01-22 00:25:21.045 182939 DEBUG nova.compute.manager [req-c669b9ab-bbc7-4bf9-a7bd-86616dfea11d req-0155d8eb-f888-40cb-9cb0-93130b19b55c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-changed-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:21 compute-0 nova_compute[182935]: 2026-01-22 00:25:21.046 182939 DEBUG nova.compute.manager [req-c669b9ab-bbc7-4bf9-a7bd-86616dfea11d req-0155d8eb-f888-40cb-9cb0-93130b19b55c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Refreshing instance network info cache due to event network-changed-21fc60b1-3906-4de7-95e7-4113c45ab3d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:25:21 compute-0 nova_compute[182935]: 2026-01-22 00:25:21.046 182939 DEBUG oslo_concurrency.lockutils [req-c669b9ab-bbc7-4bf9-a7bd-86616dfea11d req-0155d8eb-f888-40cb-9cb0-93130b19b55c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:25:21 compute-0 nova_compute[182935]: 2026-01-22 00:25:21.460 182939 DEBUG nova.network.neutron [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:25:22 compute-0 sshd-session[238614]: Invalid user mongodb from 188.166.69.60 port 60538
Jan 22 00:25:22 compute-0 sshd-session[238614]: Connection closed by invalid user mongodb 188.166.69.60 port 60538 [preauth]
Jan 22 00:25:22 compute-0 nova_compute[182935]: 2026-01-22 00:25:22.733 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:22.735 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:25:22 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:22.736 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:25:23 compute-0 nova_compute[182935]: 2026-01-22 00:25:23.684 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.275 182939 DEBUG nova.network.neutron [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Updating instance_info_cache with network_info: [{"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.303 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.303 182939 DEBUG nova.compute.manager [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Instance network_info: |[{"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.304 182939 DEBUG oslo_concurrency.lockutils [req-c669b9ab-bbc7-4bf9-a7bd-86616dfea11d req-0155d8eb-f888-40cb-9cb0-93130b19b55c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.304 182939 DEBUG nova.network.neutron [req-c669b9ab-bbc7-4bf9-a7bd-86616dfea11d req-0155d8eb-f888-40cb-9cb0-93130b19b55c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Refreshing network info cache for port 21fc60b1-3906-4de7-95e7-4113c45ab3d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.308 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Start _get_guest_xml network_info=[{"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.313 182939 WARNING nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.319 182939 DEBUG nova.virt.libvirt.host [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.320 182939 DEBUG nova.virt.libvirt.host [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.326 182939 DEBUG nova.virt.libvirt.host [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.327 182939 DEBUG nova.virt.libvirt.host [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.328 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.329 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.329 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.330 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.330 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.330 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.330 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.331 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.331 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.331 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.332 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.332 182939 DEBUG nova.virt.hardware [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.338 182939 DEBUG nova.virt.libvirt.vif [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:25:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-507265383',display_name='tempest-TestNetworkAdvancedServerOps-server-507265383',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-507265383',id=162,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBV5lrE/0o3wylW8styAqIEq3o6gpKICYx+OOjZpZ4HsoPBmWU0vw3pi82ayKIiy54WBno0h74p6C9GNm4wvSrVujqZ6E1U9i7nevpU2R6qM9p7TZcVT1ycIcAKLfUQOYQ==',key_name='tempest-TestNetworkAdvancedServerOps-1079565092',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-w0q2r970',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:25:16Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=063fddce-2634-4aa5-a9c5-17ca977ea05a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.338 182939 DEBUG nova.network.os_vif_util [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.339 182939 DEBUG nova.network.os_vif_util [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.341 182939 DEBUG nova.objects.instance [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid 063fddce-2634-4aa5-a9c5-17ca977ea05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.357 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:25:24 compute-0 nova_compute[182935]:   <uuid>063fddce-2634-4aa5-a9c5-17ca977ea05a</uuid>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   <name>instance-000000a2</name>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-507265383</nova:name>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:25:24</nova:creationTime>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:25:24 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:25:24 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:25:24 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:25:24 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:25:24 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:25:24 compute-0 nova_compute[182935]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:25:24 compute-0 nova_compute[182935]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:25:24 compute-0 nova_compute[182935]:         <nova:port uuid="21fc60b1-3906-4de7-95e7-4113c45ab3d6">
Jan 22 00:25:24 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <system>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <entry name="serial">063fddce-2634-4aa5-a9c5-17ca977ea05a</entry>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <entry name="uuid">063fddce-2634-4aa5-a9c5-17ca977ea05a</entry>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     </system>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   <os>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   </os>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   <features>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   </features>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.config"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:d7:78:96"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <target dev="tap21fc60b1-39"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/console.log" append="off"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <video>
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     </video>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:25:24 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:25:24 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:25:24 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:25:24 compute-0 nova_compute[182935]: </domain>
Jan 22 00:25:24 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.359 182939 DEBUG nova.compute.manager [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Preparing to wait for external event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.360 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.360 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.360 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.361 182939 DEBUG nova.virt.libvirt.vif [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:25:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-507265383',display_name='tempest-TestNetworkAdvancedServerOps-server-507265383',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-507265383',id=162,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBV5lrE/0o3wylW8styAqIEq3o6gpKICYx+OOjZpZ4HsoPBmWU0vw3pi82ayKIiy54WBno0h74p6C9GNm4wvSrVujqZ6E1U9i7nevpU2R6qM9p7TZcVT1ycIcAKLfUQOYQ==',key_name='tempest-TestNetworkAdvancedServerOps-1079565092',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-w0q2r970',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:25:16Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=063fddce-2634-4aa5-a9c5-17ca977ea05a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.361 182939 DEBUG nova.network.os_vif_util [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.361 182939 DEBUG nova.network.os_vif_util [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.362 182939 DEBUG os_vif [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.363 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.363 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.364 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.367 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.367 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap21fc60b1-39, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.367 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap21fc60b1-39, col_values=(('external_ids', {'iface-id': '21fc60b1-3906-4de7-95e7-4113c45ab3d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:78:96', 'vm-uuid': '063fddce-2634-4aa5-a9c5-17ca977ea05a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.369 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:24 compute-0 NetworkManager[55139]: <info>  [1769041524.3705] manager: (tap21fc60b1-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.371 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.376 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.377 182939 INFO os_vif [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39')
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.433 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.433 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.434 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No VIF found with MAC fa:16:3e:d7:78:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.435 182939 INFO nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Using config drive
Jan 22 00:25:24 compute-0 podman[238619]: 2026-01-22 00:25:24.709004514 +0000 UTC m=+0.082775531 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:25:24 compute-0 podman[238618]: 2026-01-22 00:25:24.738735252 +0000 UTC m=+0.104796415 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.958 182939 INFO nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Creating config drive at /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.config
Jan 22 00:25:24 compute-0 nova_compute[182935]: 2026-01-22 00:25:24.962 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwj18gitj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.089 182939 DEBUG oslo_concurrency.processutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwj18gitj" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:25 compute-0 kernel: tap21fc60b1-39: entered promiscuous mode
Jan 22 00:25:25 compute-0 NetworkManager[55139]: <info>  [1769041525.1468] manager: (tap21fc60b1-39): new Tun device (/org/freedesktop/NetworkManager/Devices/301)
Jan 22 00:25:25 compute-0 ovn_controller[95047]: 2026-01-22T00:25:25Z|00623|binding|INFO|Claiming lport 21fc60b1-3906-4de7-95e7-4113c45ab3d6 for this chassis.
Jan 22 00:25:25 compute-0 ovn_controller[95047]: 2026-01-22T00:25:25Z|00624|binding|INFO|21fc60b1-3906-4de7-95e7-4113c45ab3d6: Claiming fa:16:3e:d7:78:96 10.100.0.11
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.146 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.149 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.176 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:78:96 10.100.0.11'], port_security=['fa:16:3e:d7:78:96 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '063fddce-2634-4aa5-a9c5-17ca977ea05a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '49b0e0b5-57a2-46f3-afea-009e30140090', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d1fe7e8-9bbf-49b0-85b1-1237f5ccda8d, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=21fc60b1-3906-4de7-95e7-4113c45ab3d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.177 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 21fc60b1-3906-4de7-95e7-4113c45ab3d6 in datapath 4d1865a1-9397-451c-b754-7ed1f784f0b7 bound to our chassis
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.178 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4d1865a1-9397-451c-b754-7ed1f784f0b7
Jan 22 00:25:25 compute-0 systemd-udevd[238680]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.189 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cd6ed83f-0209-4043-92b7-c25974cb0604]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.189 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4d1865a1-91 in ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.192 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4d1865a1-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.192 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bad2895e-292b-4a2c-a13e-268fe9305854]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.193 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b2385a13-56f1-42db-b144-4d1bfcd06e45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 NetworkManager[55139]: <info>  [1769041525.1982] device (tap21fc60b1-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:25:25 compute-0 NetworkManager[55139]: <info>  [1769041525.1996] device (tap21fc60b1-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.204 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[eaad51bf-e3d7-42fb-ba93-85286c91940c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 systemd-machined[154182]: New machine qemu-82-instance-000000a2.
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.230 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c589fbb2-b858-44e7-8ed3-089635c17e9b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.231 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:25 compute-0 ovn_controller[95047]: 2026-01-22T00:25:25Z|00625|binding|INFO|Setting lport 21fc60b1-3906-4de7-95e7-4113c45ab3d6 ovn-installed in OVS
Jan 22 00:25:25 compute-0 ovn_controller[95047]: 2026-01-22T00:25:25Z|00626|binding|INFO|Setting lport 21fc60b1-3906-4de7-95e7-4113c45ab3d6 up in Southbound
Jan 22 00:25:25 compute-0 systemd[1]: Started Virtual Machine qemu-82-instance-000000a2.
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.238 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.258 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[04817456-c0f3-4db3-ae05-2e88ec1c977c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.264 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4f58a3f7-af2e-4c84-bd8e-db7e1960388f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 NetworkManager[55139]: <info>  [1769041525.2651] manager: (tap4d1865a1-90): new Veth device (/org/freedesktop/NetworkManager/Devices/302)
Jan 22 00:25:25 compute-0 systemd-udevd[238685]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.296 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[42c50a29-4f4d-4be3-b675-dcd873062268]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.299 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8806dc85-26aa-465f-8743-f0e9968ee786]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 NetworkManager[55139]: <info>  [1769041525.3222] device (tap4d1865a1-90): carrier: link connected
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.328 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c2ec2c5d-2b49-469d-81e7-f22bd5608776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.346 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[91c73c22-2247-4346-acdf-4af007ef8e26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d1865a1-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:ff:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602806, 'reachable_time': 19370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238716, 'error': None, 'target': 'ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.364 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7ad8ac48-8e5c-4650-87f1-af0bc8ab2193]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fead:ff60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602806, 'tstamp': 602806}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238717, 'error': None, 'target': 'ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.380 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[67bb0f9a-0ccf-446a-9b5b-8d7480706c79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d1865a1-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:ff:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 199], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602806, 'reachable_time': 19370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238718, 'error': None, 'target': 'ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.412 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dc363387-2875-43d9-87dc-bc0acac8a94c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.491 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5521f2a5-8daf-4779-ba08-30023047b624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.493 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d1865a1-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.493 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.493 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d1865a1-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:25 compute-0 NetworkManager[55139]: <info>  [1769041525.4960] manager: (tap4d1865a1-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Jan 22 00:25:25 compute-0 kernel: tap4d1865a1-90: entered promiscuous mode
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.495 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.497 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.499 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4d1865a1-90, col_values=(('external_ids', {'iface-id': '3df5b1a0-ef2c-4842-a03e-07f65c7f72f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:25 compute-0 ovn_controller[95047]: 2026-01-22T00:25:25Z|00627|binding|INFO|Releasing lport 3df5b1a0-ef2c-4842-a03e-07f65c7f72f2 from this chassis (sb_readonly=0)
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.500 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.515 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.516 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.517 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4d1865a1-9397-451c-b754-7ed1f784f0b7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4d1865a1-9397-451c-b754-7ed1f784f0b7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.518 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[60a84b6b-1076-4007-b37e-07ec6e33e78e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.519 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-4d1865a1-9397-451c-b754-7ed1f784f0b7
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/4d1865a1-9397-451c-b754-7ed1f784f0b7.pid.haproxy
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 4d1865a1-9397-451c-b754-7ed1f784f0b7
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:25:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:25.520 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'env', 'PROCESS_TAG=haproxy-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4d1865a1-9397-451c-b754-7ed1f784f0b7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.535 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041525.5346646, 063fddce-2634-4aa5-a9c5-17ca977ea05a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.535 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] VM Started (Lifecycle Event)
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.555 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.561 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041525.5354447, 063fddce-2634-4aa5-a9c5-17ca977ea05a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.562 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] VM Paused (Lifecycle Event)
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.589 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.591 182939 DEBUG nova.compute.manager [req-22af77a5-6fa6-4986-81c8-851754ac9f5f req-0f9d41d7-acda-4b94-836a-2054a06b681d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.592 182939 DEBUG oslo_concurrency.lockutils [req-22af77a5-6fa6-4986-81c8-851754ac9f5f req-0f9d41d7-acda-4b94-836a-2054a06b681d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.592 182939 DEBUG oslo_concurrency.lockutils [req-22af77a5-6fa6-4986-81c8-851754ac9f5f req-0f9d41d7-acda-4b94-836a-2054a06b681d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.592 182939 DEBUG oslo_concurrency.lockutils [req-22af77a5-6fa6-4986-81c8-851754ac9f5f req-0f9d41d7-acda-4b94-836a-2054a06b681d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.592 182939 DEBUG nova.compute.manager [req-22af77a5-6fa6-4986-81c8-851754ac9f5f req-0f9d41d7-acda-4b94-836a-2054a06b681d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Processing event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.593 182939 DEBUG nova.compute.manager [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.596 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.598 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.601 182939 INFO nova.virt.libvirt.driver [-] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Instance spawned successfully.
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.602 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.639 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.640 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041525.5971334, 063fddce-2634-4aa5-a9c5-17ca977ea05a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.640 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] VM Resumed (Lifecycle Event)
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.647 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.648 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.648 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.649 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.650 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.651 182939 DEBUG nova.virt.libvirt.driver [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.685 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.690 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:25:25 compute-0 nova_compute[182935]: 2026-01-22 00:25:25.726 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:25:25 compute-0 podman[238757]: 2026-01-22 00:25:25.904389851 +0000 UTC m=+0.057395667 container create 9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:25:25 compute-0 systemd[1]: Started libpod-conmon-9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87.scope.
Jan 22 00:25:25 compute-0 podman[238757]: 2026-01-22 00:25:25.871839397 +0000 UTC m=+0.024845233 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:25:25 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:25:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/437073ebdc2c9ab4aeee080510f0e23f7e7faffcfad485658a42de277d3572c8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:25:25 compute-0 podman[238770]: 2026-01-22 00:25:25.998939442 +0000 UTC m=+0.054289393 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:25:26 compute-0 podman[238757]: 2026-01-22 00:25:26.005543938 +0000 UTC m=+0.158549734 container init 9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:25:26 compute-0 podman[238757]: 2026-01-22 00:25:26.012084394 +0000 UTC m=+0.165090190 container start 9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:25:26 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[238773]: [NOTICE]   (238801) : New worker (238803) forked
Jan 22 00:25:26 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[238773]: [NOTICE]   (238801) : Loading success.
Jan 22 00:25:26 compute-0 nova_compute[182935]: 2026-01-22 00:25:26.125 182939 INFO nova.compute.manager [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Took 9.51 seconds to spawn the instance on the hypervisor.
Jan 22 00:25:26 compute-0 nova_compute[182935]: 2026-01-22 00:25:26.126 182939 DEBUG nova.compute.manager [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:26 compute-0 nova_compute[182935]: 2026-01-22 00:25:26.226 182939 DEBUG nova.network.neutron [req-c669b9ab-bbc7-4bf9-a7bd-86616dfea11d req-0155d8eb-f888-40cb-9cb0-93130b19b55c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Updated VIF entry in instance network info cache for port 21fc60b1-3906-4de7-95e7-4113c45ab3d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:25:26 compute-0 nova_compute[182935]: 2026-01-22 00:25:26.227 182939 DEBUG nova.network.neutron [req-c669b9ab-bbc7-4bf9-a7bd-86616dfea11d req-0155d8eb-f888-40cb-9cb0-93130b19b55c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Updating instance_info_cache with network_info: [{"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:25:26 compute-0 nova_compute[182935]: 2026-01-22 00:25:26.508 182939 DEBUG oslo_concurrency.lockutils [req-c669b9ab-bbc7-4bf9-a7bd-86616dfea11d req-0155d8eb-f888-40cb-9cb0-93130b19b55c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:25:26 compute-0 nova_compute[182935]: 2026-01-22 00:25:26.962 182939 INFO nova.compute.manager [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Took 11.66 seconds to build instance.
Jan 22 00:25:27 compute-0 nova_compute[182935]: 2026-01-22 00:25:27.122 182939 DEBUG oslo_concurrency.lockutils [None req-27486371-44c6-4bc4-83d5-f5df8df4d88c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:27 compute-0 nova_compute[182935]: 2026-01-22 00:25:27.759 182939 DEBUG nova.compute.manager [req-f4255224-ba26-47ca-8436-80b03bd917b6 req-f7465985-1cb9-483d-aaf0-a0b5009506a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:27 compute-0 nova_compute[182935]: 2026-01-22 00:25:27.759 182939 DEBUG oslo_concurrency.lockutils [req-f4255224-ba26-47ca-8436-80b03bd917b6 req-f7465985-1cb9-483d-aaf0-a0b5009506a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:27 compute-0 nova_compute[182935]: 2026-01-22 00:25:27.759 182939 DEBUG oslo_concurrency.lockutils [req-f4255224-ba26-47ca-8436-80b03bd917b6 req-f7465985-1cb9-483d-aaf0-a0b5009506a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:27 compute-0 nova_compute[182935]: 2026-01-22 00:25:27.760 182939 DEBUG oslo_concurrency.lockutils [req-f4255224-ba26-47ca-8436-80b03bd917b6 req-f7465985-1cb9-483d-aaf0-a0b5009506a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:27 compute-0 nova_compute[182935]: 2026-01-22 00:25:27.760 182939 DEBUG nova.compute.manager [req-f4255224-ba26-47ca-8436-80b03bd917b6 req-f7465985-1cb9-483d-aaf0-a0b5009506a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] No waiting events found dispatching network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:25:27 compute-0 nova_compute[182935]: 2026-01-22 00:25:27.760 182939 WARNING nova.compute.manager [req-f4255224-ba26-47ca-8436-80b03bd917b6 req-f7465985-1cb9-483d-aaf0-a0b5009506a9 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received unexpected event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 for instance with vm_state active and task_state None.
Jan 22 00:25:28 compute-0 nova_compute[182935]: 2026-01-22 00:25:28.685 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:29 compute-0 nova_compute[182935]: 2026-01-22 00:25:29.371 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:30.738 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:32 compute-0 podman[238813]: 2026-01-22 00:25:32.677630638 +0000 UTC m=+0.053881004 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 22 00:25:33 compute-0 nova_compute[182935]: 2026-01-22 00:25:33.211 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:33 compute-0 NetworkManager[55139]: <info>  [1769041533.2124] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Jan 22 00:25:33 compute-0 NetworkManager[55139]: <info>  [1769041533.2136] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/305)
Jan 22 00:25:33 compute-0 nova_compute[182935]: 2026-01-22 00:25:33.312 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:33 compute-0 ovn_controller[95047]: 2026-01-22T00:25:33Z|00628|binding|INFO|Releasing lport 3df5b1a0-ef2c-4842-a03e-07f65c7f72f2 from this chassis (sb_readonly=0)
Jan 22 00:25:33 compute-0 nova_compute[182935]: 2026-01-22 00:25:33.325 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:33 compute-0 nova_compute[182935]: 2026-01-22 00:25:33.679 182939 DEBUG nova.compute.manager [req-703da089-4f50-49ec-8373-7497c8bb100d req-da3e9aeb-c592-4436-8597-469ab77a47af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-changed-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:33 compute-0 nova_compute[182935]: 2026-01-22 00:25:33.680 182939 DEBUG nova.compute.manager [req-703da089-4f50-49ec-8373-7497c8bb100d req-da3e9aeb-c592-4436-8597-469ab77a47af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Refreshing instance network info cache due to event network-changed-21fc60b1-3906-4de7-95e7-4113c45ab3d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:25:33 compute-0 nova_compute[182935]: 2026-01-22 00:25:33.681 182939 DEBUG oslo_concurrency.lockutils [req-703da089-4f50-49ec-8373-7497c8bb100d req-da3e9aeb-c592-4436-8597-469ab77a47af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:25:33 compute-0 nova_compute[182935]: 2026-01-22 00:25:33.681 182939 DEBUG oslo_concurrency.lockutils [req-703da089-4f50-49ec-8373-7497c8bb100d req-da3e9aeb-c592-4436-8597-469ab77a47af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:25:33 compute-0 nova_compute[182935]: 2026-01-22 00:25:33.681 182939 DEBUG nova.network.neutron [req-703da089-4f50-49ec-8373-7497c8bb100d req-da3e9aeb-c592-4436-8597-469ab77a47af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Refreshing network info cache for port 21fc60b1-3906-4de7-95e7-4113c45ab3d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:25:33 compute-0 nova_compute[182935]: 2026-01-22 00:25:33.687 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:34 compute-0 nova_compute[182935]: 2026-01-22 00:25:34.416 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:36 compute-0 nova_compute[182935]: 2026-01-22 00:25:36.577 182939 DEBUG nova.network.neutron [req-703da089-4f50-49ec-8373-7497c8bb100d req-da3e9aeb-c592-4436-8597-469ab77a47af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Updated VIF entry in instance network info cache for port 21fc60b1-3906-4de7-95e7-4113c45ab3d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:25:36 compute-0 nova_compute[182935]: 2026-01-22 00:25:36.579 182939 DEBUG nova.network.neutron [req-703da089-4f50-49ec-8373-7497c8bb100d req-da3e9aeb-c592-4436-8597-469ab77a47af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Updating instance_info_cache with network_info: [{"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:25:36 compute-0 nova_compute[182935]: 2026-01-22 00:25:36.614 182939 DEBUG oslo_concurrency.lockutils [req-703da089-4f50-49ec-8373-7497c8bb100d req-da3e9aeb-c592-4436-8597-469ab77a47af 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:25:37 compute-0 nova_compute[182935]: 2026-01-22 00:25:37.560 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:38 compute-0 ovn_controller[95047]: 2026-01-22T00:25:38Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d7:78:96 10.100.0.11
Jan 22 00:25:38 compute-0 ovn_controller[95047]: 2026-01-22T00:25:38Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d7:78:96 10.100.0.11
Jan 22 00:25:38 compute-0 nova_compute[182935]: 2026-01-22 00:25:38.690 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:38 compute-0 podman[238849]: 2026-01-22 00:25:38.704219856 +0000 UTC m=+0.069168667 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=)
Jan 22 00:25:38 compute-0 podman[238850]: 2026-01-22 00:25:38.704616335 +0000 UTC m=+0.062511518 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 22 00:25:39 compute-0 nova_compute[182935]: 2026-01-22 00:25:39.418 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:40 compute-0 nova_compute[182935]: 2026-01-22 00:25:40.768 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:43 compute-0 nova_compute[182935]: 2026-01-22 00:25:43.720 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:44 compute-0 nova_compute[182935]: 2026-01-22 00:25:44.422 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:45 compute-0 nova_compute[182935]: 2026-01-22 00:25:45.107 182939 INFO nova.compute.manager [None req-70c7c22d-fb40-419a-a1bf-1a1cef862442 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Get console output
Jan 22 00:25:45 compute-0 nova_compute[182935]: 2026-01-22 00:25:45.113 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:25:47 compute-0 nova_compute[182935]: 2026-01-22 00:25:47.552 182939 INFO nova.compute.manager [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Rebuilding instance
Jan 22 00:25:48 compute-0 nova_compute[182935]: 2026-01-22 00:25:48.690 182939 DEBUG nova.compute.manager [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:48 compute-0 nova_compute[182935]: 2026-01-22 00:25:48.723 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:48 compute-0 nova_compute[182935]: 2026-01-22 00:25:48.770 182939 DEBUG nova.objects.instance [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_requests' on Instance uuid 063fddce-2634-4aa5-a9c5-17ca977ea05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:48 compute-0 nova_compute[182935]: 2026-01-22 00:25:48.784 182939 DEBUG nova.objects.instance [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid 063fddce-2634-4aa5-a9c5-17ca977ea05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:48 compute-0 nova_compute[182935]: 2026-01-22 00:25:48.796 182939 DEBUG nova.objects.instance [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid 063fddce-2634-4aa5-a9c5-17ca977ea05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:48 compute-0 nova_compute[182935]: 2026-01-22 00:25:48.812 182939 DEBUG nova.objects.instance [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid 063fddce-2634-4aa5-a9c5-17ca977ea05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:48 compute-0 nova_compute[182935]: 2026-01-22 00:25:48.826 182939 DEBUG nova.objects.instance [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:25:48 compute-0 nova_compute[182935]: 2026-01-22 00:25:48.831 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:25:49 compute-0 nova_compute[182935]: 2026-01-22 00:25:49.425 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:49 compute-0 nova_compute[182935]: 2026-01-22 00:25:49.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:49 compute-0 nova_compute[182935]: 2026-01-22 00:25:49.842 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:49 compute-0 nova_compute[182935]: 2026-01-22 00:25:49.842 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:49 compute-0 nova_compute[182935]: 2026-01-22 00:25:49.843 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:49 compute-0 nova_compute[182935]: 2026-01-22 00:25:49.843 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.153 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.214 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.216 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.282 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.422 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.424 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5490MB free_disk=73.09835433959961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.425 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.425 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.666 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 063fddce-2634-4aa5-a9c5-17ca977ea05a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.667 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.667 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.730 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.813 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.935 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:25:50 compute-0 nova_compute[182935]: 2026-01-22 00:25:50.936 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.511s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:51 compute-0 kernel: tap21fc60b1-39 (unregistering): left promiscuous mode
Jan 22 00:25:51 compute-0 NetworkManager[55139]: <info>  [1769041551.0550] device (tap21fc60b1-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:25:51 compute-0 ovn_controller[95047]: 2026-01-22T00:25:51Z|00629|binding|INFO|Releasing lport 21fc60b1-3906-4de7-95e7-4113c45ab3d6 from this chassis (sb_readonly=0)
Jan 22 00:25:51 compute-0 ovn_controller[95047]: 2026-01-22T00:25:51Z|00630|binding|INFO|Setting lport 21fc60b1-3906-4de7-95e7-4113c45ab3d6 down in Southbound
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.062 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:51 compute-0 ovn_controller[95047]: 2026-01-22T00:25:51Z|00631|binding|INFO|Removing iface tap21fc60b1-39 ovn-installed in OVS
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.064 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.072 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:78:96 10.100.0.11'], port_security=['fa:16:3e:d7:78:96 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '063fddce-2634-4aa5-a9c5-17ca977ea05a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '4', 'neutron:security_group_ids': '49b0e0b5-57a2-46f3-afea-009e30140090', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d1fe7e8-9bbf-49b0-85b1-1237f5ccda8d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=21fc60b1-3906-4de7-95e7-4113c45ab3d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.074 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 21fc60b1-3906-4de7-95e7-4113c45ab3d6 in datapath 4d1865a1-9397-451c-b754-7ed1f784f0b7 unbound from our chassis
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.075 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4d1865a1-9397-451c-b754-7ed1f784f0b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.076 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[40282ed4-21fb-4f12-930e-d22a28f63de8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.077 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7 namespace which is not needed anymore
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.084 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:51 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Jan 22 00:25:51 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a2.scope: Consumed 14.076s CPU time.
Jan 22 00:25:51 compute-0 systemd-machined[154182]: Machine qemu-82-instance-000000a2 terminated.
Jan 22 00:25:51 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[238773]: [NOTICE]   (238801) : haproxy version is 2.8.14-c23fe91
Jan 22 00:25:51 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[238773]: [NOTICE]   (238801) : path to executable is /usr/sbin/haproxy
Jan 22 00:25:51 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[238773]: [WARNING]  (238801) : Exiting Master process...
Jan 22 00:25:51 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[238773]: [ALERT]    (238801) : Current worker (238803) exited with code 143 (Terminated)
Jan 22 00:25:51 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[238773]: [WARNING]  (238801) : All workers exited. Exiting... (0)
Jan 22 00:25:51 compute-0 systemd[1]: libpod-9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87.scope: Deactivated successfully.
Jan 22 00:25:51 compute-0 podman[238922]: 2026-01-22 00:25:51.210452095 +0000 UTC m=+0.044398808 container died 9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 00:25:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87-userdata-shm.mount: Deactivated successfully.
Jan 22 00:25:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-437073ebdc2c9ab4aeee080510f0e23f7e7faffcfad485658a42de277d3572c8-merged.mount: Deactivated successfully.
Jan 22 00:25:51 compute-0 podman[238922]: 2026-01-22 00:25:51.252619749 +0000 UTC m=+0.086566462 container cleanup 9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 00:25:51 compute-0 systemd[1]: libpod-conmon-9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87.scope: Deactivated successfully.
Jan 22 00:25:51 compute-0 podman[238952]: 2026-01-22 00:25:51.313882997 +0000 UTC m=+0.041325875 container remove 9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.320 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[94080f70-66fe-4995-b588-0276b950e99b]: (4, ('Thu Jan 22 12:25:51 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7 (9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87)\n9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87\nThu Jan 22 12:25:51 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7 (9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87)\n9d6e5d64021358c7a0c68ca0ea356945c0380d68050b42b347c79bb2f413bf87\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.322 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb03460-8a43-4add-b319-3b771aa1dc84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.323 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d1865a1-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:51 compute-0 kernel: tap4d1865a1-90: left promiscuous mode
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.325 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.339 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.341 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8713d8c4-0b4e-4d8c-a193-004abe900111]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.362 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2b4e5c2e-e23c-4dee-9534-c5d6abba28bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.363 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ea376a06-19ca-44f4-9c09-d17a1ee210cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.378 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f4ef6710-6aa8-4318-b78c-dcdace155a7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602799, 'reachable_time': 28218, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238988, 'error': None, 'target': 'ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.380 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:25:51 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:51.380 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[1e4e16ec-3beb-41e2-b3c1-04d5d82c5153]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:51 compute-0 systemd[1]: run-netns-ovnmeta\x2d4d1865a1\x2d9397\x2d451c\x2db754\x2d7ed1f784f0b7.mount: Deactivated successfully.
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.758 182939 DEBUG nova.compute.manager [req-2e478b3c-12f6-4233-8efb-add53a24cc94 req-c2f29cf1-630b-40bb-a987-e148617b5f54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-vif-unplugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.759 182939 DEBUG oslo_concurrency.lockutils [req-2e478b3c-12f6-4233-8efb-add53a24cc94 req-c2f29cf1-630b-40bb-a987-e148617b5f54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.759 182939 DEBUG oslo_concurrency.lockutils [req-2e478b3c-12f6-4233-8efb-add53a24cc94 req-c2f29cf1-630b-40bb-a987-e148617b5f54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.760 182939 DEBUG oslo_concurrency.lockutils [req-2e478b3c-12f6-4233-8efb-add53a24cc94 req-c2f29cf1-630b-40bb-a987-e148617b5f54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.760 182939 DEBUG nova.compute.manager [req-2e478b3c-12f6-4233-8efb-add53a24cc94 req-c2f29cf1-630b-40bb-a987-e148617b5f54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] No waiting events found dispatching network-vif-unplugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.761 182939 WARNING nova.compute.manager [req-2e478b3c-12f6-4233-8efb-add53a24cc94 req-c2f29cf1-630b-40bb-a987-e148617b5f54 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received unexpected event network-vif-unplugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 for instance with vm_state active and task_state rebuilding.
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.847 182939 INFO nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Instance shutdown successfully after 3 seconds.
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.854 182939 INFO nova.virt.libvirt.driver [-] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Instance destroyed successfully.
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.859 182939 INFO nova.virt.libvirt.driver [-] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Instance destroyed successfully.
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.860 182939 DEBUG nova.virt.libvirt.vif [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:25:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-507265383',display_name='tempest-TestNetworkAdvancedServerOps-server-507265383',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-507265383',id=162,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBV5lrE/0o3wylW8styAqIEq3o6gpKICYx+OOjZpZ4HsoPBmWU0vw3pi82ayKIiy54WBno0h74p6C9GNm4wvSrVujqZ6E1U9i7nevpU2R6qM9p7TZcVT1ycIcAKLfUQOYQ==',key_name='tempest-TestNetworkAdvancedServerOps-1079565092',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:25:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-w0q2r970',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:25:46Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=063fddce-2634-4aa5-a9c5-17ca977ea05a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 
4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.860 182939 DEBUG nova.network.os_vif_util [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.861 182939 DEBUG nova.network.os_vif_util [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.862 182939 DEBUG os_vif [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.863 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.864 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap21fc60b1-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.865 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.867 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.870 182939 INFO os_vif [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39')
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.871 182939 INFO nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Deleting instance files /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a_del
Jan 22 00:25:51 compute-0 nova_compute[182935]: 2026-01-22 00:25:51.871 182939 INFO nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Deletion of /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a_del complete
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.512 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.512 182939 INFO nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Creating image(s)
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.513 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.513 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.514 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.526 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.607 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.608 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "28d799b333bd7d52e5e892149f424e185effed74" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.608 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.621 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.678 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.679 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.730 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74,backing_fmt=raw /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.732 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "28d799b333bd7d52e5e892149f424e185effed74" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.732 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.789 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.790 182939 DEBUG nova.virt.disk.api [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.791 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.872 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.873 182939 DEBUG nova.virt.disk.api [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.873 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.873 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Ensure instance console log exists: /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.874 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.874 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.874 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.876 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Start _get_guest_xml network_info=[{"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.881 182939 WARNING nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.890 182939 DEBUG nova.virt.libvirt.host [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.891 182939 DEBUG nova.virt.libvirt.host [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.894 182939 DEBUG nova.virt.libvirt.host [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.894 182939 DEBUG nova.virt.libvirt.host [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.895 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.895 182939 DEBUG nova.virt.hardware [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:53Z,direct_url=<?>,disk_format='qcow2',id=3e1dda74-3c6a-4d29-8792-32134d1c36c5,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.896 182939 DEBUG nova.virt.hardware [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.896 182939 DEBUG nova.virt.hardware [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.896 182939 DEBUG nova.virt.hardware [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.896 182939 DEBUG nova.virt.hardware [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.896 182939 DEBUG nova.virt.hardware [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.897 182939 DEBUG nova.virt.hardware [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.897 182939 DEBUG nova.virt.hardware [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.897 182939 DEBUG nova.virt.hardware [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.897 182939 DEBUG nova.virt.hardware [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.897 182939 DEBUG nova.virt.hardware [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.898 182939 DEBUG nova.objects.instance [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'vcpu_model' on Instance uuid 063fddce-2634-4aa5-a9c5-17ca977ea05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.920 182939 DEBUG nova.virt.libvirt.vif [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:25:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-507265383',display_name='tempest-TestNetworkAdvancedServerOps-server-507265383',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-507265383',id=162,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBV5lrE/0o3wylW8styAqIEq3o6gpKICYx+OOjZpZ4HsoPBmWU0vw3pi82ayKIiy54WBno0h74p6C9GNm4wvSrVujqZ6E1U9i7nevpU2R6qM9p7TZcVT1ycIcAKLfUQOYQ==',key_name='tempest-TestNetworkAdvancedServerOps-1079565092',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:25:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-w0q2r970',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:25:51Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=063fddce-2634-4aa5-a9c5-17ca977ea05a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.920 182939 DEBUG nova.network.os_vif_util [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.921 182939 DEBUG nova.network.os_vif_util [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.922 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:25:52 compute-0 nova_compute[182935]:   <uuid>063fddce-2634-4aa5-a9c5-17ca977ea05a</uuid>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   <name>instance-000000a2</name>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-507265383</nova:name>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:25:52</nova:creationTime>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:25:52 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:25:52 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:25:52 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:25:52 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:25:52 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:25:52 compute-0 nova_compute[182935]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:25:52 compute-0 nova_compute[182935]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="3e1dda74-3c6a-4d29-8792-32134d1c36c5"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:25:52 compute-0 nova_compute[182935]:         <nova:port uuid="21fc60b1-3906-4de7-95e7-4113c45ab3d6">
Jan 22 00:25:52 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <system>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <entry name="serial">063fddce-2634-4aa5-a9c5-17ca977ea05a</entry>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <entry name="uuid">063fddce-2634-4aa5-a9c5-17ca977ea05a</entry>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     </system>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   <os>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   </os>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   <features>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   </features>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.config"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:d7:78:96"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <target dev="tap21fc60b1-39"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/console.log" append="off"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <video>
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     </video>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:25:52 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:25:52 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:25:52 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:25:52 compute-0 nova_compute[182935]: </domain>
Jan 22 00:25:52 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.924 182939 DEBUG nova.virt.libvirt.vif [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:25:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-507265383',display_name='tempest-TestNetworkAdvancedServerOps-server-507265383',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-507265383',id=162,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBV5lrE/0o3wylW8styAqIEq3o6gpKICYx+OOjZpZ4HsoPBmWU0vw3pi82ayKIiy54WBno0h74p6C9GNm4wvSrVujqZ6E1U9i7nevpU2R6qM9p7TZcVT1ycIcAKLfUQOYQ==',key_name='tempest-TestNetworkAdvancedServerOps-1079565092',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:25:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-w0q2r970',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:25:51Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=063fddce-2634-4aa5-a9c5-17ca977ea05a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.924 182939 DEBUG nova.network.os_vif_util [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.924 182939 DEBUG nova.network.os_vif_util [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.925 182939 DEBUG os_vif [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.925 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.926 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.926 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.929 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.929 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap21fc60b1-39, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.930 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap21fc60b1-39, col_values=(('external_ids', {'iface-id': '21fc60b1-3906-4de7-95e7-4113c45ab3d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:78:96', 'vm-uuid': '063fddce-2634-4aa5-a9c5-17ca977ea05a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.931 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:52 compute-0 NetworkManager[55139]: <info>  [1769041552.9326] manager: (tap21fc60b1-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/306)
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.933 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.937 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.938 182939 INFO os_vif [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39')
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.983 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.984 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.984 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No VIF found with MAC fa:16:3e:d7:78:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:25:52 compute-0 nova_compute[182935]: 2026-01-22 00:25:52.984 182939 INFO nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Using config drive
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.003 182939 DEBUG nova.objects.instance [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'ec2_ids' on Instance uuid 063fddce-2634-4aa5-a9c5-17ca977ea05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.048 182939 DEBUG nova.objects.instance [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'keypairs' on Instance uuid 063fddce-2634-4aa5-a9c5-17ca977ea05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.627 182939 INFO nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Creating config drive at /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.config
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.632 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp03_9menz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.725 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.760 182939 DEBUG oslo_concurrency.processutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp03_9menz" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:25:53 compute-0 kernel: tap21fc60b1-39: entered promiscuous mode
Jan 22 00:25:53 compute-0 systemd-udevd[238900]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:25:53 compute-0 ovn_controller[95047]: 2026-01-22T00:25:53Z|00632|binding|INFO|Claiming lport 21fc60b1-3906-4de7-95e7-4113c45ab3d6 for this chassis.
Jan 22 00:25:53 compute-0 ovn_controller[95047]: 2026-01-22T00:25:53Z|00633|binding|INFO|21fc60b1-3906-4de7-95e7-4113c45ab3d6: Claiming fa:16:3e:d7:78:96 10.100.0.11
Jan 22 00:25:53 compute-0 NetworkManager[55139]: <info>  [1769041553.8157] manager: (tap21fc60b1-39): new Tun device (/org/freedesktop/NetworkManager/Devices/307)
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.815 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.822 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:78:96 10.100.0.11'], port_security=['fa:16:3e:d7:78:96 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '063fddce-2634-4aa5-a9c5-17ca977ea05a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '5', 'neutron:security_group_ids': '49b0e0b5-57a2-46f3-afea-009e30140090', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d1fe7e8-9bbf-49b0-85b1-1237f5ccda8d, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=21fc60b1-3906-4de7-95e7-4113c45ab3d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.823 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 21fc60b1-3906-4de7-95e7-4113c45ab3d6 in datapath 4d1865a1-9397-451c-b754-7ed1f784f0b7 bound to our chassis
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.824 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4d1865a1-9397-451c-b754-7ed1f784f0b7
Jan 22 00:25:53 compute-0 NetworkManager[55139]: <info>  [1769041553.8259] device (tap21fc60b1-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:25:53 compute-0 NetworkManager[55139]: <info>  [1769041553.8271] device (tap21fc60b1-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:25:53 compute-0 ovn_controller[95047]: 2026-01-22T00:25:53Z|00634|binding|INFO|Setting lport 21fc60b1-3906-4de7-95e7-4113c45ab3d6 ovn-installed in OVS
Jan 22 00:25:53 compute-0 ovn_controller[95047]: 2026-01-22T00:25:53Z|00635|binding|INFO|Setting lport 21fc60b1-3906-4de7-95e7-4113c45ab3d6 up in Southbound
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.830 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.832 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.836 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[df4cdc85-9b36-497d-981b-9cc36722b79b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.836 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4d1865a1-91 in ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.839 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4d1865a1-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.839 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c87adde1-b121-4bfa-aca1-40c62b4fb5cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.840 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[da108d5b-d4de-42de-b71b-99ce6bf2c6b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.850 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[320b9c36-f720-4cb1-b911-19a58f79f69f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:53 compute-0 systemd-machined[154182]: New machine qemu-83-instance-000000a2.
Jan 22 00:25:53 compute-0 systemd[1]: Started Virtual Machine qemu-83-instance-000000a2.
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.865 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2665f11a-4ba8-472d-a30b-da94ef7a4600]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.877 182939 DEBUG nova.compute.manager [req-761d0c29-1492-4bea-b035-316fd3b78fb1 req-b296f240-0390-4500-b4ad-3e6252a0879f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.878 182939 DEBUG oslo_concurrency.lockutils [req-761d0c29-1492-4bea-b035-316fd3b78fb1 req-b296f240-0390-4500-b4ad-3e6252a0879f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.879 182939 DEBUG oslo_concurrency.lockutils [req-761d0c29-1492-4bea-b035-316fd3b78fb1 req-b296f240-0390-4500-b4ad-3e6252a0879f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.879 182939 DEBUG oslo_concurrency.lockutils [req-761d0c29-1492-4bea-b035-316fd3b78fb1 req-b296f240-0390-4500-b4ad-3e6252a0879f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.879 182939 DEBUG nova.compute.manager [req-761d0c29-1492-4bea-b035-316fd3b78fb1 req-b296f240-0390-4500-b4ad-3e6252a0879f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] No waiting events found dispatching network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.879 182939 WARNING nova.compute.manager [req-761d0c29-1492-4bea-b035-316fd3b78fb1 req-b296f240-0390-4500-b4ad-3e6252a0879f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received unexpected event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 for instance with vm_state active and task_state rebuild_spawning.
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.892 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7571b230-e8ce-4d4d-8b18-494e2c91989f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.899 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4c664a94-a1d7-47b5-aa0c-d81c3b000b8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:53 compute-0 NetworkManager[55139]: <info>  [1769041553.9000] manager: (tap4d1865a1-90): new Veth device (/org/freedesktop/NetworkManager/Devices/308)
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.928 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[48ff84c4-1970-491b-8b47-72112a1b3426]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.932 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[0516448f-4c54-444b-aa10-f51090b5044a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.935 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.936 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.936 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:25:53 compute-0 NetworkManager[55139]: <info>  [1769041553.9538] device (tap4d1865a1-90): carrier: link connected
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.957 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.958 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.958 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:25:53 compute-0 nova_compute[182935]: 2026-01-22 00:25:53.958 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 063fddce-2634-4aa5-a9c5-17ca977ea05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.960 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[80ee2cac-2a22-4f04-8080-95f84a6f2e68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:53 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:53.979 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[52cc33c5-a766-47d4-97b0-d03db777be2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d1865a1-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:ff:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605669, 'reachable_time': 22150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239054, 'error': None, 'target': 'ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.001 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[55a1984d-3cdc-49de-8eca-d5dc878aabe4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fead:ff60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 605669, 'tstamp': 605669}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239055, 'error': None, 'target': 'ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.023 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[809458ca-9d68-4ea1-9123-796017711b29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4d1865a1-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:ff:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605669, 'reachable_time': 22150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239056, 'error': None, 'target': 'ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.057 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[95c5dfdf-f5ad-4fd0-8c41-9b64414f1866]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.113 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for 063fddce-2634-4aa5-a9c5-17ca977ea05a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.113 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041554.112632, 063fddce-2634-4aa5-a9c5-17ca977ea05a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.114 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] VM Resumed (Lifecycle Event)
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.116 182939 DEBUG nova.compute.manager [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.115 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b605170e-b6dc-4cd7-997d-36d9b26f463d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.116 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.117 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d1865a1-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.117 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.117 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d1865a1-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.119 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:54 compute-0 kernel: tap4d1865a1-90: entered promiscuous mode
Jan 22 00:25:54 compute-0 NetworkManager[55139]: <info>  [1769041554.1204] manager: (tap4d1865a1-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/309)
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.123 182939 INFO nova.virt.libvirt.driver [-] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Instance spawned successfully.
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.124 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.126 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4d1865a1-90, col_values=(('external_ids', {'iface-id': '3df5b1a0-ef2c-4842-a03e-07f65c7f72f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.128 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:54 compute-0 ovn_controller[95047]: 2026-01-22T00:25:54Z|00636|binding|INFO|Releasing lport 3df5b1a0-ef2c-4842-a03e-07f65c7f72f2 from this chassis (sb_readonly=0)
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.129 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4d1865a1-9397-451c-b754-7ed1f784f0b7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4d1865a1-9397-451c-b754-7ed1f784f0b7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.131 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0127e4a6-f20a-4a29-9174-e0f68fa619df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.133 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-4d1865a1-9397-451c-b754-7ed1f784f0b7
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/4d1865a1-9397-451c-b754-7ed1f784f0b7.pid.haproxy
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 4d1865a1-9397-451c-b754-7ed1f784f0b7
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:25:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:25:54.134 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'env', 'PROCESS_TAG=haproxy-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4d1865a1-9397-451c-b754-7ed1f784f0b7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.141 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.159 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.166 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.169 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.169 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.170 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.170 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.170 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.171 182939 DEBUG nova.virt.libvirt.driver [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.202 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.202 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041554.1135077, 063fddce-2634-4aa5-a9c5-17ca977ea05a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.202 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] VM Started (Lifecycle Event)
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.259 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.262 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.304 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.330 182939 DEBUG nova.compute.manager [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.409 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.411 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.411 182939 DEBUG nova.objects.instance [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 22 00:25:54 compute-0 podman[239095]: 2026-01-22 00:25:54.498330419 +0000 UTC m=+0.048905555 container create 06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 00:25:54 compute-0 systemd[1]: Started libpod-conmon-06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673.scope.
Jan 22 00:25:54 compute-0 nova_compute[182935]: 2026-01-22 00:25:54.538 182939 DEBUG oslo_concurrency.lockutils [None req-fb48fb20-03b9-4a86-9500-c7cfbdf0d9fe 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:54 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:25:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69b697db9bd95cb85d57509bd55a89f7144c9783f9fa89d4d3064593d62b888b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:25:54 compute-0 podman[239095]: 2026-01-22 00:25:54.470058456 +0000 UTC m=+0.020633612 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:25:54 compute-0 podman[239095]: 2026-01-22 00:25:54.573744704 +0000 UTC m=+0.124319850 container init 06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:25:54 compute-0 podman[239095]: 2026-01-22 00:25:54.578718473 +0000 UTC m=+0.129293609 container start 06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:25:54 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[239110]: [NOTICE]   (239114) : New worker (239116) forked
Jan 22 00:25:54 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[239110]: [NOTICE]   (239114) : Loading success.
Jan 22 00:25:55 compute-0 podman[239126]: 2026-01-22 00:25:55.678530345 +0000 UTC m=+0.051206269 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:25:55 compute-0 podman[239125]: 2026-01-22 00:25:55.707230488 +0000 UTC m=+0.083696053 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:25:55 compute-0 nova_compute[182935]: 2026-01-22 00:25:55.911 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Updating instance_info_cache with network_info: [{"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:25:55 compute-0 nova_compute[182935]: 2026-01-22 00:25:55.928 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:25:55 compute-0 nova_compute[182935]: 2026-01-22 00:25:55.928 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:25:55 compute-0 nova_compute[182935]: 2026-01-22 00:25:55.929 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:55 compute-0 nova_compute[182935]: 2026-01-22 00:25:55.929 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:55 compute-0 nova_compute[182935]: 2026-01-22 00:25:55.929 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.479 182939 DEBUG nova.compute.manager [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.479 182939 DEBUG oslo_concurrency.lockutils [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.480 182939 DEBUG oslo_concurrency.lockutils [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.480 182939 DEBUG oslo_concurrency.lockutils [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.480 182939 DEBUG nova.compute.manager [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] No waiting events found dispatching network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.480 182939 WARNING nova.compute.manager [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received unexpected event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 for instance with vm_state active and task_state None.
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.481 182939 DEBUG nova.compute.manager [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.481 182939 DEBUG oslo_concurrency.lockutils [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.481 182939 DEBUG oslo_concurrency.lockutils [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.482 182939 DEBUG oslo_concurrency.lockutils [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.482 182939 DEBUG nova.compute.manager [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] No waiting events found dispatching network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:25:56 compute-0 nova_compute[182935]: 2026-01-22 00:25:56.482 182939 WARNING nova.compute.manager [req-cafa2e31-aafd-4087-9d47-060bdc3b0a3b req-9c0480c1-28cf-4f1d-8090-c0b781b70b3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received unexpected event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 for instance with vm_state active and task_state None.
Jan 22 00:25:56 compute-0 podman[239175]: 2026-01-22 00:25:56.683101782 +0000 UTC m=+0.053363001 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:25:57 compute-0 nova_compute[182935]: 2026-01-22 00:25:57.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:25:57 compute-0 nova_compute[182935]: 2026-01-22 00:25:57.932 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:58 compute-0 nova_compute[182935]: 2026-01-22 00:25:58.727 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:25:59 compute-0 nova_compute[182935]: 2026-01-22 00:25:59.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:00 compute-0 nova_compute[182935]: 2026-01-22 00:26:00.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:02 compute-0 nova_compute[182935]: 2026-01-22 00:26:02.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:02 compute-0 nova_compute[182935]: 2026-01-22 00:26:02.977 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:03.222 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:03.223 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:03.223 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:03 compute-0 podman[239198]: 2026-01-22 00:26:03.681491087 +0000 UTC m=+0.046828085 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:26:03 compute-0 nova_compute[182935]: 2026-01-22 00:26:03.729 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:04 compute-0 sshd-session[239217]: Invalid user mongodb from 188.166.69.60 port 59288
Jan 22 00:26:04 compute-0 sshd-session[239217]: Connection closed by invalid user mongodb 188.166.69.60 port 59288 [preauth]
Jan 22 00:26:06 compute-0 nova_compute[182935]: 2026-01-22 00:26:06.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:07 compute-0 ovn_controller[95047]: 2026-01-22T00:26:07Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d7:78:96 10.100.0.11
Jan 22 00:26:07 compute-0 ovn_controller[95047]: 2026-01-22T00:26:07Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d7:78:96 10.100.0.11
Jan 22 00:26:07 compute-0 nova_compute[182935]: 2026-01-22 00:26:07.979 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:08 compute-0 nova_compute[182935]: 2026-01-22 00:26:08.729 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:08 compute-0 nova_compute[182935]: 2026-01-22 00:26:08.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:09 compute-0 nova_compute[182935]: 2026-01-22 00:26:09.670 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:09.669 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:26:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:09.669 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:26:09 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:09.670 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:09 compute-0 podman[239235]: 2026-01-22 00:26:09.691904149 +0000 UTC m=+0.055219616 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, maintainer=Red Hat, Inc.)
Jan 22 00:26:09 compute-0 podman[239236]: 2026-01-22 00:26:09.698177358 +0000 UTC m=+0.058565845 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:26:12 compute-0 nova_compute[182935]: 2026-01-22 00:26:12.981 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:13 compute-0 nova_compute[182935]: 2026-01-22 00:26:13.780 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:13 compute-0 nova_compute[182935]: 2026-01-22 00:26:13.871 182939 INFO nova.compute.manager [None req-d657c363-8301-4737-928c-6b42d9597b90 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Get console output
Jan 22 00:26:13 compute-0 nova_compute[182935]: 2026-01-22 00:26:13.877 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:26:14 compute-0 nova_compute[182935]: 2026-01-22 00:26:14.901 182939 DEBUG nova.compute.manager [req-a6a3e081-46d1-4191-bc86-564b000586dd req-b03257e8-dead-44f6-b1f3-2f117c42bf10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-changed-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:26:14 compute-0 nova_compute[182935]: 2026-01-22 00:26:14.901 182939 DEBUG nova.compute.manager [req-a6a3e081-46d1-4191-bc86-564b000586dd req-b03257e8-dead-44f6-b1f3-2f117c42bf10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Refreshing instance network info cache due to event network-changed-21fc60b1-3906-4de7-95e7-4113c45ab3d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:26:14 compute-0 nova_compute[182935]: 2026-01-22 00:26:14.902 182939 DEBUG oslo_concurrency.lockutils [req-a6a3e081-46d1-4191-bc86-564b000586dd req-b03257e8-dead-44f6-b1f3-2f117c42bf10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:26:14 compute-0 nova_compute[182935]: 2026-01-22 00:26:14.902 182939 DEBUG oslo_concurrency.lockutils [req-a6a3e081-46d1-4191-bc86-564b000586dd req-b03257e8-dead-44f6-b1f3-2f117c42bf10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:26:14 compute-0 nova_compute[182935]: 2026-01-22 00:26:14.902 182939 DEBUG nova.network.neutron [req-a6a3e081-46d1-4191-bc86-564b000586dd req-b03257e8-dead-44f6-b1f3-2f117c42bf10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Refreshing network info cache for port 21fc60b1-3906-4de7-95e7-4113c45ab3d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:26:14 compute-0 nova_compute[182935]: 2026-01-22 00:26:14.996 182939 DEBUG oslo_concurrency.lockutils [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:14 compute-0 nova_compute[182935]: 2026-01-22 00:26:14.996 182939 DEBUG oslo_concurrency.lockutils [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:14 compute-0 nova_compute[182935]: 2026-01-22 00:26:14.997 182939 DEBUG oslo_concurrency.lockutils [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:14 compute-0 nova_compute[182935]: 2026-01-22 00:26:14.997 182939 DEBUG oslo_concurrency.lockutils [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:14 compute-0 nova_compute[182935]: 2026-01-22 00:26:14.997 182939 DEBUG oslo_concurrency.lockutils [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.011 182939 INFO nova.compute.manager [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Terminating instance
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.027 182939 DEBUG nova.compute.manager [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:26:15 compute-0 kernel: tap21fc60b1-39 (unregistering): left promiscuous mode
Jan 22 00:26:15 compute-0 NetworkManager[55139]: <info>  [1769041575.0537] device (tap21fc60b1-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.074 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:15 compute-0 ovn_controller[95047]: 2026-01-22T00:26:15Z|00637|binding|INFO|Releasing lport 21fc60b1-3906-4de7-95e7-4113c45ab3d6 from this chassis (sb_readonly=0)
Jan 22 00:26:15 compute-0 ovn_controller[95047]: 2026-01-22T00:26:15Z|00638|binding|INFO|Setting lport 21fc60b1-3906-4de7-95e7-4113c45ab3d6 down in Southbound
Jan 22 00:26:15 compute-0 ovn_controller[95047]: 2026-01-22T00:26:15Z|00639|binding|INFO|Removing iface tap21fc60b1-39 ovn-installed in OVS
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.079 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.095 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.096 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:78:96 10.100.0.11'], port_security=['fa:16:3e:d7:78:96 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '063fddce-2634-4aa5-a9c5-17ca977ea05a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '6', 'neutron:security_group_ids': '49b0e0b5-57a2-46f3-afea-009e30140090', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d1fe7e8-9bbf-49b0-85b1-1237f5ccda8d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=21fc60b1-3906-4de7-95e7-4113c45ab3d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.097 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 21fc60b1-3906-4de7-95e7-4113c45ab3d6 in datapath 4d1865a1-9397-451c-b754-7ed1f784f0b7 unbound from our chassis
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.098 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4d1865a1-9397-451c-b754-7ed1f784f0b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.099 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ece756-ca28-4e5d-86c5-45354740161b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.100 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7 namespace which is not needed anymore
Jan 22 00:26:15 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Jan 22 00:26:15 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000a2.scope: Consumed 12.855s CPU time.
Jan 22 00:26:15 compute-0 systemd-machined[154182]: Machine qemu-83-instance-000000a2 terminated.
Jan 22 00:26:15 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[239110]: [NOTICE]   (239114) : haproxy version is 2.8.14-c23fe91
Jan 22 00:26:15 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[239110]: [NOTICE]   (239114) : path to executable is /usr/sbin/haproxy
Jan 22 00:26:15 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[239110]: [WARNING]  (239114) : Exiting Master process...
Jan 22 00:26:15 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[239110]: [ALERT]    (239114) : Current worker (239116) exited with code 143 (Terminated)
Jan 22 00:26:15 compute-0 neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7[239110]: [WARNING]  (239114) : All workers exited. Exiting... (0)
Jan 22 00:26:15 compute-0 systemd[1]: libpod-06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673.scope: Deactivated successfully.
Jan 22 00:26:15 compute-0 podman[239296]: 2026-01-22 00:26:15.229195063 +0000 UTC m=+0.047746987 container died 06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 00:26:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673-userdata-shm.mount: Deactivated successfully.
Jan 22 00:26:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-69b697db9bd95cb85d57509bd55a89f7144c9783f9fa89d4d3064593d62b888b-merged.mount: Deactivated successfully.
Jan 22 00:26:15 compute-0 podman[239296]: 2026-01-22 00:26:15.26184114 +0000 UTC m=+0.080393064 container cleanup 06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:26:15 compute-0 systemd[1]: libpod-conmon-06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673.scope: Deactivated successfully.
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.289 182939 INFO nova.virt.libvirt.driver [-] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Instance destroyed successfully.
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.290 182939 DEBUG nova.objects.instance [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid 063fddce-2634-4aa5-a9c5-17ca977ea05a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.307 182939 DEBUG nova.virt.libvirt.vif [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-22T00:25:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-507265383',display_name='tempest-TestNetworkAdvancedServerOps-server-507265383',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-507265383',id=162,image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBV5lrE/0o3wylW8styAqIEq3o6gpKICYx+OOjZpZ4HsoPBmWU0vw3pi82ayKIiy54WBno0h74p6C9GNm4wvSrVujqZ6E1U9i7nevpU2R6qM9p7TZcVT1ycIcAKLfUQOYQ==',key_name='tempest-TestNetworkAdvancedServerOps-1079565092',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:25:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-w0q2r970',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3e1dda74-3c6a-4d29-8792-32134d1c36c5',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:25:54Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=063fddce-2634-4aa5-a9c5-17ca977ea05a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.308 182939 DEBUG nova.network.os_vif_util [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.309 182939 DEBUG nova.network.os_vif_util [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.309 182939 DEBUG os_vif [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.311 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.311 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap21fc60b1-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.312 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.315 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.317 182939 INFO os_vif [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:78:96,bridge_name='br-int',has_traffic_filtering=True,id=21fc60b1-3906-4de7-95e7-4113c45ab3d6,network=Network(4d1865a1-9397-451c-b754-7ed1f784f0b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21fc60b1-39')
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.318 182939 INFO nova.virt.libvirt.driver [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Deleting instance files /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a_del
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.318 182939 INFO nova.virt.libvirt.driver [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Deletion of /var/lib/nova/instances/063fddce-2634-4aa5-a9c5-17ca977ea05a_del complete
Jan 22 00:26:15 compute-0 podman[239340]: 2026-01-22 00:26:15.326724304 +0000 UTC m=+0.041534529 container remove 06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.331 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[899f43bc-4058-4833-a268-8eb2084ef067]: (4, ('Thu Jan 22 12:26:15 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7 (06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673)\n06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673\nThu Jan 22 12:26:15 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7 (06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673)\n06511503270c05975f71089ad053bfd4ed6d0c647fef39d1ebb17c130363b673\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.333 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c47e8979-c7de-4e9a-b0f4-efa895411bad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.334 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d1865a1-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.335 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:15 compute-0 kernel: tap4d1865a1-90: left promiscuous mode
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.349 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.352 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd137b8-8ed8-4ec4-908e-cde9ce6f98c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.376 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[70df51f3-ac6d-481f-b8e5-b587a4d9b757]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.378 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bc1265e8-2f25-487f-9c50-d402d05d45c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.393 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ee804dee-ee0d-48e9-9a9e-53fc570ef568]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605662, 'reachable_time': 37029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239357, 'error': None, 'target': 'ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:15 compute-0 systemd[1]: run-netns-ovnmeta\x2d4d1865a1\x2d9397\x2d451c\x2db754\x2d7ed1f784f0b7.mount: Deactivated successfully.
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.397 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4d1865a1-9397-451c-b754-7ed1f784f0b7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:26:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:26:15.397 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb14d8f-8f46-4dcd-ab0d-154a1e4f9cfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.409 182939 INFO nova.compute.manager [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.409 182939 DEBUG oslo.service.loopingcall [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.410 182939 DEBUG nova.compute.manager [-] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.410 182939 DEBUG nova.network.neutron [-] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:26:15 compute-0 nova_compute[182935]: 2026-01-22 00:26:15.835 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:16 compute-0 nova_compute[182935]: 2026-01-22 00:26:16.016 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:16 compute-0 nova_compute[182935]: 2026-01-22 00:26:16.910 182939 DEBUG nova.network.neutron [req-a6a3e081-46d1-4191-bc86-564b000586dd req-b03257e8-dead-44f6-b1f3-2f117c42bf10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Updated VIF entry in instance network info cache for port 21fc60b1-3906-4de7-95e7-4113c45ab3d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:26:16 compute-0 nova_compute[182935]: 2026-01-22 00:26:16.911 182939 DEBUG nova.network.neutron [req-a6a3e081-46d1-4191-bc86-564b000586dd req-b03257e8-dead-44f6-b1f3-2f117c42bf10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Updating instance_info_cache with network_info: [{"id": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "address": "fa:16:3e:d7:78:96", "network": {"id": "4d1865a1-9397-451c-b754-7ed1f784f0b7", "bridge": "br-int", "label": "tempest-network-smoke--1838177961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21fc60b1-39", "ovs_interfaceid": "21fc60b1-3906-4de7-95e7-4113c45ab3d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.097 182939 DEBUG nova.compute.manager [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-vif-unplugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.098 182939 DEBUG oslo_concurrency.lockutils [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.099 182939 DEBUG oslo_concurrency.lockutils [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.100 182939 DEBUG oslo_concurrency.lockutils [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.100 182939 DEBUG nova.compute.manager [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] No waiting events found dispatching network-vif-unplugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.101 182939 DEBUG nova.compute.manager [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-vif-unplugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.101 182939 DEBUG nova.compute.manager [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.102 182939 DEBUG oslo_concurrency.lockutils [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.102 182939 DEBUG oslo_concurrency.lockutils [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.103 182939 DEBUG oslo_concurrency.lockutils [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.104 182939 DEBUG nova.compute.manager [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] No waiting events found dispatching network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.104 182939 WARNING nova.compute.manager [req-3b75629c-ee6e-4b58-94fe-940a46b540d5 req-10dbccac-e7f6-4bd5-b49b-9548e0482bf6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received unexpected event network-vif-plugged-21fc60b1-3906-4de7-95e7-4113c45ab3d6 for instance with vm_state active and task_state deleting.
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.133 182939 DEBUG oslo_concurrency.lockutils [req-a6a3e081-46d1-4191-bc86-564b000586dd req-b03257e8-dead-44f6-b1f3-2f117c42bf10 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-063fddce-2634-4aa5-a9c5-17ca977ea05a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:26:17 compute-0 nova_compute[182935]: 2026-01-22 00:26:17.234 182939 DEBUG nova.network.neutron [-] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:26:18 compute-0 nova_compute[182935]: 2026-01-22 00:26:18.285 182939 INFO nova.compute.manager [-] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Took 2.87 seconds to deallocate network for instance.
Jan 22 00:26:18 compute-0 nova_compute[182935]: 2026-01-22 00:26:18.782 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:19 compute-0 nova_compute[182935]: 2026-01-22 00:26:19.550 182939 DEBUG oslo_concurrency.lockutils [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:19 compute-0 nova_compute[182935]: 2026-01-22 00:26:19.550 182939 DEBUG oslo_concurrency.lockutils [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:19 compute-0 nova_compute[182935]: 2026-01-22 00:26:19.870 182939 DEBUG nova.compute.manager [req-bbce073e-1041-46be-be9a-06fe8598f529 req-db395471-a9b2-44e3-9e2c-d086dd2ea723 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Received event network-vif-deleted-21fc60b1-3906-4de7-95e7-4113c45ab3d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:26:19 compute-0 nova_compute[182935]: 2026-01-22 00:26:19.882 182939 DEBUG nova.compute.provider_tree [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:26:19 compute-0 nova_compute[182935]: 2026-01-22 00:26:19.916 182939 DEBUG nova.scheduler.client.report [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:26:20 compute-0 nova_compute[182935]: 2026-01-22 00:26:20.045 182939 DEBUG oslo_concurrency.lockutils [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:20 compute-0 nova_compute[182935]: 2026-01-22 00:26:20.210 182939 INFO nova.scheduler.client.report [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Deleted allocations for instance 063fddce-2634-4aa5-a9c5-17ca977ea05a
Jan 22 00:26:20 compute-0 nova_compute[182935]: 2026-01-22 00:26:20.313 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:20 compute-0 nova_compute[182935]: 2026-01-22 00:26:20.807 182939 DEBUG oslo_concurrency.lockutils [None req-ab15a7b8-f86d-4808-a9ca-0a96d132bcf9 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "063fddce-2634-4aa5-a9c5-17ca977ea05a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:26:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:26:23 compute-0 nova_compute[182935]: 2026-01-22 00:26:23.784 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:25 compute-0 nova_compute[182935]: 2026-01-22 00:26:25.315 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:26 compute-0 podman[239359]: 2026-01-22 00:26:26.699879498 +0000 UTC m=+0.066717729 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:26:26 compute-0 podman[239358]: 2026-01-22 00:26:26.745238488 +0000 UTC m=+0.108207707 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:26:26 compute-0 podman[239399]: 2026-01-22 00:26:26.769423193 +0000 UTC m=+0.046586779 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:26:28 compute-0 nova_compute[182935]: 2026-01-22 00:26:28.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:30 compute-0 nova_compute[182935]: 2026-01-22 00:26:30.288 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041575.2875407, 063fddce-2634-4aa5-a9c5-17ca977ea05a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:26:30 compute-0 nova_compute[182935]: 2026-01-22 00:26:30.288 182939 INFO nova.compute.manager [-] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] VM Stopped (Lifecycle Event)
Jan 22 00:26:30 compute-0 nova_compute[182935]: 2026-01-22 00:26:30.317 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:30 compute-0 nova_compute[182935]: 2026-01-22 00:26:30.324 182939 DEBUG nova.compute.manager [None req-fde7b14d-f408-4d6f-99ee-0f5ffa77fb95 - - - - - -] [instance: 063fddce-2634-4aa5-a9c5-17ca977ea05a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:26:33 compute-0 nova_compute[182935]: 2026-01-22 00:26:33.828 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:34 compute-0 podman[239433]: 2026-01-22 00:26:34.675650943 +0000 UTC m=+0.049666194 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:26:35 compute-0 nova_compute[182935]: 2026-01-22 00:26:35.319 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:38 compute-0 nova_compute[182935]: 2026-01-22 00:26:38.830 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:40 compute-0 nova_compute[182935]: 2026-01-22 00:26:40.321 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:40 compute-0 podman[239456]: 2026-01-22 00:26:40.696447554 +0000 UTC m=+0.068935712 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 22 00:26:40 compute-0 podman[239455]: 2026-01-22 00:26:40.699271791 +0000 UTC m=+0.073909730 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Jan 22 00:26:43 compute-0 nova_compute[182935]: 2026-01-22 00:26:43.832 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:45 compute-0 nova_compute[182935]: 2026-01-22 00:26:45.322 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:47 compute-0 sshd-session[239499]: Invalid user mongodb from 188.166.69.60 port 41512
Jan 22 00:26:47 compute-0 sshd-session[239499]: Connection closed by invalid user mongodb 188.166.69.60 port 41512 [preauth]
Jan 22 00:26:48 compute-0 nova_compute[182935]: 2026-01-22 00:26:48.834 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.491 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.491 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.514 182939 DEBUG nova.compute.manager [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.620 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.621 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.627 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.627 182939 INFO nova.compute.claims [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.735 182939 DEBUG nova.compute.provider_tree [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.756 182939 DEBUG nova.scheduler.client.report [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.786 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.787 182939 DEBUG nova.compute.manager [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.835 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.835 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.836 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.836 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.881 182939 DEBUG nova.compute.manager [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.881 182939 DEBUG nova.network.neutron [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.903 182939 INFO nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:26:49 compute-0 nova_compute[182935]: 2026-01-22 00:26:49.929 182939 DEBUG nova.compute.manager [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.047 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.049 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5684MB free_disk=73.12690734863281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.050 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.050 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.065 182939 DEBUG nova.compute.manager [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.067 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.067 182939 INFO nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Creating image(s)
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.068 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.068 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.069 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.089 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.155 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.157 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.158 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.182 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.209 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 53686c08-86df-445a-b433-6a2c7c590fdb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.210 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.210 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.221 182939 DEBUG nova.policy [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.246 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.247 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.310 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.326 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.331 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.380 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.381 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.510 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk 1073741824" returned: 0 in 0.263s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.512 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.513 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.612 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.614 182939 DEBUG nova.virt.disk.api [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.615 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.701 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.702 182939 DEBUG nova.virt.disk.api [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.703 182939 DEBUG nova.objects.instance [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid 53686c08-86df-445a-b433-6a2c7c590fdb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.726 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.726 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Ensure instance console log exists: /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.727 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.727 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:50 compute-0 nova_compute[182935]: 2026-01-22 00:26:50.728 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:53 compute-0 nova_compute[182935]: 2026-01-22 00:26:53.383 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:53 compute-0 nova_compute[182935]: 2026-01-22 00:26:53.383 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:26:53 compute-0 nova_compute[182935]: 2026-01-22 00:26:53.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:53 compute-0 nova_compute[182935]: 2026-01-22 00:26:53.836 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:54 compute-0 nova_compute[182935]: 2026-01-22 00:26:54.012 182939 DEBUG nova.network.neutron [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Successfully created port: ac62ef89-aec4-41c9-83dd-366bdfc1c0bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:26:54 compute-0 nova_compute[182935]: 2026-01-22 00:26:54.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:54 compute-0 nova_compute[182935]: 2026-01-22 00:26:54.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:26:54 compute-0 nova_compute[182935]: 2026-01-22 00:26:54.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:26:54 compute-0 nova_compute[182935]: 2026-01-22 00:26:54.818 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 22 00:26:54 compute-0 nova_compute[182935]: 2026-01-22 00:26:54.818 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:26:55 compute-0 nova_compute[182935]: 2026-01-22 00:26:55.329 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:56 compute-0 nova_compute[182935]: 2026-01-22 00:26:56.072 182939 DEBUG nova.network.neutron [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Successfully updated port: ac62ef89-aec4-41c9-83dd-366bdfc1c0bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:26:56 compute-0 nova_compute[182935]: 2026-01-22 00:26:56.097 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:26:56 compute-0 nova_compute[182935]: 2026-01-22 00:26:56.098 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:26:56 compute-0 nova_compute[182935]: 2026-01-22 00:26:56.098 182939 DEBUG nova.network.neutron [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:26:56 compute-0 nova_compute[182935]: 2026-01-22 00:26:56.268 182939 DEBUG nova.network.neutron [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:26:56 compute-0 nova_compute[182935]: 2026-01-22 00:26:56.352 182939 DEBUG nova.compute.manager [req-adc1e65a-2802-496b-ace9-5d5fc5208d9c req-20389400-6022-4cb2-95eb-d20ca066a68a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-changed-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:26:56 compute-0 nova_compute[182935]: 2026-01-22 00:26:56.353 182939 DEBUG nova.compute.manager [req-adc1e65a-2802-496b-ace9-5d5fc5208d9c req-20389400-6022-4cb2-95eb-d20ca066a68a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Refreshing instance network info cache due to event network-changed-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:26:56 compute-0 nova_compute[182935]: 2026-01-22 00:26:56.353 182939 DEBUG oslo_concurrency.lockutils [req-adc1e65a-2802-496b-ace9-5d5fc5208d9c req-20389400-6022-4cb2-95eb-d20ca066a68a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:26:57 compute-0 podman[239517]: 2026-01-22 00:26:57.686739538 +0000 UTC m=+0.058323208 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:26:57 compute-0 podman[239519]: 2026-01-22 00:26:57.69225069 +0000 UTC m=+0.055377719 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:26:57 compute-0 podman[239518]: 2026-01-22 00:26:57.744158675 +0000 UTC m=+0.106973986 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 22 00:26:58 compute-0 nova_compute[182935]: 2026-01-22 00:26:58.838 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.603 182939 DEBUG nova.network.neutron [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating instance_info_cache with network_info: [{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.702 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.703 182939 DEBUG nova.compute.manager [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Instance network_info: |[{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.703 182939 DEBUG oslo_concurrency.lockutils [req-adc1e65a-2802-496b-ace9-5d5fc5208d9c req-20389400-6022-4cb2-95eb-d20ca066a68a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.703 182939 DEBUG nova.network.neutron [req-adc1e65a-2802-496b-ace9-5d5fc5208d9c req-20389400-6022-4cb2-95eb-d20ca066a68a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Refreshing network info cache for port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.706 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Start _get_guest_xml network_info=[{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.711 182939 WARNING nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.714 182939 DEBUG nova.virt.libvirt.host [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.715 182939 DEBUG nova.virt.libvirt.host [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.718 182939 DEBUG nova.virt.libvirt.host [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.719 182939 DEBUG nova.virt.libvirt.host [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.720 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.721 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.721 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.722 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.722 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.722 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.723 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.723 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.723 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.724 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.724 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.725 182939 DEBUG nova.virt.hardware [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.730 182939 DEBUG nova.virt.libvirt.vif [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:26:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1419654419',display_name='tempest-TestNetworkAdvancedServerOps-server-1419654419',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1419654419',id=166,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5yIuZRAnyr0fY4MG0JJtrl2YmC7LkxFjTLIDSY0MneCjEwMPb+R0i/C3i76549W+tX7/jAJYcJ/Zhm6OZjTa9donlIVfIM40NClFZ/uOy/0cpEFxgyZR4Q96O10ulXCA==',key_name='tempest-TestNetworkAdvancedServerOps-623011376',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1itpmhcp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:26:49Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=53686c08-86df-445a-b433-6a2c7c590fdb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.731 182939 DEBUG nova.network.os_vif_util [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.731 182939 DEBUG nova.network.os_vif_util [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.733 182939 DEBUG nova.objects.instance [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid 53686c08-86df-445a-b433-6a2c7c590fdb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.749 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:26:59 compute-0 nova_compute[182935]:   <uuid>53686c08-86df-445a-b433-6a2c7c590fdb</uuid>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   <name>instance-000000a6</name>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1419654419</nova:name>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:26:59</nova:creationTime>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:26:59 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:26:59 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:26:59 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:26:59 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:26:59 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:26:59 compute-0 nova_compute[182935]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:26:59 compute-0 nova_compute[182935]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:26:59 compute-0 nova_compute[182935]:         <nova:port uuid="ac62ef89-aec4-41c9-83dd-366bdfc1c0bd">
Jan 22 00:26:59 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <system>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <entry name="serial">53686c08-86df-445a-b433-6a2c7c590fdb</entry>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <entry name="uuid">53686c08-86df-445a-b433-6a2c7c590fdb</entry>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     </system>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   <os>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   </os>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   <features>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   </features>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.config"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:5b:20:cd"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <target dev="tapac62ef89-ae"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/console.log" append="off"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <video>
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     </video>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:26:59 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:26:59 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:26:59 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:26:59 compute-0 nova_compute[182935]: </domain>
Jan 22 00:26:59 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.751 182939 DEBUG nova.compute.manager [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Preparing to wait for external event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.752 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.753 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.753 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.755 182939 DEBUG nova.virt.libvirt.vif [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:26:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1419654419',display_name='tempest-TestNetworkAdvancedServerOps-server-1419654419',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1419654419',id=166,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5yIuZRAnyr0fY4MG0JJtrl2YmC7LkxFjTLIDSY0MneCjEwMPb+R0i/C3i76549W+tX7/jAJYcJ/Zhm6OZjTa9donlIVfIM40NClFZ/uOy/0cpEFxgyZR4Q96O10ulXCA==',key_name='tempest-TestNetworkAdvancedServerOps-623011376',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1itpmhcp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:26:49Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=53686c08-86df-445a-b433-6a2c7c590fdb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.755 182939 DEBUG nova.network.os_vif_util [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.756 182939 DEBUG nova.network.os_vif_util [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.757 182939 DEBUG os_vif [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.758 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.759 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.760 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.766 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.767 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac62ef89-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.768 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapac62ef89-ae, col_values=(('external_ids', {'iface-id': 'ac62ef89-aec4-41c9-83dd-366bdfc1c0bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:20:cd', 'vm-uuid': '53686c08-86df-445a-b433-6a2c7c590fdb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.770 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:59 compute-0 NetworkManager[55139]: <info>  [1769041619.7711] manager: (tapac62ef89-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/310)
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.772 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.777 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.779 182939 INFO os_vif [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae')
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.837 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.838 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.838 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No VIF found with MAC fa:16:3e:5b:20:cd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:26:59 compute-0 nova_compute[182935]: 2026-01-22 00:26:59.838 182939 INFO nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Using config drive
Jan 22 00:27:00 compute-0 nova_compute[182935]: 2026-01-22 00:27:00.739 182939 INFO nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Creating config drive at /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.config
Jan 22 00:27:00 compute-0 nova_compute[182935]: 2026-01-22 00:27:00.744 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1136_2aj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:00 compute-0 nova_compute[182935]: 2026-01-22 00:27:00.872 182939 DEBUG oslo_concurrency.processutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1136_2aj" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:00 compute-0 kernel: tapac62ef89-ae: entered promiscuous mode
Jan 22 00:27:00 compute-0 nova_compute[182935]: 2026-01-22 00:27:00.954 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:00 compute-0 NetworkManager[55139]: <info>  [1769041620.9563] manager: (tapac62ef89-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/311)
Jan 22 00:27:00 compute-0 ovn_controller[95047]: 2026-01-22T00:27:00Z|00640|binding|INFO|Claiming lport ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for this chassis.
Jan 22 00:27:00 compute-0 ovn_controller[95047]: 2026-01-22T00:27:00Z|00641|binding|INFO|ac62ef89-aec4-41c9-83dd-366bdfc1c0bd: Claiming fa:16:3e:5b:20:cd 10.100.0.10
Jan 22 00:27:00 compute-0 nova_compute[182935]: 2026-01-22 00:27:00.958 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:00 compute-0 nova_compute[182935]: 2026-01-22 00:27:00.965 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:00 compute-0 nova_compute[182935]: 2026-01-22 00:27:00.969 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:00 compute-0 systemd-udevd[239609]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:27:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:00.988 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:20:cd 10.100.0.10'], port_security=['fa:16:3e:5b:20:cd 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '53686c08-86df-445a-b433-6a2c7c590fdb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b1ad694-cd0e-4047-b840-b090066a26f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ddb2905c-b7d9-4e7e-b5f3-61f1bd651115', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b94cbf-fe21-425f-b9b0-192a8a6fba61, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:27:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:00.989 104408 INFO neutron.agent.ovn.metadata.agent [-] Port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd in datapath 3b1ad694-cd0e-4047-b840-b090066a26f4 bound to our chassis
Jan 22 00:27:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:00.990 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3b1ad694-cd0e-4047-b840-b090066a26f4
Jan 22 00:27:00 compute-0 systemd-machined[154182]: New machine qemu-84-instance-000000a6.
Jan 22 00:27:01 compute-0 NetworkManager[55139]: <info>  [1769041621.0039] device (tapac62ef89-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:27:01 compute-0 NetworkManager[55139]: <info>  [1769041621.0051] device (tapac62ef89-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.008 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d882ad68-c4f5-427b-be18-70ad0d215b1f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.010 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3b1ad694-c1 in ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.012 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3b1ad694-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.012 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[60659058-3306-4c0f-b06e-54ba18a2e39d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.013 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[12dd6aa9-2fe0-4fc9-ba18-ea57a448a65b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_controller[95047]: 2026-01-22T00:27:01Z|00642|binding|INFO|Setting lport ac62ef89-aec4-41c9-83dd-366bdfc1c0bd ovn-installed in OVS
Jan 22 00:27:01 compute-0 ovn_controller[95047]: 2026-01-22T00:27:01Z|00643|binding|INFO|Setting lport ac62ef89-aec4-41c9-83dd-366bdfc1c0bd up in Southbound
Jan 22 00:27:01 compute-0 systemd[1]: Started Virtual Machine qemu-84-instance-000000a6.
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.022 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.029 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[f7813706-ebf9-42bf-b87f-7fc087de5ab4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.056 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b5744b-02d9-484f-b11c-6870ad01d899]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.082 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[61e894ef-823b-4831-b611-99d22ec90a29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 systemd-udevd[239613]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:27:01 compute-0 NetworkManager[55139]: <info>  [1769041621.0892] manager: (tap3b1ad694-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/312)
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.089 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd306ba-ab7b-40b6-b98c-12cb5dd99cd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.126 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[693a5c83-e284-44cb-a0b1-4851964bab78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.132 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[84557e2f-c8a3-4532-82ae-8e2f874ae7ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 NetworkManager[55139]: <info>  [1769041621.1576] device (tap3b1ad694-c0): carrier: link connected
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.163 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d13e9fd7-d7cf-411e-b7fa-08bc9ca145df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.180 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3b543b25-2e42-4026-8052-e4fdee8e45b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3b1ad694-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:3a:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 612389, 'reachable_time': 37723, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239643, 'error': None, 'target': 'ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.197 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0eca3c41-5895-45fa-b517-af10c73b16ff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9c:3a17'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 612389, 'tstamp': 612389}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239644, 'error': None, 'target': 'ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.215 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dc827a6e-7205-4f06-a4fb-6e0e3c5f9174]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3b1ad694-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:3a:17'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 612389, 'reachable_time': 37723, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239645, 'error': None, 'target': 'ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.251 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1628d5ac-78ca-4ef0-ac8d-0b6a513fcc0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.311 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[345a1fbe-f4b9-4121-97b8-00acb1457955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.312 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b1ad694-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.313 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.313 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b1ad694-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.314 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:01 compute-0 NetworkManager[55139]: <info>  [1769041621.3156] manager: (tap3b1ad694-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/313)
Jan 22 00:27:01 compute-0 kernel: tap3b1ad694-c0: entered promiscuous mode
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.318 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.322 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3b1ad694-c0, col_values=(('external_ids', {'iface-id': '9b691818-8ed3-4906-ae03-85650608d26c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:01 compute-0 ovn_controller[95047]: 2026-01-22T00:27:01Z|00644|binding|INFO|Releasing lport 9b691818-8ed3-4906-ae03-85650608d26c from this chassis (sb_readonly=0)
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.324 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.336 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.336 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3b1ad694-cd0e-4047-b840-b090066a26f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3b1ad694-cd0e-4047-b840-b090066a26f4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.337 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ced53f97-24d4-4614-83cf-33f5c15a6f20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.338 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-3b1ad694-cd0e-4047-b840-b090066a26f4
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/3b1ad694-cd0e-4047-b840-b090066a26f4.pid.haproxy
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 3b1ad694-cd0e-4047-b840-b090066a26f4
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:27:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:01.339 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4', 'env', 'PROCESS_TAG=haproxy-3b1ad694-cd0e-4047-b840-b090066a26f4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3b1ad694-cd0e-4047-b840-b090066a26f4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.418 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041621.418023, 53686c08-86df-445a-b433-6a2c7c590fdb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.420 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] VM Started (Lifecycle Event)
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.445 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.449 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041621.4193587, 53686c08-86df-445a-b433-6a2c7c590fdb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.450 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] VM Paused (Lifecycle Event)
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.466 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.470 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:27:01 compute-0 nova_compute[182935]: 2026-01-22 00:27:01.495 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:27:01 compute-0 podman[239684]: 2026-01-22 00:27:01.692886646 +0000 UTC m=+0.050689249 container create b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 00:27:01 compute-0 systemd[1]: Started libpod-conmon-b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba.scope.
Jan 22 00:27:01 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:27:01 compute-0 podman[239684]: 2026-01-22 00:27:01.664766736 +0000 UTC m=+0.022569379 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:27:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1dbbacc019c748b89bbd0f67759edb808eed5a0a9372da6a87b9587aeb619bc2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:27:01 compute-0 podman[239684]: 2026-01-22 00:27:01.776162597 +0000 UTC m=+0.133965240 container init b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 00:27:01 compute-0 podman[239684]: 2026-01-22 00:27:01.781559215 +0000 UTC m=+0.139361818 container start b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:27:01 compute-0 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[239699]: [NOTICE]   (239703) : New worker (239705) forked
Jan 22 00:27:01 compute-0 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[239699]: [NOTICE]   (239703) : Loading success.
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.047 182939 DEBUG nova.network.neutron [req-adc1e65a-2802-496b-ace9-5d5fc5208d9c req-20389400-6022-4cb2-95eb-d20ca066a68a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updated VIF entry in instance network info cache for port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.048 182939 DEBUG nova.network.neutron [req-adc1e65a-2802-496b-ace9-5d5fc5208d9c req-20389400-6022-4cb2-95eb-d20ca066a68a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating instance_info_cache with network_info: [{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.069 182939 DEBUG oslo_concurrency.lockutils [req-adc1e65a-2802-496b-ace9-5d5fc5208d9c req-20389400-6022-4cb2-95eb-d20ca066a68a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.233 182939 DEBUG nova.compute.manager [req-434e9529-bfbb-46f4-9f79-7ac1a9c81e17 req-319a883f-5356-4318-a423-b170d10e362f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.234 182939 DEBUG oslo_concurrency.lockutils [req-434e9529-bfbb-46f4-9f79-7ac1a9c81e17 req-319a883f-5356-4318-a423-b170d10e362f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.234 182939 DEBUG oslo_concurrency.lockutils [req-434e9529-bfbb-46f4-9f79-7ac1a9c81e17 req-319a883f-5356-4318-a423-b170d10e362f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.235 182939 DEBUG oslo_concurrency.lockutils [req-434e9529-bfbb-46f4-9f79-7ac1a9c81e17 req-319a883f-5356-4318-a423-b170d10e362f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.235 182939 DEBUG nova.compute.manager [req-434e9529-bfbb-46f4-9f79-7ac1a9c81e17 req-319a883f-5356-4318-a423-b170d10e362f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Processing event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.236 182939 DEBUG nova.compute.manager [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.241 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041622.240729, 53686c08-86df-445a-b433-6a2c7c590fdb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.241 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] VM Resumed (Lifecycle Event)
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.244 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.247 182939 INFO nova.virt.libvirt.driver [-] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Instance spawned successfully.
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.247 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.263 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.268 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.272 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.273 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.273 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.274 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.274 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.274 182939 DEBUG nova.virt.libvirt.driver [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.303 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.353 182939 INFO nova.compute.manager [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Took 12.29 seconds to spawn the instance on the hypervisor.
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.354 182939 DEBUG nova.compute.manager [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.449 182939 INFO nova.compute.manager [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Took 12.87 seconds to build instance.
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.466 182939 DEBUG oslo_concurrency.lockutils [None req-dd8873a2-0751-4120-9ff8-5ff57bc8608f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:02 compute-0 nova_compute[182935]: 2026-01-22 00:27:02.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:03.223 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:03.224 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:03.225 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:03 compute-0 nova_compute[182935]: 2026-01-22 00:27:03.878 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:04 compute-0 nova_compute[182935]: 2026-01-22 00:27:04.325 182939 DEBUG nova.compute.manager [req-98237414-7775-4d79-a458-2bbb69e1b117 req-f8047044-fa84-45de-9ec1-9552c2738e2b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:04 compute-0 nova_compute[182935]: 2026-01-22 00:27:04.325 182939 DEBUG oslo_concurrency.lockutils [req-98237414-7775-4d79-a458-2bbb69e1b117 req-f8047044-fa84-45de-9ec1-9552c2738e2b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:04 compute-0 nova_compute[182935]: 2026-01-22 00:27:04.325 182939 DEBUG oslo_concurrency.lockutils [req-98237414-7775-4d79-a458-2bbb69e1b117 req-f8047044-fa84-45de-9ec1-9552c2738e2b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:04 compute-0 nova_compute[182935]: 2026-01-22 00:27:04.326 182939 DEBUG oslo_concurrency.lockutils [req-98237414-7775-4d79-a458-2bbb69e1b117 req-f8047044-fa84-45de-9ec1-9552c2738e2b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:04 compute-0 nova_compute[182935]: 2026-01-22 00:27:04.326 182939 DEBUG nova.compute.manager [req-98237414-7775-4d79-a458-2bbb69e1b117 req-f8047044-fa84-45de-9ec1-9552c2738e2b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] No waiting events found dispatching network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:27:04 compute-0 nova_compute[182935]: 2026-01-22 00:27:04.327 182939 WARNING nova.compute.manager [req-98237414-7775-4d79-a458-2bbb69e1b117 req-f8047044-fa84-45de-9ec1-9552c2738e2b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received unexpected event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for instance with vm_state active and task_state None.
Jan 22 00:27:04 compute-0 nova_compute[182935]: 2026-01-22 00:27:04.770 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:05 compute-0 nova_compute[182935]: 2026-01-22 00:27:05.097 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:05 compute-0 NetworkManager[55139]: <info>  [1769041625.0984] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/314)
Jan 22 00:27:05 compute-0 NetworkManager[55139]: <info>  [1769041625.0993] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Jan 22 00:27:05 compute-0 nova_compute[182935]: 2026-01-22 00:27:05.176 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:05 compute-0 ovn_controller[95047]: 2026-01-22T00:27:05Z|00645|binding|INFO|Releasing lport 9b691818-8ed3-4906-ae03-85650608d26c from this chassis (sb_readonly=0)
Jan 22 00:27:05 compute-0 nova_compute[182935]: 2026-01-22 00:27:05.195 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:05 compute-0 podman[239715]: 2026-01-22 00:27:05.710446783 +0000 UTC m=+0.069318070 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:27:06 compute-0 nova_compute[182935]: 2026-01-22 00:27:06.047 182939 DEBUG nova.compute.manager [req-434ebeb0-0161-4ee4-abb6-15f8ede3544b req-b75e28fc-f22b-404f-b0f0-d6dd02256815 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-changed-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:06 compute-0 nova_compute[182935]: 2026-01-22 00:27:06.047 182939 DEBUG nova.compute.manager [req-434ebeb0-0161-4ee4-abb6-15f8ede3544b req-b75e28fc-f22b-404f-b0f0-d6dd02256815 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Refreshing instance network info cache due to event network-changed-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:27:06 compute-0 nova_compute[182935]: 2026-01-22 00:27:06.048 182939 DEBUG oslo_concurrency.lockutils [req-434ebeb0-0161-4ee4-abb6-15f8ede3544b req-b75e28fc-f22b-404f-b0f0-d6dd02256815 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:27:06 compute-0 nova_compute[182935]: 2026-01-22 00:27:06.048 182939 DEBUG oslo_concurrency.lockutils [req-434ebeb0-0161-4ee4-abb6-15f8ede3544b req-b75e28fc-f22b-404f-b0f0-d6dd02256815 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:27:06 compute-0 nova_compute[182935]: 2026-01-22 00:27:06.048 182939 DEBUG nova.network.neutron [req-434ebeb0-0161-4ee4-abb6-15f8ede3544b req-b75e28fc-f22b-404f-b0f0-d6dd02256815 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Refreshing network info cache for port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:27:08 compute-0 nova_compute[182935]: 2026-01-22 00:27:08.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:08 compute-0 nova_compute[182935]: 2026-01-22 00:27:08.880 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:09 compute-0 nova_compute[182935]: 2026-01-22 00:27:09.772 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:11 compute-0 nova_compute[182935]: 2026-01-22 00:27:11.636 182939 DEBUG nova.network.neutron [req-434ebeb0-0161-4ee4-abb6-15f8ede3544b req-b75e28fc-f22b-404f-b0f0-d6dd02256815 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updated VIF entry in instance network info cache for port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:27:11 compute-0 nova_compute[182935]: 2026-01-22 00:27:11.637 182939 DEBUG nova.network.neutron [req-434ebeb0-0161-4ee4-abb6-15f8ede3544b req-b75e28fc-f22b-404f-b0f0-d6dd02256815 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating instance_info_cache with network_info: [{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:27:11 compute-0 nova_compute[182935]: 2026-01-22 00:27:11.697 182939 DEBUG oslo_concurrency.lockutils [req-434ebeb0-0161-4ee4-abb6-15f8ede3544b req-b75e28fc-f22b-404f-b0f0-d6dd02256815 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:27:11 compute-0 podman[239735]: 2026-01-22 00:27:11.709602659 +0000 UTC m=+0.071674546 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9)
Jan 22 00:27:11 compute-0 podman[239736]: 2026-01-22 00:27:11.735545996 +0000 UTC m=+0.082497614 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:27:13 compute-0 nova_compute[182935]: 2026-01-22 00:27:13.882 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:14 compute-0 ovn_controller[95047]: 2026-01-22T00:27:14Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:20:cd 10.100.0.10
Jan 22 00:27:14 compute-0 ovn_controller[95047]: 2026-01-22T00:27:14Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:20:cd 10.100.0.10
Jan 22 00:27:14 compute-0 nova_compute[182935]: 2026-01-22 00:27:14.775 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:18 compute-0 nova_compute[182935]: 2026-01-22 00:27:18.885 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:19 compute-0 nova_compute[182935]: 2026-01-22 00:27:19.777 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:20 compute-0 nova_compute[182935]: 2026-01-22 00:27:20.960 182939 INFO nova.compute.manager [None req-f2e4a5c4-c400-4510-8616-53a61d144eef 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Get console output
Jan 22 00:27:20 compute-0 nova_compute[182935]: 2026-01-22 00:27:20.966 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:27:23 compute-0 nova_compute[182935]: 2026-01-22 00:27:23.914 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:24 compute-0 nova_compute[182935]: 2026-01-22 00:27:24.779 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:26 compute-0 nova_compute[182935]: 2026-01-22 00:27:26.520 182939 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:27:26 compute-0 nova_compute[182935]: 2026-01-22 00:27:26.520 182939 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:27:26 compute-0 nova_compute[182935]: 2026-01-22 00:27:26.520 182939 DEBUG nova.network.neutron [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:27:28 compute-0 podman[239793]: 2026-01-22 00:27:28.721607964 +0000 UTC m=+0.083020046 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:27:28 compute-0 podman[239791]: 2026-01-22 00:27:28.734244315 +0000 UTC m=+0.099828497 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:27:28 compute-0 podman[239792]: 2026-01-22 00:27:28.743917875 +0000 UTC m=+0.104218912 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:27:28 compute-0 nova_compute[182935]: 2026-01-22 00:27:28.917 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:29 compute-0 sshd-session[239867]: Invalid user mongodb from 188.166.69.60 port 33956
Jan 22 00:27:29 compute-0 sshd-session[239867]: Connection closed by invalid user mongodb 188.166.69.60 port 33956 [preauth]
Jan 22 00:27:29 compute-0 nova_compute[182935]: 2026-01-22 00:27:29.783 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:30 compute-0 nova_compute[182935]: 2026-01-22 00:27:30.674 182939 DEBUG nova.network.neutron [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating instance_info_cache with network_info: [{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:27:30 compute-0 nova_compute[182935]: 2026-01-22 00:27:30.695 182939 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:27:30 compute-0 nova_compute[182935]: 2026-01-22 00:27:30.927 182939 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 22 00:27:30 compute-0 nova_compute[182935]: 2026-01-22 00:27:30.928 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Creating file /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/406dc5f0d04446658c50861cc06a8a23.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 22 00:27:30 compute-0 nova_compute[182935]: 2026-01-22 00:27:30.928 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/406dc5f0d04446658c50861cc06a8a23.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:31 compute-0 nova_compute[182935]: 2026-01-22 00:27:31.519 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/406dc5f0d04446658c50861cc06a8a23.tmp" returned: 1 in 0.591s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:31 compute-0 nova_compute[182935]: 2026-01-22 00:27:31.520 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/406dc5f0d04446658c50861cc06a8a23.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 22 00:27:31 compute-0 nova_compute[182935]: 2026-01-22 00:27:31.521 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Creating directory /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 22 00:27:31 compute-0 nova_compute[182935]: 2026-01-22 00:27:31.521 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:31 compute-0 nova_compute[182935]: 2026-01-22 00:27:31.752 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb" returned: 0 in 0.231s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:31 compute-0 nova_compute[182935]: 2026-01-22 00:27:31.757 182939 DEBUG nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:27:33 compute-0 nova_compute[182935]: 2026-01-22 00:27:33.920 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:33 compute-0 kernel: tapac62ef89-ae (unregistering): left promiscuous mode
Jan 22 00:27:33 compute-0 NetworkManager[55139]: <info>  [1769041653.9801] device (tapac62ef89-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:27:33 compute-0 ovn_controller[95047]: 2026-01-22T00:27:33Z|00646|binding|INFO|Releasing lport ac62ef89-aec4-41c9-83dd-366bdfc1c0bd from this chassis (sb_readonly=0)
Jan 22 00:27:33 compute-0 ovn_controller[95047]: 2026-01-22T00:27:33Z|00647|binding|INFO|Setting lport ac62ef89-aec4-41c9-83dd-366bdfc1c0bd down in Southbound
Jan 22 00:27:33 compute-0 nova_compute[182935]: 2026-01-22 00:27:33.992 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:33 compute-0 ovn_controller[95047]: 2026-01-22T00:27:33Z|00648|binding|INFO|Removing iface tapac62ef89-ae ovn-installed in OVS
Jan 22 00:27:33 compute-0 nova_compute[182935]: 2026-01-22 00:27:33.994 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.008 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:34 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000a6.scope: Deactivated successfully.
Jan 22 00:27:34 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000a6.scope: Consumed 13.859s CPU time.
Jan 22 00:27:34 compute-0 systemd-machined[154182]: Machine qemu-84-instance-000000a6 terminated.
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.213 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.218 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.366 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:20:cd 10.100.0.10'], port_security=['fa:16:3e:5b:20:cd 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '53686c08-86df-445a-b433-6a2c7c590fdb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3b1ad694-cd0e-4047-b840-b090066a26f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ddb2905c-b7d9-4e7e-b5f3-61f1bd651115', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07b94cbf-fe21-425f-b9b0-192a8a6fba61, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.367 104408 INFO neutron.agent.ovn.metadata.agent [-] Port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd in datapath 3b1ad694-cd0e-4047-b840-b090066a26f4 unbound from our chassis
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.368 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3b1ad694-cd0e-4047-b840-b090066a26f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.370 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[adcf335b-4c11-4981-9a2c-0b4d6da1b81e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.370 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4 namespace which is not needed anymore
Jan 22 00:27:34 compute-0 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[239699]: [NOTICE]   (239703) : haproxy version is 2.8.14-c23fe91
Jan 22 00:27:34 compute-0 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[239699]: [NOTICE]   (239703) : path to executable is /usr/sbin/haproxy
Jan 22 00:27:34 compute-0 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[239699]: [WARNING]  (239703) : Exiting Master process...
Jan 22 00:27:34 compute-0 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[239699]: [ALERT]    (239703) : Current worker (239705) exited with code 143 (Terminated)
Jan 22 00:27:34 compute-0 neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4[239699]: [WARNING]  (239703) : All workers exited. Exiting... (0)
Jan 22 00:27:34 compute-0 systemd[1]: libpod-b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba.scope: Deactivated successfully.
Jan 22 00:27:34 compute-0 podman[239914]: 2026-01-22 00:27:34.523564188 +0000 UTC m=+0.049239973 container died b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:27:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba-userdata-shm.mount: Deactivated successfully.
Jan 22 00:27:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-1dbbacc019c748b89bbd0f67759edb808eed5a0a9372da6a87b9587aeb619bc2-merged.mount: Deactivated successfully.
Jan 22 00:27:34 compute-0 podman[239914]: 2026-01-22 00:27:34.561413648 +0000 UTC m=+0.087089433 container cleanup b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:27:34 compute-0 systemd[1]: libpod-conmon-b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba.scope: Deactivated successfully.
Jan 22 00:27:34 compute-0 podman[239944]: 2026-01-22 00:27:34.621712913 +0000 UTC m=+0.036518840 container remove b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.626 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[70657be9-5df9-4d19-9104-c42b820ea508]: (4, ('Thu Jan 22 12:27:34 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4 (b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba)\nb7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba\nThu Jan 22 12:27:34 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4 (b7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba)\nb7f193fc938f31d85ee90d341a9e28ea10299388b32df8f4d162b2957c7283ba\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.628 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2875b1f1-fbab-4f0f-9ce4-1fc1ef367185]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.629 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b1ad694-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.631 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:34 compute-0 kernel: tap3b1ad694-c0: left promiscuous mode
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.645 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.646 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.647 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3c394cd2-7a94-4347-b5c9-b1f013abb15d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.669 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[af7bd46a-2e99-4402-8c70-a956e1fa72c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.670 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3d19bb34-0c05-494e-815a-e81525aaac17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.687 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[86fa2920-8edf-4b53-8358-98fbf80f38a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 612381, 'reachable_time': 30256, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239963, 'error': None, 'target': 'ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.688 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3b1ad694-cd0e-4047-b840-b090066a26f4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:27:34 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:34.688 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[14649002-2a31-4710-a331-e9fdfa240dc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:27:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d3b1ad694\x2dcd0e\x2d4047\x2db840\x2db090066a26f4.mount: Deactivated successfully.
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.773 182939 INFO nova.virt.libvirt.driver [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Instance shutdown successfully after 3 seconds.
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.779 182939 INFO nova.virt.libvirt.driver [-] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Instance destroyed successfully.
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.780 182939 DEBUG nova.virt.libvirt.vif [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:26:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1419654419',display_name='tempest-TestNetworkAdvancedServerOps-server-1419654419',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1419654419',id=166,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5yIuZRAnyr0fY4MG0JJtrl2YmC7LkxFjTLIDSY0MneCjEwMPb+R0i/C3i76549W+tX7/jAJYcJ/Zhm6OZjTa9donlIVfIM40NClFZ/uOy/0cpEFxgyZR4Q96O10ulXCA==',key_name='tempest-TestNetworkAdvancedServerOps-623011376',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:27:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1itpmhcp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:27:26Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=53686c08-86df-445a-b433-6a2c7c590fdb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1985720748", "vif_mac": "fa:16:3e:5b:20:cd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.781 182939 DEBUG nova.network.os_vif_util [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1985720748", "vif_mac": "fa:16:3e:5b:20:cd"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.781 182939 DEBUG nova.network.os_vif_util [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.782 182939 DEBUG os_vif [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.784 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.785 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac62ef89-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.787 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.790 182939 INFO os_vif [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae')
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.794 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.859 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.860 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.919 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.921 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Copying file /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb_resize/disk to 192.168.122.101:/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:27:34 compute-0 nova_compute[182935]: 2026-01-22 00:27:34.921 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb_resize/disk 192.168.122.101:/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.500 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "scp -r /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb_resize/disk 192.168.122.101:/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk" returned: 0 in 0.579s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.501 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Copying file /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.501 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb_resize/disk.config 192.168.122.101:/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.752 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "scp -C -r /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb_resize/disk.config 192.168.122.101:/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.config" returned: 0 in 0.251s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.754 182939 DEBUG nova.virt.libvirt.volume.remotefs [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Copying file /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.754 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb_resize/disk.info 192.168.122.101:/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.974 182939 DEBUG nova.compute.manager [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-unplugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.974 182939 DEBUG oslo_concurrency.lockutils [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.975 182939 DEBUG oslo_concurrency.lockutils [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.975 182939 DEBUG oslo_concurrency.lockutils [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.975 182939 DEBUG nova.compute.manager [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] No waiting events found dispatching network-vif-unplugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.975 182939 WARNING nova.compute.manager [req-3a47cc33-4bbd-45d3-8f44-43fb487e322e req-08a689e5-329a-4756-a08e-6578927d611b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received unexpected event network-vif-unplugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for instance with vm_state active and task_state resize_migrating.
Jan 22 00:27:35 compute-0 nova_compute[182935]: 2026-01-22 00:27:35.988 182939 DEBUG oslo_concurrency.processutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "scp -C -r /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb_resize/disk.info 192.168.122.101:/var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk.info" returned: 0 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:27:36 compute-0 nova_compute[182935]: 2026-01-22 00:27:36.622 182939 DEBUG neutronclient.v2_0.client [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 00:27:36 compute-0 podman[239976]: 2026-01-22 00:27:36.676699168 +0000 UTC m=+0.053651439 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:27:36 compute-0 nova_compute[182935]: 2026-01-22 00:27:36.799 182939 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:36 compute-0 nova_compute[182935]: 2026-01-22 00:27:36.799 182939 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:36 compute-0 nova_compute[182935]: 2026-01-22 00:27:36.800 182939 DEBUG oslo_concurrency.lockutils [None req-d64d90c4-baf8-4f55-b56a-f30899272d45 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:37.233 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:27:37 compute-0 nova_compute[182935]: 2026-01-22 00:27:37.234 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:37.234 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:27:38 compute-0 nova_compute[182935]: 2026-01-22 00:27:38.155 182939 DEBUG nova.compute.manager [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:38 compute-0 nova_compute[182935]: 2026-01-22 00:27:38.155 182939 DEBUG oslo_concurrency.lockutils [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:38 compute-0 nova_compute[182935]: 2026-01-22 00:27:38.155 182939 DEBUG oslo_concurrency.lockutils [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:38 compute-0 nova_compute[182935]: 2026-01-22 00:27:38.156 182939 DEBUG oslo_concurrency.lockutils [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:38 compute-0 nova_compute[182935]: 2026-01-22 00:27:38.156 182939 DEBUG nova.compute.manager [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] No waiting events found dispatching network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:27:38 compute-0 nova_compute[182935]: 2026-01-22 00:27:38.156 182939 WARNING nova.compute.manager [req-e42ec2be-7388-433f-8df1-40167c5bf885 req-62dd5fe9-aa97-462e-a31b-193d36af9092 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received unexpected event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for instance with vm_state active and task_state resize_migrated.
Jan 22 00:27:38 compute-0 nova_compute[182935]: 2026-01-22 00:27:38.966 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:39 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:27:39.238 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:39 compute-0 nova_compute[182935]: 2026-01-22 00:27:39.787 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:40 compute-0 nova_compute[182935]: 2026-01-22 00:27:40.393 182939 DEBUG nova.compute.manager [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-changed-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:40 compute-0 nova_compute[182935]: 2026-01-22 00:27:40.394 182939 DEBUG nova.compute.manager [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Refreshing instance network info cache due to event network-changed-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:27:40 compute-0 nova_compute[182935]: 2026-01-22 00:27:40.395 182939 DEBUG oslo_concurrency.lockutils [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:27:40 compute-0 nova_compute[182935]: 2026-01-22 00:27:40.395 182939 DEBUG oslo_concurrency.lockutils [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:27:40 compute-0 nova_compute[182935]: 2026-01-22 00:27:40.396 182939 DEBUG nova.network.neutron [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Refreshing network info cache for port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:27:42 compute-0 podman[239994]: 2026-01-22 00:27:42.681581698 +0000 UTC m=+0.057018527 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:27:42 compute-0 podman[239993]: 2026-01-22 00:27:42.695787806 +0000 UTC m=+0.072851514 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Jan 22 00:27:43 compute-0 nova_compute[182935]: 2026-01-22 00:27:43.029 182939 DEBUG nova.network.neutron [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updated VIF entry in instance network info cache for port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:27:43 compute-0 nova_compute[182935]: 2026-01-22 00:27:43.030 182939 DEBUG nova.network.neutron [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating instance_info_cache with network_info: [{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:27:43 compute-0 nova_compute[182935]: 2026-01-22 00:27:43.301 182939 DEBUG oslo_concurrency.lockutils [req-eaffd75f-0ed7-4dbd-a946-dc60e5a0e634 req-fe340fb6-e67a-4e24-afbb-7893f533be27 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:27:43 compute-0 nova_compute[182935]: 2026-01-22 00:27:43.968 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:44 compute-0 nova_compute[182935]: 2026-01-22 00:27:44.788 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:44 compute-0 nova_compute[182935]: 2026-01-22 00:27:44.879 182939 DEBUG nova.compute.manager [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:44 compute-0 nova_compute[182935]: 2026-01-22 00:27:44.880 182939 DEBUG oslo_concurrency.lockutils [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:44 compute-0 nova_compute[182935]: 2026-01-22 00:27:44.880 182939 DEBUG oslo_concurrency.lockutils [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:44 compute-0 nova_compute[182935]: 2026-01-22 00:27:44.881 182939 DEBUG oslo_concurrency.lockutils [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:44 compute-0 nova_compute[182935]: 2026-01-22 00:27:44.881 182939 DEBUG nova.compute.manager [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] No waiting events found dispatching network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:27:44 compute-0 nova_compute[182935]: 2026-01-22 00:27:44.881 182939 WARNING nova.compute.manager [req-c7d24026-805c-4dbb-9e54-0e8a9d67218e req-f5d575d5-b938-4c70-b142-d2c3582f4ff5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received unexpected event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for instance with vm_state active and task_state resize_finish.
Jan 22 00:27:48 compute-0 nova_compute[182935]: 2026-01-22 00:27:48.200 182939 DEBUG nova.compute.manager [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:27:48 compute-0 nova_compute[182935]: 2026-01-22 00:27:48.201 182939 DEBUG oslo_concurrency.lockutils [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:48 compute-0 nova_compute[182935]: 2026-01-22 00:27:48.201 182939 DEBUG oslo_concurrency.lockutils [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:48 compute-0 nova_compute[182935]: 2026-01-22 00:27:48.201 182939 DEBUG oslo_concurrency.lockutils [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:48 compute-0 nova_compute[182935]: 2026-01-22 00:27:48.202 182939 DEBUG nova.compute.manager [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] No waiting events found dispatching network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:27:48 compute-0 nova_compute[182935]: 2026-01-22 00:27:48.202 182939 WARNING nova.compute.manager [req-99b05eba-a415-475d-8e88-7be7a73275b8 req-c77cb988-bf89-444e-9bd0-9c34a43c8994 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Received unexpected event network-vif-plugged-ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for instance with vm_state resized and task_state None.
Jan 22 00:27:48 compute-0 nova_compute[182935]: 2026-01-22 00:27:48.971 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.260 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041654.2596152, 53686c08-86df-445a-b433-6a2c7c590fdb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.261 182939 INFO nova.compute.manager [-] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] VM Stopped (Lifecycle Event)
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.749 182939 DEBUG nova.compute.manager [None req-2456304c-97ac-47e7-b329-fd0d4aec496d - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.754 182939 DEBUG nova.compute.manager [None req-2456304c-97ac-47e7-b329-fd0d4aec496d - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.791 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.899 182939 INFO nova.compute.manager [None req-2456304c-97ac-47e7-b329-fd0d4aec496d - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.904 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.904 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.905 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.905 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.915 182939 DEBUG oslo_concurrency.lockutils [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "53686c08-86df-445a-b433-6a2c7c590fdb" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.916 182939 DEBUG oslo_concurrency.lockutils [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.916 182939 DEBUG nova.compute.manager [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Going to confirm migration 20 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Jan 22 00:27:49 compute-0 nova_compute[182935]: 2026-01-22 00:27:49.980 182939 DEBUG nova.objects.instance [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'info_cache' on Instance uuid 53686c08-86df-445a-b433-6a2c7c590fdb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.018 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-000000a6, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/53686c08-86df-445a-b433-6a2c7c590fdb/disk
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.161 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.162 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5685MB free_disk=73.09823608398438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.162 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.163 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.233 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Migration for instance 53686c08-86df-445a-b433-6a2c7c590fdb refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.333 182939 INFO nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating resource usage from migration f701be3c-7bb1-4932-accc-5d8672c233bc
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.333 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Starting to track outgoing migration f701be3c-7bb1-4932-accc-5d8672c233bc with flavor c3389c03-89c4-4ff5-9e03-1a99d41713d4 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.377 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Migration f701be3c-7bb1-4932-accc-5d8672c233bc is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.377 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.378 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:27:50 compute-0 nova_compute[182935]: 2026-01-22 00:27:50.442 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:27:51 compute-0 nova_compute[182935]: 2026-01-22 00:27:51.017 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:27:51 compute-0 nova_compute[182935]: 2026-01-22 00:27:51.089 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:27:51 compute-0 nova_compute[182935]: 2026-01-22 00:27:51.090 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:51 compute-0 nova_compute[182935]: 2026-01-22 00:27:51.656 182939 DEBUG neutronclient.v2_0.client [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port ac62ef89-aec4-41c9-83dd-366bdfc1c0bd for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 22 00:27:51 compute-0 nova_compute[182935]: 2026-01-22 00:27:51.656 182939 DEBUG oslo_concurrency.lockutils [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:27:51 compute-0 nova_compute[182935]: 2026-01-22 00:27:51.657 182939 DEBUG oslo_concurrency.lockutils [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:27:51 compute-0 nova_compute[182935]: 2026-01-22 00:27:51.657 182939 DEBUG nova.network.neutron [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:27:53 compute-0 nova_compute[182935]: 2026-01-22 00:27:53.973 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:54 compute-0 nova_compute[182935]: 2026-01-22 00:27:54.091 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:54 compute-0 nova_compute[182935]: 2026-01-22 00:27:54.092 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:27:54 compute-0 nova_compute[182935]: 2026-01-22 00:27:54.689 182939 DEBUG nova.network.neutron [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: 53686c08-86df-445a-b433-6a2c7c590fdb] Updating instance_info_cache with network_info: [{"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:27:54 compute-0 nova_compute[182935]: 2026-01-22 00:27:54.793 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:54 compute-0 nova_compute[182935]: 2026-01-22 00:27:54.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:54 compute-0 nova_compute[182935]: 2026-01-22 00:27:54.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:27:54 compute-0 nova_compute[182935]: 2026-01-22 00:27:54.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:27:55 compute-0 nova_compute[182935]: 2026-01-22 00:27:55.317 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:27:55 compute-0 nova_compute[182935]: 2026-01-22 00:27:55.318 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:55 compute-0 nova_compute[182935]: 2026-01-22 00:27:55.880 182939 DEBUG oslo_concurrency.lockutils [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-53686c08-86df-445a-b433-6a2c7c590fdb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:27:55 compute-0 nova_compute[182935]: 2026-01-22 00:27:55.881 182939 DEBUG nova.objects.instance [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid 53686c08-86df-445a-b433-6a2c7c590fdb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.003 182939 DEBUG nova.virt.libvirt.vif [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:26:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1419654419',display_name='tempest-TestNetworkAdvancedServerOps-server-1419654419',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1419654419',id=166,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ5yIuZRAnyr0fY4MG0JJtrl2YmC7LkxFjTLIDSY0MneCjEwMPb+R0i/C3i76549W+tX7/jAJYcJ/Zhm6OZjTa9donlIVfIM40NClFZ/uOy/0cpEFxgyZR4Q96O10ulXCA==',key_name='tempest-TestNetworkAdvancedServerOps-623011376',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:27:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-1itpmhcp',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:27:45Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=53686c08-86df-445a-b433-6a2c7c590fdb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.004 182939 DEBUG nova.network.os_vif_util [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "address": "fa:16:3e:5b:20:cd", "network": {"id": "3b1ad694-cd0e-4047-b840-b090066a26f4", "bridge": "br-int", "label": "tempest-network-smoke--1985720748", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac62ef89-ae", "ovs_interfaceid": "ac62ef89-aec4-41c9-83dd-366bdfc1c0bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.005 182939 DEBUG nova.network.os_vif_util [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.005 182939 DEBUG os_vif [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.006 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.006 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac62ef89-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.007 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.009 182939 INFO os_vif [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:20:cd,bridge_name='br-int',has_traffic_filtering=True,id=ac62ef89-aec4-41c9-83dd-366bdfc1c0bd,network=Network(3b1ad694-cd0e-4047-b840-b090066a26f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac62ef89-ae')
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.009 182939 DEBUG oslo_concurrency.lockutils [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.009 182939 DEBUG oslo_concurrency.lockutils [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.166 182939 DEBUG nova.compute.provider_tree [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.243 182939 DEBUG nova.scheduler.client.report [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.337 182939 DEBUG oslo_concurrency.lockutils [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.328s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.571 182939 INFO nova.scheduler.client.report [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Deleted allocation for migration f701be3c-7bb1-4932-accc-5d8672c233bc
Jan 22 00:27:56 compute-0 nova_compute[182935]: 2026-01-22 00:27:56.708 182939 DEBUG oslo_concurrency.lockutils [None req-442cbd4b-423c-4ded-8743-62406c6ea748 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "53686c08-86df-445a-b433-6a2c7c590fdb" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 6.792s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:27:58 compute-0 nova_compute[182935]: 2026-01-22 00:27:58.975 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:27:59 compute-0 podman[240030]: 2026-01-22 00:27:59.694303781 +0000 UTC m=+0.055853761 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:27:59 compute-0 podman[240037]: 2026-01-22 00:27:59.71611545 +0000 UTC m=+0.064008104 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:27:59 compute-0 podman[240031]: 2026-01-22 00:27:59.725163835 +0000 UTC m=+0.081227674 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:27:59 compute-0 nova_compute[182935]: 2026-01-22 00:27:59.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:27:59 compute-0 nova_compute[182935]: 2026-01-22 00:27:59.794 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:01 compute-0 nova_compute[182935]: 2026-01-22 00:28:01.821 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:01 compute-0 nova_compute[182935]: 2026-01-22 00:28:01.821 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:02 compute-0 nova_compute[182935]: 2026-01-22 00:28:02.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:03.224 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:03.225 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:03.225 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:03 compute-0 nova_compute[182935]: 2026-01-22 00:28:03.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:04 compute-0 nova_compute[182935]: 2026-01-22 00:28:04.014 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:04 compute-0 nova_compute[182935]: 2026-01-22 00:28:04.064 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:04 compute-0 nova_compute[182935]: 2026-01-22 00:28:04.795 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:07 compute-0 podman[240100]: 2026-01-22 00:28:07.670737261 +0000 UTC m=+0.047010570 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 22 00:28:08 compute-0 nova_compute[182935]: 2026-01-22 00:28:08.817 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:09 compute-0 nova_compute[182935]: 2026-01-22 00:28:09.016 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:09 compute-0 nova_compute[182935]: 2026-01-22 00:28:09.798 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:10 compute-0 nova_compute[182935]: 2026-01-22 00:28:10.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:11 compute-0 sshd-session[240120]: Invalid user mongodb from 188.166.69.60 port 48508
Jan 22 00:28:11 compute-0 sshd-session[240120]: Connection closed by invalid user mongodb 188.166.69.60 port 48508 [preauth]
Jan 22 00:28:11 compute-0 nova_compute[182935]: 2026-01-22 00:28:11.699 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:11 compute-0 nova_compute[182935]: 2026-01-22 00:28:11.845 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:13 compute-0 podman[240124]: 2026-01-22 00:28:13.689191844 +0000 UTC m=+0.057599472 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 22 00:28:13 compute-0 podman[240123]: 2026-01-22 00:28:13.690420663 +0000 UTC m=+0.059648990 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:28:14 compute-0 nova_compute[182935]: 2026-01-22 00:28:14.080 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:14 compute-0 nova_compute[182935]: 2026-01-22 00:28:14.800 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:16 compute-0 nova_compute[182935]: 2026-01-22 00:28:16.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:16 compute-0 nova_compute[182935]: 2026-01-22 00:28:16.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:28:19 compute-0 nova_compute[182935]: 2026-01-22 00:28:19.082 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:19 compute-0 nova_compute[182935]: 2026-01-22 00:28:19.802 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:28:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:28:24 compute-0 nova_compute[182935]: 2026-01-22 00:28:24.084 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:24 compute-0 nova_compute[182935]: 2026-01-22 00:28:24.804 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:29 compute-0 nova_compute[182935]: 2026-01-22 00:28:29.084 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:29 compute-0 nova_compute[182935]: 2026-01-22 00:28:29.806 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:30 compute-0 podman[240164]: 2026-01-22 00:28:30.680625879 +0000 UTC m=+0.055093262 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:28:30 compute-0 podman[240166]: 2026-01-22 00:28:30.688685461 +0000 UTC m=+0.052708196 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:28:30 compute-0 podman[240165]: 2026-01-22 00:28:30.723592251 +0000 UTC m=+0.090453254 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:28:34 compute-0 nova_compute[182935]: 2026-01-22 00:28:34.124 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:34 compute-0 nova_compute[182935]: 2026-01-22 00:28:34.808 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:34 compute-0 nova_compute[182935]: 2026-01-22 00:28:34.835 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:34 compute-0 nova_compute[182935]: 2026-01-22 00:28:34.836 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:28:34 compute-0 nova_compute[182935]: 2026-01-22 00:28:34.866 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:28:37 compute-0 nova_compute[182935]: 2026-01-22 00:28:37.966 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:37 compute-0 nova_compute[182935]: 2026-01-22 00:28:37.967 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:38 compute-0 nova_compute[182935]: 2026-01-22 00:28:38.090 182939 DEBUG nova.compute.manager [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:28:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:38.255 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:28:38 compute-0 nova_compute[182935]: 2026-01-22 00:28:38.255 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:38.256 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:28:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:38.256 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:28:38 compute-0 nova_compute[182935]: 2026-01-22 00:28:38.371 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:38 compute-0 nova_compute[182935]: 2026-01-22 00:28:38.372 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:38 compute-0 nova_compute[182935]: 2026-01-22 00:28:38.379 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:28:38 compute-0 nova_compute[182935]: 2026-01-22 00:28:38.380 182939 INFO nova.compute.claims [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:28:38 compute-0 podman[240235]: 2026-01-22 00:28:38.675647281 +0000 UTC m=+0.050052072 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:28:38 compute-0 nova_compute[182935]: 2026-01-22 00:28:38.890 182939 DEBUG nova.compute.provider_tree [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.037 182939 DEBUG nova.scheduler.client.report [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.158 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.232 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.861s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.233 182939 DEBUG nova.compute.manager [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.463 182939 DEBUG nova.compute.manager [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.465 182939 DEBUG nova.network.neutron [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.513 182939 INFO nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.559 182939 DEBUG nova.compute.manager [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.766 182939 DEBUG nova.compute.manager [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.769 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.770 182939 INFO nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Creating image(s)
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.770 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.771 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.772 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.790 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.818 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.874 182939 DEBUG nova.policy [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.878 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.878 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.879 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.890 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.959 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:28:39 compute-0 nova_compute[182935]: 2026-01-22 00:28:39.960 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.003 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.004 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.005 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.075 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.076 182939 DEBUG nova.virt.disk.api [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.077 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.140 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.142 182939 DEBUG nova.virt.disk.api [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.143 182939 DEBUG nova.objects.instance [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.172 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.172 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Ensure instance console log exists: /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.173 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.173 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:40 compute-0 nova_compute[182935]: 2026-01-22 00:28:40.173 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:41 compute-0 nova_compute[182935]: 2026-01-22 00:28:41.996 182939 DEBUG nova.network.neutron [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Successfully created port: 06e8eaa8-d435-4dcd-af5c-959e8d49754d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:28:43 compute-0 nova_compute[182935]: 2026-01-22 00:28:43.616 182939 DEBUG nova.network.neutron [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Successfully updated port: 06e8eaa8-d435-4dcd-af5c-959e8d49754d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:28:43 compute-0 nova_compute[182935]: 2026-01-22 00:28:43.631 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:28:43 compute-0 nova_compute[182935]: 2026-01-22 00:28:43.632 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:28:43 compute-0 nova_compute[182935]: 2026-01-22 00:28:43.632 182939 DEBUG nova.network.neutron [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:28:43 compute-0 nova_compute[182935]: 2026-01-22 00:28:43.856 182939 DEBUG nova.compute.manager [req-7047c21c-83ea-41c4-ae5b-f0df485e2d7b req-6f7ebce6-f1a9-40f7-a377-c8999ecfaee6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-changed-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:28:43 compute-0 nova_compute[182935]: 2026-01-22 00:28:43.856 182939 DEBUG nova.compute.manager [req-7047c21c-83ea-41c4-ae5b-f0df485e2d7b req-6f7ebce6-f1a9-40f7-a377-c8999ecfaee6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Refreshing instance network info cache due to event network-changed-06e8eaa8-d435-4dcd-af5c-959e8d49754d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:28:43 compute-0 nova_compute[182935]: 2026-01-22 00:28:43.856 182939 DEBUG oslo_concurrency.lockutils [req-7047c21c-83ea-41c4-ae5b-f0df485e2d7b req-6f7ebce6-f1a9-40f7-a377-c8999ecfaee6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:28:43 compute-0 nova_compute[182935]: 2026-01-22 00:28:43.988 182939 DEBUG nova.network.neutron [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:28:44 compute-0 nova_compute[182935]: 2026-01-22 00:28:44.160 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:44 compute-0 podman[240269]: 2026-01-22 00:28:44.674069967 +0000 UTC m=+0.052905554 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Jan 22 00:28:44 compute-0 podman[240270]: 2026-01-22 00:28:44.703691809 +0000 UTC m=+0.078572681 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:28:44 compute-0 nova_compute[182935]: 2026-01-22 00:28:44.820 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.356 182939 DEBUG nova.network.neutron [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Updating instance_info_cache with network_info: [{"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.384 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.385 182939 DEBUG nova.compute.manager [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Instance network_info: |[{"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.385 182939 DEBUG oslo_concurrency.lockutils [req-7047c21c-83ea-41c4-ae5b-f0df485e2d7b req-6f7ebce6-f1a9-40f7-a377-c8999ecfaee6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.386 182939 DEBUG nova.network.neutron [req-7047c21c-83ea-41c4-ae5b-f0df485e2d7b req-6f7ebce6-f1a9-40f7-a377-c8999ecfaee6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Refreshing network info cache for port 06e8eaa8-d435-4dcd-af5c-959e8d49754d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.389 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Start _get_guest_xml network_info=[{"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.395 182939 WARNING nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.400 182939 DEBUG nova.virt.libvirt.host [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.401 182939 DEBUG nova.virt.libvirt.host [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.405 182939 DEBUG nova.virt.libvirt.host [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.406 182939 DEBUG nova.virt.libvirt.host [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.408 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.409 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.410 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.410 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.410 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.411 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.411 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.412 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.412 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.413 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.413 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.414 182939 DEBUG nova.virt.hardware [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.422 182939 DEBUG nova.virt.libvirt.vif [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:28:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-380645357',display_name='tempest-TestNetworkAdvancedServerOps-server-380645357',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-380645357',id=167,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH7XBLZCDFN1SjQd5pdAiqzITq+BHor59G0Z0hkCJ16rGPV5ckUS2yxojxpRQnEFr5g70UZgNl62OwdVVuKhMenMnTtj7l+/b1lrTkAtU0xv67CvUH7GWdEje0B1GYX0Bg==',key_name='tempest-TestNetworkAdvancedServerOps-346227063',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-4t40upso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:28:39Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=c4b2800e-1798-4a1f-b4a2-e870e907eb2a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.423 182939 DEBUG nova.network.os_vif_util [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.424 182939 DEBUG nova.network.os_vif_util [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.427 182939 DEBUG nova.objects.instance [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.450 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:28:45 compute-0 nova_compute[182935]:   <uuid>c4b2800e-1798-4a1f-b4a2-e870e907eb2a</uuid>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   <name>instance-000000a7</name>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-380645357</nova:name>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:28:45</nova:creationTime>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:28:45 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:28:45 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:28:45 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:28:45 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:28:45 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:28:45 compute-0 nova_compute[182935]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:28:45 compute-0 nova_compute[182935]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:28:45 compute-0 nova_compute[182935]:         <nova:port uuid="06e8eaa8-d435-4dcd-af5c-959e8d49754d">
Jan 22 00:28:45 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <system>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <entry name="serial">c4b2800e-1798-4a1f-b4a2-e870e907eb2a</entry>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <entry name="uuid">c4b2800e-1798-4a1f-b4a2-e870e907eb2a</entry>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     </system>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   <os>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   </os>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   <features>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   </features>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.config"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:d3:d1:68"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <target dev="tap06e8eaa8-d4"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/console.log" append="off"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <video>
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     </video>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:28:45 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:28:45 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:28:45 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:28:45 compute-0 nova_compute[182935]: </domain>
Jan 22 00:28:45 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.452 182939 DEBUG nova.compute.manager [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Preparing to wait for external event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.452 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.452 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.453 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.453 182939 DEBUG nova.virt.libvirt.vif [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:28:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-380645357',display_name='tempest-TestNetworkAdvancedServerOps-server-380645357',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-380645357',id=167,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH7XBLZCDFN1SjQd5pdAiqzITq+BHor59G0Z0hkCJ16rGPV5ckUS2yxojxpRQnEFr5g70UZgNl62OwdVVuKhMenMnTtj7l+/b1lrTkAtU0xv67CvUH7GWdEje0B1GYX0Bg==',key_name='tempest-TestNetworkAdvancedServerOps-346227063',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-4t40upso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:28:39Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=c4b2800e-1798-4a1f-b4a2-e870e907eb2a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.454 182939 DEBUG nova.network.os_vif_util [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.455 182939 DEBUG nova.network.os_vif_util [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.455 182939 DEBUG os_vif [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.456 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.456 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.457 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.459 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.460 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e8eaa8-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.460 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06e8eaa8-d4, col_values=(('external_ids', {'iface-id': '06e8eaa8-d435-4dcd-af5c-959e8d49754d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:d1:68', 'vm-uuid': 'c4b2800e-1798-4a1f-b4a2-e870e907eb2a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.462 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:45 compute-0 NetworkManager[55139]: <info>  [1769041725.4631] manager: (tap06e8eaa8-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.465 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.467 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.468 182939 INFO os_vif [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4')
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.526 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.526 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.527 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No VIF found with MAC fa:16:3e:d3:d1:68, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:28:45 compute-0 nova_compute[182935]: 2026-01-22 00:28:45.528 182939 INFO nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Using config drive
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.315 182939 INFO nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Creating config drive at /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.config
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.320 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkyp8nuzl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.447 182939 DEBUG oslo_concurrency.processutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkyp8nuzl" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:28:46 compute-0 kernel: tap06e8eaa8-d4: entered promiscuous mode
Jan 22 00:28:46 compute-0 NetworkManager[55139]: <info>  [1769041726.5022] manager: (tap06e8eaa8-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/317)
Jan 22 00:28:46 compute-0 systemd-udevd[240328]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:28:46 compute-0 ovn_controller[95047]: 2026-01-22T00:28:46Z|00649|binding|INFO|Claiming lport 06e8eaa8-d435-4dcd-af5c-959e8d49754d for this chassis.
Jan 22 00:28:46 compute-0 ovn_controller[95047]: 2026-01-22T00:28:46Z|00650|binding|INFO|06e8eaa8-d435-4dcd-af5c-959e8d49754d: Claiming fa:16:3e:d3:d1:68 10.100.0.3
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.537 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.540 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:46 compute-0 NetworkManager[55139]: <info>  [1769041726.5482] device (tap06e8eaa8-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:28:46 compute-0 NetworkManager[55139]: <info>  [1769041726.5491] device (tap06e8eaa8-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.551 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:d1:68 10.100.0.3'], port_security=['fa:16:3e:d3:d1:68 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c4b2800e-1798-4a1f-b4a2-e870e907eb2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8a99868-0c12-4b68-ad26-6e85ed918505', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c4d4a9b7-d02f-4e4d-8562-8ba78b9b38a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbe04a09-5fce-4283-b95b-9116ec0f414f, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=06e8eaa8-d435-4dcd-af5c-959e8d49754d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.552 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 06e8eaa8-d435-4dcd-af5c-959e8d49754d in datapath d8a99868-0c12-4b68-ad26-6e85ed918505 bound to our chassis
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.552 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d8a99868-0c12-4b68-ad26-6e85ed918505
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.567 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d3e220c3-69d2-4791-bf7a-3784f51219c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.568 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd8a99868-01 in ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.571 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd8a99868-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.571 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b1db57-5e60-4e04-8fc0-c10577b45882]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.572 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ed68c0-a7e4-4bb9-9e54-724947476baf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 systemd-machined[154182]: New machine qemu-85-instance-000000a7.
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.582 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[11719fb3-39c1-4eb1-95b5-d4b202efa6dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_controller[95047]: 2026-01-22T00:28:46Z|00651|binding|INFO|Setting lport 06e8eaa8-d435-4dcd-af5c-959e8d49754d ovn-installed in OVS
Jan 22 00:28:46 compute-0 ovn_controller[95047]: 2026-01-22T00:28:46Z|00652|binding|INFO|Setting lport 06e8eaa8-d435-4dcd-af5c-959e8d49754d up in Southbound
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.594 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.594 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4b20e160-1781-49ea-97f3-364c8b11a005]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 systemd[1]: Started Virtual Machine qemu-85-instance-000000a7.
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.631 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[4c089d7a-bcb7-4d4b-b9cc-7b5c1ed96f9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 NetworkManager[55139]: <info>  [1769041726.6387] manager: (tapd8a99868-00): new Veth device (/org/freedesktop/NetworkManager/Devices/318)
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.636 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[10b6f560-9dfa-4be2-a15d-9772cf653513]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.667 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[75275505-7880-4d4a-ab0a-4c99701fab58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.670 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2b3d78-bdf5-4072-b557-4f70586d8a5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 NetworkManager[55139]: <info>  [1769041726.6872] device (tapd8a99868-00): carrier: link connected
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.690 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8257f6e1-a0bc-4c70-9c1c-def624a0fa3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.705 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[482ab142-9c8e-4d74-8a49-3801449dce79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8a99868-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:8e:b3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 622942, 'reachable_time': 16992, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240364, 'error': None, 'target': 'ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.718 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2f23a556-d707-4ed3-8a8f-f68cb145544f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9c:8eb3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 622942, 'tstamp': 622942}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240365, 'error': None, 'target': 'ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.731 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf5a534-2ac7-4bc3-a7be-ffa8a020a78b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8a99868-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:8e:b3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 208], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 622942, 'reachable_time': 16992, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240366, 'error': None, 'target': 'ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.760 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d3936db5-2099-424c-8f25-5ca66eabdcfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.822 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[83ed4d1e-41ac-475a-ab1b-39f5c85ab875]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.824 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8a99868-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.824 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.824 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8a99868-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:28:46 compute-0 NetworkManager[55139]: <info>  [1769041726.8268] manager: (tapd8a99868-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Jan 22 00:28:46 compute-0 kernel: tapd8a99868-00: entered promiscuous mode
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.826 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.834 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd8a99868-00, col_values=(('external_ids', {'iface-id': '20a373d3-eb11-4b35-a16e-6f1df7e4cf14'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.835 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:46 compute-0 ovn_controller[95047]: 2026-01-22T00:28:46Z|00653|binding|INFO|Releasing lport 20a373d3-eb11-4b35-a16e-6f1df7e4cf14 from this chassis (sb_readonly=0)
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.835 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.847 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.848 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d8a99868-0c12-4b68-ad26-6e85ed918505.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d8a99868-0c12-4b68-ad26-6e85ed918505.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.849 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f81b6935-0206-451e-bd76-33440f76d36f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.850 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-d8a99868-0c12-4b68-ad26-6e85ed918505
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/d8a99868-0c12-4b68-ad26-6e85ed918505.pid.haproxy
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID d8a99868-0c12-4b68-ad26-6e85ed918505
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:28:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:28:46.850 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505', 'env', 'PROCESS_TAG=haproxy-d8a99868-0c12-4b68-ad26-6e85ed918505', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d8a99868-0c12-4b68-ad26-6e85ed918505.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.994 182939 DEBUG nova.compute.manager [req-3f19f664-8434-46be-aa4d-edb5bd6b8121 req-08ba4765-0ae4-4a02-af2f-652388322538 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.995 182939 DEBUG oslo_concurrency.lockutils [req-3f19f664-8434-46be-aa4d-edb5bd6b8121 req-08ba4765-0ae4-4a02-af2f-652388322538 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.995 182939 DEBUG oslo_concurrency.lockutils [req-3f19f664-8434-46be-aa4d-edb5bd6b8121 req-08ba4765-0ae4-4a02-af2f-652388322538 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.995 182939 DEBUG oslo_concurrency.lockutils [req-3f19f664-8434-46be-aa4d-edb5bd6b8121 req-08ba4765-0ae4-4a02-af2f-652388322538 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:46 compute-0 nova_compute[182935]: 2026-01-22 00:28:46.995 182939 DEBUG nova.compute.manager [req-3f19f664-8434-46be-aa4d-edb5bd6b8121 req-08ba4765-0ae4-4a02-af2f-652388322538 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Processing event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:28:47 compute-0 podman[240399]: 2026-01-22 00:28:47.204318297 +0000 UTC m=+0.044176393 container create b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 00:28:47 compute-0 systemd[1]: Started libpod-conmon-b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a.scope.
Jan 22 00:28:47 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:28:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee104473dc17322af9b7a7cf1c078f383cf4c7497d798a82a3a85ea9a01ae602/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:28:47 compute-0 podman[240399]: 2026-01-22 00:28:47.268369618 +0000 UTC m=+0.108227744 container init b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:28:47 compute-0 podman[240399]: 2026-01-22 00:28:47.276562525 +0000 UTC m=+0.116420611 container start b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:28:47 compute-0 podman[240399]: 2026-01-22 00:28:47.181692243 +0000 UTC m=+0.021550339 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:28:47 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240418]: [NOTICE]   (240425) : New worker (240428) forked
Jan 22 00:28:47 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240418]: [NOTICE]   (240425) : Loading success.
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.323 182939 DEBUG nova.compute.manager [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.325 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041727.3227217, c4b2800e-1798-4a1f-b4a2-e870e907eb2a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.325 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] VM Started (Lifecycle Event)
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.328 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.332 182939 INFO nova.virt.libvirt.driver [-] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Instance spawned successfully.
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.332 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.355 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.364 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.364 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.365 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.365 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.366 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.366 182939 DEBUG nova.virt.libvirt.driver [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.371 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.402 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.403 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041727.3231423, c4b2800e-1798-4a1f-b4a2-e870e907eb2a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.403 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] VM Paused (Lifecycle Event)
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.445 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.449 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041727.3275626, c4b2800e-1798-4a1f-b4a2-e870e907eb2a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.450 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] VM Resumed (Lifecycle Event)
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.477 182939 INFO nova.compute.manager [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Took 7.71 seconds to spawn the instance on the hypervisor.
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.481 182939 DEBUG nova.compute.manager [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.483 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.489 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.520 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.584 182939 INFO nova.compute.manager [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Took 9.32 seconds to build instance.
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.609 182939 DEBUG oslo_concurrency.lockutils [None req-41b9c993-1f94-4f8d-97f9-f0afbe85b9bb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.798 182939 DEBUG nova.network.neutron [req-7047c21c-83ea-41c4-ae5b-f0df485e2d7b req-6f7ebce6-f1a9-40f7-a377-c8999ecfaee6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Updated VIF entry in instance network info cache for port 06e8eaa8-d435-4dcd-af5c-959e8d49754d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.800 182939 DEBUG nova.network.neutron [req-7047c21c-83ea-41c4-ae5b-f0df485e2d7b req-6f7ebce6-f1a9-40f7-a377-c8999ecfaee6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Updating instance_info_cache with network_info: [{"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:28:47 compute-0 nova_compute[182935]: 2026-01-22 00:28:47.820 182939 DEBUG oslo_concurrency.lockutils [req-7047c21c-83ea-41c4-ae5b-f0df485e2d7b req-6f7ebce6-f1a9-40f7-a377-c8999ecfaee6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:28:49 compute-0 nova_compute[182935]: 2026-01-22 00:28:49.127 182939 DEBUG nova.compute.manager [req-5bc624c7-2da1-4713-9a72-af90c2620c0e req-11f80747-82a0-4c4b-ac2d-bab594d156bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:28:49 compute-0 nova_compute[182935]: 2026-01-22 00:28:49.128 182939 DEBUG oslo_concurrency.lockutils [req-5bc624c7-2da1-4713-9a72-af90c2620c0e req-11f80747-82a0-4c4b-ac2d-bab594d156bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:49 compute-0 nova_compute[182935]: 2026-01-22 00:28:49.128 182939 DEBUG oslo_concurrency.lockutils [req-5bc624c7-2da1-4713-9a72-af90c2620c0e req-11f80747-82a0-4c4b-ac2d-bab594d156bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:49 compute-0 nova_compute[182935]: 2026-01-22 00:28:49.129 182939 DEBUG oslo_concurrency.lockutils [req-5bc624c7-2da1-4713-9a72-af90c2620c0e req-11f80747-82a0-4c4b-ac2d-bab594d156bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:49 compute-0 nova_compute[182935]: 2026-01-22 00:28:49.129 182939 DEBUG nova.compute.manager [req-5bc624c7-2da1-4713-9a72-af90c2620c0e req-11f80747-82a0-4c4b-ac2d-bab594d156bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] No waiting events found dispatching network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:28:49 compute-0 nova_compute[182935]: 2026-01-22 00:28:49.129 182939 WARNING nova.compute.manager [req-5bc624c7-2da1-4713-9a72-af90c2620c0e req-11f80747-82a0-4c4b-ac2d-bab594d156bd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received unexpected event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d for instance with vm_state active and task_state None.
Jan 22 00:28:49 compute-0 nova_compute[182935]: 2026-01-22 00:28:49.168 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:50 compute-0 nova_compute[182935]: 2026-01-22 00:28:50.464 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:51 compute-0 nova_compute[182935]: 2026-01-22 00:28:51.824 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:51 compute-0 nova_compute[182935]: 2026-01-22 00:28:51.847 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:51 compute-0 nova_compute[182935]: 2026-01-22 00:28:51.847 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:51 compute-0 nova_compute[182935]: 2026-01-22 00:28:51.847 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:51 compute-0 nova_compute[182935]: 2026-01-22 00:28:51.848 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:28:51 compute-0 nova_compute[182935]: 2026-01-22 00:28:51.917 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.011 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.012 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.067 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.221 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.223 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5567MB free_disk=73.11809539794922GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.223 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.223 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.315 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance c4b2800e-1798-4a1f-b4a2-e870e907eb2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.315 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.316 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.358 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.375 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.410 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:28:52 compute-0 nova_compute[182935]: 2026-01-22 00:28:52.410 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:28:53 compute-0 NetworkManager[55139]: <info>  [1769041733.0362] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Jan 22 00:28:53 compute-0 NetworkManager[55139]: <info>  [1769041733.0378] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/321)
Jan 22 00:28:53 compute-0 nova_compute[182935]: 2026-01-22 00:28:53.038 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:53 compute-0 ovn_controller[95047]: 2026-01-22T00:28:53Z|00654|binding|INFO|Releasing lport 20a373d3-eb11-4b35-a16e-6f1df7e4cf14 from this chassis (sb_readonly=0)
Jan 22 00:28:53 compute-0 nova_compute[182935]: 2026-01-22 00:28:53.065 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:53 compute-0 ovn_controller[95047]: 2026-01-22T00:28:53Z|00655|binding|INFO|Releasing lport 20a373d3-eb11-4b35-a16e-6f1df7e4cf14 from this chassis (sb_readonly=0)
Jan 22 00:28:53 compute-0 nova_compute[182935]: 2026-01-22 00:28:53.071 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:53 compute-0 sshd-session[240445]: Invalid user mongodb from 188.166.69.60 port 46268
Jan 22 00:28:53 compute-0 sshd-session[240445]: Connection closed by invalid user mongodb 188.166.69.60 port 46268 [preauth]
Jan 22 00:28:54 compute-0 nova_compute[182935]: 2026-01-22 00:28:54.055 182939 DEBUG nova.compute.manager [req-99da8fdb-3ec4-4649-b1c1-f61ea8e632f3 req-47789aeb-a655-4944-ba2d-6d43545d36ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-changed-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:28:54 compute-0 nova_compute[182935]: 2026-01-22 00:28:54.057 182939 DEBUG nova.compute.manager [req-99da8fdb-3ec4-4649-b1c1-f61ea8e632f3 req-47789aeb-a655-4944-ba2d-6d43545d36ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Refreshing instance network info cache due to event network-changed-06e8eaa8-d435-4dcd-af5c-959e8d49754d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:28:54 compute-0 nova_compute[182935]: 2026-01-22 00:28:54.057 182939 DEBUG oslo_concurrency.lockutils [req-99da8fdb-3ec4-4649-b1c1-f61ea8e632f3 req-47789aeb-a655-4944-ba2d-6d43545d36ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:28:54 compute-0 nova_compute[182935]: 2026-01-22 00:28:54.058 182939 DEBUG oslo_concurrency.lockutils [req-99da8fdb-3ec4-4649-b1c1-f61ea8e632f3 req-47789aeb-a655-4944-ba2d-6d43545d36ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:28:54 compute-0 nova_compute[182935]: 2026-01-22 00:28:54.058 182939 DEBUG nova.network.neutron [req-99da8fdb-3ec4-4649-b1c1-f61ea8e632f3 req-47789aeb-a655-4944-ba2d-6d43545d36ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Refreshing network info cache for port 06e8eaa8-d435-4dcd-af5c-959e8d49754d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:28:54 compute-0 nova_compute[182935]: 2026-01-22 00:28:54.197 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:55 compute-0 nova_compute[182935]: 2026-01-22 00:28:55.380 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:55 compute-0 nova_compute[182935]: 2026-01-22 00:28:55.381 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:28:55 compute-0 nova_compute[182935]: 2026-01-22 00:28:55.381 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:28:55 compute-0 nova_compute[182935]: 2026-01-22 00:28:55.458 182939 DEBUG nova.network.neutron [req-99da8fdb-3ec4-4649-b1c1-f61ea8e632f3 req-47789aeb-a655-4944-ba2d-6d43545d36ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Updated VIF entry in instance network info cache for port 06e8eaa8-d435-4dcd-af5c-959e8d49754d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:28:55 compute-0 nova_compute[182935]: 2026-01-22 00:28:55.459 182939 DEBUG nova.network.neutron [req-99da8fdb-3ec4-4649-b1c1-f61ea8e632f3 req-47789aeb-a655-4944-ba2d-6d43545d36ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Updating instance_info_cache with network_info: [{"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:28:55 compute-0 nova_compute[182935]: 2026-01-22 00:28:55.466 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:28:55 compute-0 nova_compute[182935]: 2026-01-22 00:28:55.478 182939 DEBUG oslo_concurrency.lockutils [req-99da8fdb-3ec4-4649-b1c1-f61ea8e632f3 req-47789aeb-a655-4944-ba2d-6d43545d36ec 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:28:55 compute-0 nova_compute[182935]: 2026-01-22 00:28:55.673 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:28:55 compute-0 nova_compute[182935]: 2026-01-22 00:28:55.674 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:28:55 compute-0 nova_compute[182935]: 2026-01-22 00:28:55.675 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:28:55 compute-0 nova_compute[182935]: 2026-01-22 00:28:55.675 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:28:57 compute-0 nova_compute[182935]: 2026-01-22 00:28:57.584 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Updating instance_info_cache with network_info: [{"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:28:57 compute-0 nova_compute[182935]: 2026-01-22 00:28:57.602 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:28:57 compute-0 nova_compute[182935]: 2026-01-22 00:28:57.603 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:28:57 compute-0 nova_compute[182935]: 2026-01-22 00:28:57.603 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:57 compute-0 nova_compute[182935]: 2026-01-22 00:28:57.604 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:28:57 compute-0 nova_compute[182935]: 2026-01-22 00:28:57.604 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:28:59 compute-0 nova_compute[182935]: 2026-01-22 00:28:59.200 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:00 compute-0 nova_compute[182935]: 2026-01-22 00:29:00.469 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:00 compute-0 ovn_controller[95047]: 2026-01-22T00:29:00Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:d1:68 10.100.0.3
Jan 22 00:29:00 compute-0 ovn_controller[95047]: 2026-01-22T00:29:00Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:d1:68 10.100.0.3
Jan 22 00:29:01 compute-0 podman[240467]: 2026-01-22 00:29:01.691621396 +0000 UTC m=+0.061862589 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:29:01 compute-0 podman[240465]: 2026-01-22 00:29:01.711663519 +0000 UTC m=+0.077966197 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:29:01 compute-0 podman[240466]: 2026-01-22 00:29:01.768182738 +0000 UTC m=+0.134487256 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 22 00:29:01 compute-0 nova_compute[182935]: 2026-01-22 00:29:01.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:01 compute-0 nova_compute[182935]: 2026-01-22 00:29:01.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:03.225 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:03.226 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:03.226 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:04 compute-0 nova_compute[182935]: 2026-01-22 00:29:04.258 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:04 compute-0 nova_compute[182935]: 2026-01-22 00:29:04.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:05 compute-0 nova_compute[182935]: 2026-01-22 00:29:05.507 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:05 compute-0 nova_compute[182935]: 2026-01-22 00:29:05.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:07 compute-0 nova_compute[182935]: 2026-01-22 00:29:07.399 182939 INFO nova.compute.manager [None req-bd585206-7259-4e4b-ac7b-5f45a53b563b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Get console output
Jan 22 00:29:07 compute-0 nova_compute[182935]: 2026-01-22 00:29:07.405 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:29:07 compute-0 nova_compute[182935]: 2026-01-22 00:29:07.723 182939 DEBUG oslo_concurrency.lockutils [None req-0c4d5ec9-c5c8-49db-bad8-599a1436205a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:07 compute-0 nova_compute[182935]: 2026-01-22 00:29:07.723 182939 DEBUG oslo_concurrency.lockutils [None req-0c4d5ec9-c5c8-49db-bad8-599a1436205a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:07 compute-0 nova_compute[182935]: 2026-01-22 00:29:07.724 182939 DEBUG nova.compute.manager [None req-0c4d5ec9-c5c8-49db-bad8-599a1436205a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:29:07 compute-0 nova_compute[182935]: 2026-01-22 00:29:07.727 182939 DEBUG nova.compute.manager [None req-0c4d5ec9-c5c8-49db-bad8-599a1436205a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Jan 22 00:29:07 compute-0 nova_compute[182935]: 2026-01-22 00:29:07.728 182939 DEBUG nova.objects.instance [None req-0c4d5ec9-c5c8-49db-bad8-599a1436205a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'flavor' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:07 compute-0 nova_compute[182935]: 2026-01-22 00:29:07.755 182939 DEBUG nova.objects.instance [None req-0c4d5ec9-c5c8-49db-bad8-599a1436205a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'info_cache' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:07 compute-0 nova_compute[182935]: 2026-01-22 00:29:07.786 182939 DEBUG nova.virt.libvirt.driver [None req-0c4d5ec9-c5c8-49db-bad8-599a1436205a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:29:09 compute-0 nova_compute[182935]: 2026-01-22 00:29:09.261 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:09 compute-0 podman[240542]: 2026-01-22 00:29:09.692785033 +0000 UTC m=+0.063018697 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:29:09 compute-0 kernel: tap06e8eaa8-d4 (unregistering): left promiscuous mode
Jan 22 00:29:09 compute-0 NetworkManager[55139]: <info>  [1769041749.9821] device (tap06e8eaa8-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:29:09 compute-0 ovn_controller[95047]: 2026-01-22T00:29:09Z|00656|binding|INFO|Releasing lport 06e8eaa8-d435-4dcd-af5c-959e8d49754d from this chassis (sb_readonly=0)
Jan 22 00:29:09 compute-0 ovn_controller[95047]: 2026-01-22T00:29:09Z|00657|binding|INFO|Setting lport 06e8eaa8-d435-4dcd-af5c-959e8d49754d down in Southbound
Jan 22 00:29:09 compute-0 nova_compute[182935]: 2026-01-22 00:29:09.990 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:09 compute-0 ovn_controller[95047]: 2026-01-22T00:29:09Z|00658|binding|INFO|Removing iface tap06e8eaa8-d4 ovn-installed in OVS
Jan 22 00:29:09 compute-0 nova_compute[182935]: 2026-01-22 00:29:09.993 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.000 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:d1:68 10.100.0.3'], port_security=['fa:16:3e:d3:d1:68 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c4b2800e-1798-4a1f-b4a2-e870e907eb2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8a99868-0c12-4b68-ad26-6e85ed918505', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c4d4a9b7-d02f-4e4d-8562-8ba78b9b38a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.222'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbe04a09-5fce-4283-b95b-9116ec0f414f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=06e8eaa8-d435-4dcd-af5c-959e8d49754d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.001 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 06e8eaa8-d435-4dcd-af5c-959e8d49754d in datapath d8a99868-0c12-4b68-ad26-6e85ed918505 unbound from our chassis
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.002 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8a99868-0c12-4b68-ad26-6e85ed918505, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.003 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d0098877-0315-4084-ba62-bf979edb8619]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.004 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505 namespace which is not needed anymore
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.009 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:10 compute-0 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000a7.scope: Deactivated successfully.
Jan 22 00:29:10 compute-0 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000a7.scope: Consumed 13.154s CPU time.
Jan 22 00:29:10 compute-0 systemd-machined[154182]: Machine qemu-85-instance-000000a7 terminated.
Jan 22 00:29:10 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240418]: [NOTICE]   (240425) : haproxy version is 2.8.14-c23fe91
Jan 22 00:29:10 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240418]: [NOTICE]   (240425) : path to executable is /usr/sbin/haproxy
Jan 22 00:29:10 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240418]: [WARNING]  (240425) : Exiting Master process...
Jan 22 00:29:10 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240418]: [WARNING]  (240425) : Exiting Master process...
Jan 22 00:29:10 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240418]: [ALERT]    (240425) : Current worker (240428) exited with code 143 (Terminated)
Jan 22 00:29:10 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240418]: [WARNING]  (240425) : All workers exited. Exiting... (0)
Jan 22 00:29:10 compute-0 systemd[1]: libpod-b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a.scope: Deactivated successfully.
Jan 22 00:29:10 compute-0 podman[240585]: 2026-01-22 00:29:10.138416161 +0000 UTC m=+0.048217380 container died b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 00:29:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a-userdata-shm.mount: Deactivated successfully.
Jan 22 00:29:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-ee104473dc17322af9b7a7cf1c078f383cf4c7497d798a82a3a85ea9a01ae602-merged.mount: Deactivated successfully.
Jan 22 00:29:10 compute-0 podman[240585]: 2026-01-22 00:29:10.173125396 +0000 UTC m=+0.082926615 container cleanup b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:29:10 compute-0 systemd[1]: libpod-conmon-b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a.scope: Deactivated successfully.
Jan 22 00:29:10 compute-0 podman[240614]: 2026-01-22 00:29:10.240059496 +0000 UTC m=+0.042267147 container remove b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.246 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[257849ef-676f-4c3d-a927-7b61c8aaa33c]: (4, ('Thu Jan 22 12:29:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505 (b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a)\nb18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a\nThu Jan 22 12:29:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505 (b18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a)\nb18364ad42e59bf74426f6adcb3fe75d0192a5a4430d048e76c83ef26edbbd9a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.248 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[eba8d31a-a219-41ef-8fde-43e36ffb3726]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.249 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8a99868-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.277 182939 DEBUG nova.compute.manager [req-cb04a56e-41bf-43ab-9867-241c3ed6dbf3 req-63b8e527-12ba-4790-9a4d-5a860cb1ef3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-vif-unplugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.277 182939 DEBUG oslo_concurrency.lockutils [req-cb04a56e-41bf-43ab-9867-241c3ed6dbf3 req-63b8e527-12ba-4790-9a4d-5a860cb1ef3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.277 182939 DEBUG oslo_concurrency.lockutils [req-cb04a56e-41bf-43ab-9867-241c3ed6dbf3 req-63b8e527-12ba-4790-9a4d-5a860cb1ef3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.277 182939 DEBUG oslo_concurrency.lockutils [req-cb04a56e-41bf-43ab-9867-241c3ed6dbf3 req-63b8e527-12ba-4790-9a4d-5a860cb1ef3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.277 182939 DEBUG nova.compute.manager [req-cb04a56e-41bf-43ab-9867-241c3ed6dbf3 req-63b8e527-12ba-4790-9a4d-5a860cb1ef3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] No waiting events found dispatching network-vif-unplugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.278 182939 WARNING nova.compute.manager [req-cb04a56e-41bf-43ab-9867-241c3ed6dbf3 req-63b8e527-12ba-4790-9a4d-5a860cb1ef3b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received unexpected event network-vif-unplugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d for instance with vm_state active and task_state powering-off.
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.283 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:10 compute-0 kernel: tapd8a99868-00: left promiscuous mode
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.299 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.302 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9c844d91-1f67-4be5-a03c-56b42c716516]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.318 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[77a78beb-4bba-46c0-a165-27343e2e34bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.319 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[94650bad-f092-4e6a-a51f-8f73a90c6aa0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.336 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9a36a5fa-af5b-4549-9966-ce39ab72c2e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 622936, 'reachable_time': 25226, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240651, 'error': None, 'target': 'ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.339 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:29:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:10.339 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a1c3cf-7727-496e-b9cd-fe70e659bc96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:10 compute-0 systemd[1]: run-netns-ovnmeta\x2dd8a99868\x2d0c12\x2d4b68\x2dad26\x2d6e85ed918505.mount: Deactivated successfully.
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.510 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.856 182939 INFO nova.virt.libvirt.driver [None req-0c4d5ec9-c5c8-49db-bad8-599a1436205a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Instance shutdown successfully after 3 seconds.
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.862 182939 INFO nova.virt.libvirt.driver [-] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Instance destroyed successfully.
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.863 182939 DEBUG nova.objects.instance [None req-0c4d5ec9-c5c8-49db-bad8-599a1436205a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'numa_topology' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.880 182939 DEBUG nova.compute.manager [None req-0c4d5ec9-c5c8-49db-bad8-599a1436205a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:29:10 compute-0 nova_compute[182935]: 2026-01-22 00:29:10.979 182939 DEBUG oslo_concurrency.lockutils [None req-0c4d5ec9-c5c8-49db-bad8-599a1436205a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:12 compute-0 nova_compute[182935]: 2026-01-22 00:29:12.344 182939 DEBUG nova.compute.manager [req-2a614085-e8ba-468c-b3e4-444628135f6f req-76f8be32-d393-4068-98e9-07ff014d33d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:29:12 compute-0 nova_compute[182935]: 2026-01-22 00:29:12.344 182939 DEBUG oslo_concurrency.lockutils [req-2a614085-e8ba-468c-b3e4-444628135f6f req-76f8be32-d393-4068-98e9-07ff014d33d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:12 compute-0 nova_compute[182935]: 2026-01-22 00:29:12.345 182939 DEBUG oslo_concurrency.lockutils [req-2a614085-e8ba-468c-b3e4-444628135f6f req-76f8be32-d393-4068-98e9-07ff014d33d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:12 compute-0 nova_compute[182935]: 2026-01-22 00:29:12.345 182939 DEBUG oslo_concurrency.lockutils [req-2a614085-e8ba-468c-b3e4-444628135f6f req-76f8be32-d393-4068-98e9-07ff014d33d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:12 compute-0 nova_compute[182935]: 2026-01-22 00:29:12.345 182939 DEBUG nova.compute.manager [req-2a614085-e8ba-468c-b3e4-444628135f6f req-76f8be32-d393-4068-98e9-07ff014d33d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] No waiting events found dispatching network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:29:12 compute-0 nova_compute[182935]: 2026-01-22 00:29:12.345 182939 WARNING nova.compute.manager [req-2a614085-e8ba-468c-b3e4-444628135f6f req-76f8be32-d393-4068-98e9-07ff014d33d1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received unexpected event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d for instance with vm_state stopped and task_state None.
Jan 22 00:29:14 compute-0 nova_compute[182935]: 2026-01-22 00:29:14.313 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:15 compute-0 nova_compute[182935]: 2026-01-22 00:29:15.062 182939 INFO nova.compute.manager [None req-c97647df-0e2b-47c5-975a-9ff5775bec68 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Get console output
Jan 22 00:29:15 compute-0 nova_compute[182935]: 2026-01-22 00:29:15.391 182939 DEBUG nova.objects.instance [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'flavor' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:15 compute-0 nova_compute[182935]: 2026-01-22 00:29:15.416 182939 DEBUG nova.objects.instance [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'info_cache' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:15 compute-0 nova_compute[182935]: 2026-01-22 00:29:15.440 182939 DEBUG oslo_concurrency.lockutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:29:15 compute-0 nova_compute[182935]: 2026-01-22 00:29:15.440 182939 DEBUG oslo_concurrency.lockutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:29:15 compute-0 nova_compute[182935]: 2026-01-22 00:29:15.440 182939 DEBUG nova.network.neutron [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:29:15 compute-0 nova_compute[182935]: 2026-01-22 00:29:15.512 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:15 compute-0 podman[240652]: 2026-01-22 00:29:15.691937353 +0000 UTC m=+0.059859701 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_id=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:29:15 compute-0 podman[240653]: 2026-01-22 00:29:15.707586969 +0000 UTC m=+0.069269208 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.480 182939 DEBUG nova.network.neutron [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Updating instance_info_cache with network_info: [{"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.500 182939 DEBUG oslo_concurrency.lockutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.523 182939 INFO nova.virt.libvirt.driver [-] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Instance destroyed successfully.
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.524 182939 DEBUG nova.objects.instance [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'numa_topology' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.535 182939 DEBUG nova.objects.instance [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.547 182939 DEBUG nova.virt.libvirt.vif [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:28:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-380645357',display_name='tempest-TestNetworkAdvancedServerOps-server-380645357',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-380645357',id=167,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH7XBLZCDFN1SjQd5pdAiqzITq+BHor59G0Z0hkCJ16rGPV5ckUS2yxojxpRQnEFr5g70UZgNl62OwdVVuKhMenMnTtj7l+/b1lrTkAtU0xv67CvUH7GWdEje0B1GYX0Bg==',key_name='tempest-TestNetworkAdvancedServerOps-346227063',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:28:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-4t40upso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:29:10Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=c4b2800e-1798-4a1f-b4a2-e870e907eb2a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.548 182939 DEBUG nova.network.os_vif_util [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.548 182939 DEBUG nova.network.os_vif_util [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.549 182939 DEBUG os_vif [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.550 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.550 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e8eaa8-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.592 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.594 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.597 182939 INFO os_vif [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4')
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.602 182939 DEBUG nova.virt.libvirt.driver [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Start _get_guest_xml network_info=[{"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.605 182939 WARNING nova.virt.libvirt.driver [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.610 182939 DEBUG nova.virt.libvirt.host [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.610 182939 DEBUG nova.virt.libvirt.host [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.614 182939 DEBUG nova.virt.libvirt.host [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.614 182939 DEBUG nova.virt.libvirt.host [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.615 182939 DEBUG nova.virt.libvirt.driver [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.615 182939 DEBUG nova.virt.hardware [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.616 182939 DEBUG nova.virt.hardware [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.616 182939 DEBUG nova.virt.hardware [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.616 182939 DEBUG nova.virt.hardware [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.616 182939 DEBUG nova.virt.hardware [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.616 182939 DEBUG nova.virt.hardware [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.617 182939 DEBUG nova.virt.hardware [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.617 182939 DEBUG nova.virt.hardware [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.617 182939 DEBUG nova.virt.hardware [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.617 182939 DEBUG nova.virt.hardware [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.617 182939 DEBUG nova.virt.hardware [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.618 182939 DEBUG nova.objects.instance [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'vcpu_model' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.634 182939 DEBUG oslo_concurrency.processutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.692 182939 DEBUG oslo_concurrency.processutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.config --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.694 182939 DEBUG oslo_concurrency.lockutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.694 182939 DEBUG oslo_concurrency.lockutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.695 182939 DEBUG oslo_concurrency.lockutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.696 182939 DEBUG nova.virt.libvirt.vif [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:28:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-380645357',display_name='tempest-TestNetworkAdvancedServerOps-server-380645357',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-380645357',id=167,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH7XBLZCDFN1SjQd5pdAiqzITq+BHor59G0Z0hkCJ16rGPV5ckUS2yxojxpRQnEFr5g70UZgNl62OwdVVuKhMenMnTtj7l+/b1lrTkAtU0xv67CvUH7GWdEje0B1GYX0Bg==',key_name='tempest-TestNetworkAdvancedServerOps-346227063',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:28:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-4t40upso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:29:10Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=c4b2800e-1798-4a1f-b4a2-e870e907eb2a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.697 182939 DEBUG nova.network.os_vif_util [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.697 182939 DEBUG nova.network.os_vif_util [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.698 182939 DEBUG nova.objects.instance [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.717 182939 DEBUG nova.virt.libvirt.driver [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:29:16 compute-0 nova_compute[182935]:   <uuid>c4b2800e-1798-4a1f-b4a2-e870e907eb2a</uuid>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   <name>instance-000000a7</name>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-380645357</nova:name>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:29:16</nova:creationTime>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:29:16 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:29:16 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:29:16 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:29:16 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:29:16 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:29:16 compute-0 nova_compute[182935]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:29:16 compute-0 nova_compute[182935]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:29:16 compute-0 nova_compute[182935]:         <nova:port uuid="06e8eaa8-d435-4dcd-af5c-959e8d49754d">
Jan 22 00:29:16 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <system>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <entry name="serial">c4b2800e-1798-4a1f-b4a2-e870e907eb2a</entry>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <entry name="uuid">c4b2800e-1798-4a1f-b4a2-e870e907eb2a</entry>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     </system>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   <os>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   </os>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   <features>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   </features>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk.config"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:d3:d1:68"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <target dev="tap06e8eaa8-d4"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/console.log" append="off"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <video>
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     </video>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <input type="keyboard" bus="usb"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:29:16 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:29:16 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:29:16 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:29:16 compute-0 nova_compute[182935]: </domain>
Jan 22 00:29:16 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.720 182939 DEBUG oslo_concurrency.processutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.781 182939 DEBUG oslo_concurrency.processutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.783 182939 DEBUG oslo_concurrency.processutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.841 182939 DEBUG oslo_concurrency.processutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.843 182939 DEBUG nova.objects.instance [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'trusted_certs' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.856 182939 DEBUG oslo_concurrency.processutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.913 182939 DEBUG oslo_concurrency.processutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.914 182939 DEBUG nova.virt.disk.api [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.914 182939 DEBUG oslo_concurrency.processutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.969 182939 DEBUG oslo_concurrency.processutils [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.971 182939 DEBUG nova.virt.disk.api [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.972 182939 DEBUG nova.objects.instance [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.986 182939 DEBUG nova.virt.libvirt.vif [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:28:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-380645357',display_name='tempest-TestNetworkAdvancedServerOps-server-380645357',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-380645357',id=167,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH7XBLZCDFN1SjQd5pdAiqzITq+BHor59G0Z0hkCJ16rGPV5ckUS2yxojxpRQnEFr5g70UZgNl62OwdVVuKhMenMnTtj7l+/b1lrTkAtU0xv67CvUH7GWdEje0B1GYX0Bg==',key_name='tempest-TestNetworkAdvancedServerOps-346227063',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:28:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-4t40upso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:29:10Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=c4b2800e-1798-4a1f-b4a2-e870e907eb2a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.987 182939 DEBUG nova.network.os_vif_util [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.987 182939 DEBUG nova.network.os_vif_util [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.988 182939 DEBUG os_vif [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.988 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.989 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.990 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.992 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.992 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e8eaa8-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.993 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06e8eaa8-d4, col_values=(('external_ids', {'iface-id': '06e8eaa8-d435-4dcd-af5c-959e8d49754d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:d1:68', 'vm-uuid': 'c4b2800e-1798-4a1f-b4a2-e870e907eb2a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.995 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:16 compute-0 NetworkManager[55139]: <info>  [1769041756.9956] manager: (tap06e8eaa8-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.998 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:29:16 compute-0 nova_compute[182935]: 2026-01-22 00:29:16.999 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.000 182939 INFO os_vif [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4')
Jan 22 00:29:17 compute-0 kernel: tap06e8eaa8-d4: entered promiscuous mode
Jan 22 00:29:17 compute-0 NetworkManager[55139]: <info>  [1769041757.0783] manager: (tap06e8eaa8-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/323)
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.078 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:17 compute-0 ovn_controller[95047]: 2026-01-22T00:29:17Z|00659|binding|INFO|Claiming lport 06e8eaa8-d435-4dcd-af5c-959e8d49754d for this chassis.
Jan 22 00:29:17 compute-0 ovn_controller[95047]: 2026-01-22T00:29:17Z|00660|binding|INFO|06e8eaa8-d435-4dcd-af5c-959e8d49754d: Claiming fa:16:3e:d3:d1:68 10.100.0.3
Jan 22 00:29:17 compute-0 ovn_controller[95047]: 2026-01-22T00:29:17Z|00661|binding|INFO|Setting lport 06e8eaa8-d435-4dcd-af5c-959e8d49754d ovn-installed in OVS
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.093 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.094 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:17 compute-0 systemd-udevd[240723]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:29:17 compute-0 systemd-machined[154182]: New machine qemu-86-instance-000000a7.
Jan 22 00:29:17 compute-0 ovn_controller[95047]: 2026-01-22T00:29:17Z|00662|binding|INFO|Setting lport 06e8eaa8-d435-4dcd-af5c-959e8d49754d up in Southbound
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.117 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:d1:68 10.100.0.3'], port_security=['fa:16:3e:d3:d1:68 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c4b2800e-1798-4a1f-b4a2-e870e907eb2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8a99868-0c12-4b68-ad26-6e85ed918505', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c4d4a9b7-d02f-4e4d-8562-8ba78b9b38a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.222'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbe04a09-5fce-4283-b95b-9116ec0f414f, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=06e8eaa8-d435-4dcd-af5c-959e8d49754d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:29:17 compute-0 NetworkManager[55139]: <info>  [1769041757.1187] device (tap06e8eaa8-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.118 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 06e8eaa8-d435-4dcd-af5c-959e8d49754d in datapath d8a99868-0c12-4b68-ad26-6e85ed918505 bound to our chassis
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.119 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d8a99868-0c12-4b68-ad26-6e85ed918505
Jan 22 00:29:17 compute-0 NetworkManager[55139]: <info>  [1769041757.1199] device (tap06e8eaa8-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:29:17 compute-0 systemd[1]: Started Virtual Machine qemu-86-instance-000000a7.
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.131 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1d15050b-6c88-410b-b58b-9927982bae55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.132 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd8a99868-01 in ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.133 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd8a99868-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.134 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f663b1fc-d861-4bb1-a19d-3a261f423ddb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.137 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[06a7fbfe-fe0b-4f0f-b5f6-e8d5874400bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.148 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[d54df1b4-b980-4c6a-8d9e-a392513e2c87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.171 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[59a5a259-d4b4-4c69-8fe0-09fcae36d73a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.199 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[173d3d7b-a9ca-4a06-a875-89359cd36b63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.204 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[25ed6df9-6b57-4d67-a4d2-d89f7789aa90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 NetworkManager[55139]: <info>  [1769041757.2053] manager: (tapd8a99868-00): new Veth device (/org/freedesktop/NetworkManager/Devices/324)
Jan 22 00:29:17 compute-0 systemd-udevd[240726]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.233 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[71a2f710-7bd4-4f4d-a5d7-cc4597901cf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.237 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[79bd180f-8b92-4cfe-bdac-21788797ef82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 NetworkManager[55139]: <info>  [1769041757.2564] device (tapd8a99868-00): carrier: link connected
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.262 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[990a3efe-b544-471f-a744-a752a7c128df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.277 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3df4d6-f5b5-4183-ab7c-3fbb8b5b1da9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8a99868-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:8e:b3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625999, 'reachable_time': 34055, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240757, 'error': None, 'target': 'ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.293 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0d494852-7e52-439f-ad32-b8c71b8c8a12]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9c:8eb3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 625999, 'tstamp': 625999}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240758, 'error': None, 'target': 'ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.306 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1f0b22ae-f697-4b2d-8887-32aef6999454]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd8a99868-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:8e:b3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625999, 'reachable_time': 34055, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240759, 'error': None, 'target': 'ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.333 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b078a5ae-9541-4843-bb80-f591fd3bbb11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.376 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[55110941-8b82-44ac-95fe-15924ac606c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.377 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8a99868-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.378 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.378 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd8a99868-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.379 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:17 compute-0 NetworkManager[55139]: <info>  [1769041757.3805] manager: (tapd8a99868-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Jan 22 00:29:17 compute-0 kernel: tapd8a99868-00: entered promiscuous mode
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.385 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.392 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd8a99868-00, col_values=(('external_ids', {'iface-id': '20a373d3-eb11-4b35-a16e-6f1df7e4cf14'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.393 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:17 compute-0 ovn_controller[95047]: 2026-01-22T00:29:17Z|00663|binding|INFO|Releasing lport 20a373d3-eb11-4b35-a16e-6f1df7e4cf14 from this chassis (sb_readonly=0)
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.394 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.405 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.405 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d8a99868-0c12-4b68-ad26-6e85ed918505.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d8a99868-0c12-4b68-ad26-6e85ed918505.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.406 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d4829cff-9b8e-4697-a645-482601fcd8ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.406 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-d8a99868-0c12-4b68-ad26-6e85ed918505
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/d8a99868-0c12-4b68-ad26-6e85ed918505.pid.haproxy
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID d8a99868-0c12-4b68-ad26-6e85ed918505
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:29:17 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:17.407 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505', 'env', 'PROCESS_TAG=haproxy-d8a99868-0c12-4b68-ad26-6e85ed918505', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d8a99868-0c12-4b68-ad26-6e85ed918505.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:29:17 compute-0 podman[240791]: 2026-01-22 00:29:17.725974737 +0000 UTC m=+0.044256955 container create 0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:29:17 compute-0 systemd[1]: Started libpod-conmon-0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505.scope.
Jan 22 00:29:17 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:29:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0731ec8c4ba94dec740f8ddf634c83489a8a28a01225112491b78404b046b58/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:29:17 compute-0 podman[240791]: 2026-01-22 00:29:17.703972109 +0000 UTC m=+0.022254347 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:29:17 compute-0 podman[240791]: 2026-01-22 00:29:17.806598047 +0000 UTC m=+0.124880295 container init 0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 22 00:29:17 compute-0 podman[240791]: 2026-01-22 00:29:17.811850444 +0000 UTC m=+0.130132672 container start 0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.811 182939 DEBUG nova.compute.manager [req-01c17241-d9ba-4b4b-859f-7e93cc343af3 req-0d76b543-285f-4f68-a4cb-8de5d8fba9f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.812 182939 DEBUG oslo_concurrency.lockutils [req-01c17241-d9ba-4b4b-859f-7e93cc343af3 req-0d76b543-285f-4f68-a4cb-8de5d8fba9f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.812 182939 DEBUG oslo_concurrency.lockutils [req-01c17241-d9ba-4b4b-859f-7e93cc343af3 req-0d76b543-285f-4f68-a4cb-8de5d8fba9f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.812 182939 DEBUG oslo_concurrency.lockutils [req-01c17241-d9ba-4b4b-859f-7e93cc343af3 req-0d76b543-285f-4f68-a4cb-8de5d8fba9f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.813 182939 DEBUG nova.compute.manager [req-01c17241-d9ba-4b4b-859f-7e93cc343af3 req-0d76b543-285f-4f68-a4cb-8de5d8fba9f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] No waiting events found dispatching network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:29:17 compute-0 nova_compute[182935]: 2026-01-22 00:29:17.813 182939 WARNING nova.compute.manager [req-01c17241-d9ba-4b4b-859f-7e93cc343af3 req-0d76b543-285f-4f68-a4cb-8de5d8fba9f5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received unexpected event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d for instance with vm_state stopped and task_state powering-on.
Jan 22 00:29:17 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240806]: [NOTICE]   (240810) : New worker (240812) forked
Jan 22 00:29:17 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240806]: [NOTICE]   (240810) : Loading success.
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.021 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for c4b2800e-1798-4a1f-b4a2-e870e907eb2a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.024 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041758.0201514, c4b2800e-1798-4a1f-b4a2-e870e907eb2a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.025 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] VM Resumed (Lifecycle Event)
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.028 182939 DEBUG nova.compute.manager [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.032 182939 INFO nova.virt.libvirt.driver [-] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Instance rebooted successfully.
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.032 182939 DEBUG nova.compute.manager [None req-c93e2cf0-e7c1-4788-a035-cdf08228648f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.085 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.089 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.156 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] During sync_power_state the instance has a pending task (powering-on). Skip.
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.156 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041758.0212867, c4b2800e-1798-4a1f-b4a2-e870e907eb2a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.156 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] VM Started (Lifecycle Event)
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.181 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:29:18 compute-0 nova_compute[182935]: 2026-01-22 00:29:18.184 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:29:19 compute-0 nova_compute[182935]: 2026-01-22 00:29:19.313 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:19 compute-0 nova_compute[182935]: 2026-01-22 00:29:19.947 182939 DEBUG nova.compute.manager [req-bd0c0934-6ce5-40b8-97c6-0df71d4b0185 req-af52db46-400c-49b9-bcaf-5bf65e32c090 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:29:19 compute-0 nova_compute[182935]: 2026-01-22 00:29:19.947 182939 DEBUG oslo_concurrency.lockutils [req-bd0c0934-6ce5-40b8-97c6-0df71d4b0185 req-af52db46-400c-49b9-bcaf-5bf65e32c090 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:19 compute-0 nova_compute[182935]: 2026-01-22 00:29:19.947 182939 DEBUG oslo_concurrency.lockutils [req-bd0c0934-6ce5-40b8-97c6-0df71d4b0185 req-af52db46-400c-49b9-bcaf-5bf65e32c090 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:19 compute-0 nova_compute[182935]: 2026-01-22 00:29:19.948 182939 DEBUG oslo_concurrency.lockutils [req-bd0c0934-6ce5-40b8-97c6-0df71d4b0185 req-af52db46-400c-49b9-bcaf-5bf65e32c090 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:19 compute-0 nova_compute[182935]: 2026-01-22 00:29:19.948 182939 DEBUG nova.compute.manager [req-bd0c0934-6ce5-40b8-97c6-0df71d4b0185 req-af52db46-400c-49b9-bcaf-5bf65e32c090 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] No waiting events found dispatching network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:29:19 compute-0 nova_compute[182935]: 2026-01-22 00:29:19.948 182939 WARNING nova.compute.manager [req-bd0c0934-6ce5-40b8-97c6-0df71d4b0185 req-af52db46-400c-49b9-bcaf-5bf65e32c090 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received unexpected event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d for instance with vm_state active and task_state None.
Jan 22 00:29:21 compute-0 nova_compute[182935]: 2026-01-22 00:29:21.996 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:24 compute-0 nova_compute[182935]: 2026-01-22 00:29:24.316 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:27 compute-0 nova_compute[182935]: 2026-01-22 00:29:27.033 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:29 compute-0 nova_compute[182935]: 2026-01-22 00:29:29.330 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:30 compute-0 ovn_controller[95047]: 2026-01-22T00:29:30Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:d1:68 10.100.0.3
Jan 22 00:29:32 compute-0 nova_compute[182935]: 2026-01-22 00:29:32.037 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:32 compute-0 podman[240843]: 2026-01-22 00:29:32.693667083 +0000 UTC m=+0.056571902 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:29:32 compute-0 podman[240845]: 2026-01-22 00:29:32.719965665 +0000 UTC m=+0.080505847 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:29:32 compute-0 podman[240844]: 2026-01-22 00:29:32.788906894 +0000 UTC m=+0.146434234 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Jan 22 00:29:34 compute-0 nova_compute[182935]: 2026-01-22 00:29:34.334 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:37 compute-0 nova_compute[182935]: 2026-01-22 00:29:37.040 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:37 compute-0 sshd-session[240915]: Invalid user mongodb from 188.166.69.60 port 39780
Jan 22 00:29:37 compute-0 sshd-session[240915]: Connection closed by invalid user mongodb 188.166.69.60 port 39780 [preauth]
Jan 22 00:29:38 compute-0 nova_compute[182935]: 2026-01-22 00:29:38.611 182939 INFO nova.compute.manager [None req-a35de056-5d1d-427c-b7d5-0b30db9e1f9a 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Get console output
Jan 22 00:29:38 compute-0 nova_compute[182935]: 2026-01-22 00:29:38.617 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:29:39 compute-0 nova_compute[182935]: 2026-01-22 00:29:39.337 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:40 compute-0 ovn_controller[95047]: 2026-01-22T00:29:40Z|00664|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Jan 22 00:29:40 compute-0 podman[240917]: 2026-01-22 00:29:40.673222869 +0000 UTC m=+0.049453941 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 00:29:40 compute-0 nova_compute[182935]: 2026-01-22 00:29:40.681 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:40.681 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:29:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:40.682 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.022 182939 DEBUG nova.compute.manager [req-e17c5ca9-0b5d-42a5-8ebe-21f89f3262a3 req-479ee710-a9da-44a9-9369-9e334335f701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-changed-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.022 182939 DEBUG nova.compute.manager [req-e17c5ca9-0b5d-42a5-8ebe-21f89f3262a3 req-479ee710-a9da-44a9-9369-9e334335f701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Refreshing instance network info cache due to event network-changed-06e8eaa8-d435-4dcd-af5c-959e8d49754d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.022 182939 DEBUG oslo_concurrency.lockutils [req-e17c5ca9-0b5d-42a5-8ebe-21f89f3262a3 req-479ee710-a9da-44a9-9369-9e334335f701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.023 182939 DEBUG oslo_concurrency.lockutils [req-e17c5ca9-0b5d-42a5-8ebe-21f89f3262a3 req-479ee710-a9da-44a9-9369-9e334335f701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.023 182939 DEBUG nova.network.neutron [req-e17c5ca9-0b5d-42a5-8ebe-21f89f3262a3 req-479ee710-a9da-44a9-9369-9e334335f701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Refreshing network info cache for port 06e8eaa8-d435-4dcd-af5c-959e8d49754d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.143 182939 DEBUG oslo_concurrency.lockutils [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.144 182939 DEBUG oslo_concurrency.lockutils [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.144 182939 DEBUG oslo_concurrency.lockutils [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.145 182939 DEBUG oslo_concurrency.lockutils [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.145 182939 DEBUG oslo_concurrency.lockutils [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.162 182939 INFO nova.compute.manager [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Terminating instance
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.172 182939 DEBUG nova.compute.manager [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:29:41 compute-0 kernel: tap06e8eaa8-d4 (unregistering): left promiscuous mode
Jan 22 00:29:41 compute-0 NetworkManager[55139]: <info>  [1769041781.1987] device (tap06e8eaa8-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.240 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:41 compute-0 ovn_controller[95047]: 2026-01-22T00:29:41Z|00665|binding|INFO|Releasing lport 06e8eaa8-d435-4dcd-af5c-959e8d49754d from this chassis (sb_readonly=0)
Jan 22 00:29:41 compute-0 ovn_controller[95047]: 2026-01-22T00:29:41Z|00666|binding|INFO|Setting lport 06e8eaa8-d435-4dcd-af5c-959e8d49754d down in Southbound
Jan 22 00:29:41 compute-0 ovn_controller[95047]: 2026-01-22T00:29:41Z|00667|binding|INFO|Removing iface tap06e8eaa8-d4 ovn-installed in OVS
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.252 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:d1:68 10.100.0.3'], port_security=['fa:16:3e:d3:d1:68 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c4b2800e-1798-4a1f-b4a2-e870e907eb2a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d8a99868-0c12-4b68-ad26-6e85ed918505', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c4d4a9b7-d02f-4e4d-8562-8ba78b9b38a4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bbe04a09-5fce-4283-b95b-9116ec0f414f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=06e8eaa8-d435-4dcd-af5c-959e8d49754d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.253 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.253 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 06e8eaa8-d435-4dcd-af5c-959e8d49754d in datapath d8a99868-0c12-4b68-ad26-6e85ed918505 unbound from our chassis
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.255 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d8a99868-0c12-4b68-ad26-6e85ed918505, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.256 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bfce3bd3-09e8-4cba-aec6-34c3bbda3c30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.256 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505 namespace which is not needed anymore
Jan 22 00:29:41 compute-0 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000a7.scope: Deactivated successfully.
Jan 22 00:29:41 compute-0 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000a7.scope: Consumed 14.005s CPU time.
Jan 22 00:29:41 compute-0 systemd-machined[154182]: Machine qemu-86-instance-000000a7 terminated.
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.398 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:41 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240806]: [NOTICE]   (240810) : haproxy version is 2.8.14-c23fe91
Jan 22 00:29:41 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240806]: [NOTICE]   (240810) : path to executable is /usr/sbin/haproxy
Jan 22 00:29:41 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240806]: [WARNING]  (240810) : Exiting Master process...
Jan 22 00:29:41 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240806]: [ALERT]    (240810) : Current worker (240812) exited with code 143 (Terminated)
Jan 22 00:29:41 compute-0 neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505[240806]: [WARNING]  (240810) : All workers exited. Exiting... (0)
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.403 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:41 compute-0 systemd[1]: libpod-0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505.scope: Deactivated successfully.
Jan 22 00:29:41 compute-0 podman[240962]: 2026-01-22 00:29:41.411755292 +0000 UTC m=+0.065128297 container died 0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.436 182939 INFO nova.virt.libvirt.driver [-] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Instance destroyed successfully.
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.437 182939 DEBUG nova.objects.instance [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid c4b2800e-1798-4a1f-b4a2-e870e907eb2a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:29:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505-userdata-shm.mount: Deactivated successfully.
Jan 22 00:29:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-f0731ec8c4ba94dec740f8ddf634c83489a8a28a01225112491b78404b046b58-merged.mount: Deactivated successfully.
Jan 22 00:29:41 compute-0 podman[240962]: 2026-01-22 00:29:41.448740432 +0000 UTC m=+0.102113437 container cleanup 0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.455 182939 DEBUG nova.virt.libvirt.vif [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:28:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-380645357',display_name='tempest-TestNetworkAdvancedServerOps-server-380645357',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-380645357',id=167,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH7XBLZCDFN1SjQd5pdAiqzITq+BHor59G0Z0hkCJ16rGPV5ckUS2yxojxpRQnEFr5g70UZgNl62OwdVVuKhMenMnTtj7l+/b1lrTkAtU0xv67CvUH7GWdEje0B1GYX0Bg==',key_name='tempest-TestNetworkAdvancedServerOps-346227063',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:28:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-4t40upso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:29:18Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=c4b2800e-1798-4a1f-b4a2-e870e907eb2a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.456 182939 DEBUG nova.network.os_vif_util [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.457 182939 DEBUG nova.network.os_vif_util [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.457 182939 DEBUG os_vif [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.458 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.458 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e8eaa8-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.460 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.463 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:29:41 compute-0 systemd[1]: libpod-conmon-0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505.scope: Deactivated successfully.
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.466 182939 INFO os_vif [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:d1:68,bridge_name='br-int',has_traffic_filtering=True,id=06e8eaa8-d435-4dcd-af5c-959e8d49754d,network=Network(d8a99868-0c12-4b68-ad26-6e85ed918505),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e8eaa8-d4')
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.466 182939 INFO nova.virt.libvirt.driver [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Deleting instance files /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a_del
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.467 182939 INFO nova.virt.libvirt.driver [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Deletion of /var/lib/nova/instances/c4b2800e-1798-4a1f-b4a2-e870e907eb2a_del complete
Jan 22 00:29:41 compute-0 podman[241008]: 2026-01-22 00:29:41.516764458 +0000 UTC m=+0.040765001 container remove 0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.521 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cb73388a-e738-47e0-ad43-f6f32a82f292]: (4, ('Thu Jan 22 12:29:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505 (0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505)\n0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505\nThu Jan 22 12:29:41 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505 (0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505)\n0a62b088cdefc85928c3ad702c9a085a1d39ec0a348294c301eeef6a471d7505\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.523 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0be18118-18de-4bfe-96ff-fec58dfe1b35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.524 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd8a99868-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.525 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:41 compute-0 kernel: tapd8a99868-00: left promiscuous mode
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.538 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.541 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3202aa76-df60-4756-902b-714e05825600]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.548 182939 DEBUG nova.compute.manager [req-e7e657a2-178c-4444-8f86-340824a9eea8 req-ca1dbb63-1b84-41fc-861d-d4b09c0954ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-vif-unplugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.549 182939 DEBUG oslo_concurrency.lockutils [req-e7e657a2-178c-4444-8f86-340824a9eea8 req-ca1dbb63-1b84-41fc-861d-d4b09c0954ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.549 182939 DEBUG oslo_concurrency.lockutils [req-e7e657a2-178c-4444-8f86-340824a9eea8 req-ca1dbb63-1b84-41fc-861d-d4b09c0954ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.549 182939 DEBUG oslo_concurrency.lockutils [req-e7e657a2-178c-4444-8f86-340824a9eea8 req-ca1dbb63-1b84-41fc-861d-d4b09c0954ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.550 182939 DEBUG nova.compute.manager [req-e7e657a2-178c-4444-8f86-340824a9eea8 req-ca1dbb63-1b84-41fc-861d-d4b09c0954ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] No waiting events found dispatching network-vif-unplugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.551 182939 DEBUG nova.compute.manager [req-e7e657a2-178c-4444-8f86-340824a9eea8 req-ca1dbb63-1b84-41fc-861d-d4b09c0954ba 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-vif-unplugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.556 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[555e62c9-f5ec-42b7-822d-1383f13bb221]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.557 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[12b58fd6-05ca-404a-a095-899d35b755b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.565 182939 INFO nova.compute.manager [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Took 0.39 seconds to destroy the instance on the hypervisor.
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.566 182939 DEBUG oslo.service.loopingcall [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.566 182939 DEBUG nova.compute.manager [-] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:29:41 compute-0 nova_compute[182935]: 2026-01-22 00:29:41.566 182939 DEBUG nova.network.neutron [-] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.574 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[adebe1ef-803c-4f70-aa47-d8c8dd60252c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 625993, 'reachable_time': 23065, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241023, 'error': None, 'target': 'ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:41 compute-0 systemd[1]: run-netns-ovnmeta\x2dd8a99868\x2d0c12\x2d4b68\x2dad26\x2d6e85ed918505.mount: Deactivated successfully.
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.578 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d8a99868-0c12-4b68-ad26-6e85ed918505 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.578 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe93293-6374-4cc7-8e36-cb916cb8173f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:29:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:29:41.684 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:29:42 compute-0 nova_compute[182935]: 2026-01-22 00:29:42.858 182939 DEBUG nova.network.neutron [req-e17c5ca9-0b5d-42a5-8ebe-21f89f3262a3 req-479ee710-a9da-44a9-9369-9e334335f701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Updated VIF entry in instance network info cache for port 06e8eaa8-d435-4dcd-af5c-959e8d49754d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:29:42 compute-0 nova_compute[182935]: 2026-01-22 00:29:42.859 182939 DEBUG nova.network.neutron [req-e17c5ca9-0b5d-42a5-8ebe-21f89f3262a3 req-479ee710-a9da-44a9-9369-9e334335f701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Updating instance_info_cache with network_info: [{"id": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "address": "fa:16:3e:d3:d1:68", "network": {"id": "d8a99868-0c12-4b68-ad26-6e85ed918505", "bridge": "br-int", "label": "tempest-network-smoke--924731151", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e8eaa8-d4", "ovs_interfaceid": "06e8eaa8-d435-4dcd-af5c-959e8d49754d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:29:42 compute-0 nova_compute[182935]: 2026-01-22 00:29:42.888 182939 DEBUG oslo_concurrency.lockutils [req-e17c5ca9-0b5d-42a5-8ebe-21f89f3262a3 req-479ee710-a9da-44a9-9369-9e334335f701 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-c4b2800e-1798-4a1f-b4a2-e870e907eb2a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.099 182939 DEBUG nova.network.neutron [-] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.125 182939 INFO nova.compute.manager [-] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Took 1.56 seconds to deallocate network for instance.
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.200 182939 DEBUG nova.compute.manager [req-6de8fce0-e8ff-4cee-a844-83ba85416907 req-5b4e7be4-9d89-4d96-8abe-d42908058547 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-vif-deleted-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.210 182939 DEBUG oslo_concurrency.lockutils [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.211 182939 DEBUG oslo_concurrency.lockutils [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.282 182939 DEBUG nova.compute.provider_tree [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.300 182939 DEBUG nova.scheduler.client.report [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.326 182939 DEBUG oslo_concurrency.lockutils [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.347 182939 INFO nova.scheduler.client.report [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Deleted allocations for instance c4b2800e-1798-4a1f-b4a2-e870e907eb2a
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.434 182939 DEBUG oslo_concurrency.lockutils [None req-90a544d4-4e30-4499-a846-aa35ec564737 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.629 182939 DEBUG nova.compute.manager [req-14d22466-be39-4f1b-96f3-183b0c015b4a req-a4736664-c561-42bb-9f5c-5dea7c0dbb61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.629 182939 DEBUG oslo_concurrency.lockutils [req-14d22466-be39-4f1b-96f3-183b0c015b4a req-a4736664-c561-42bb-9f5c-5dea7c0dbb61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.630 182939 DEBUG oslo_concurrency.lockutils [req-14d22466-be39-4f1b-96f3-183b0c015b4a req-a4736664-c561-42bb-9f5c-5dea7c0dbb61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.630 182939 DEBUG oslo_concurrency.lockutils [req-14d22466-be39-4f1b-96f3-183b0c015b4a req-a4736664-c561-42bb-9f5c-5dea7c0dbb61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "c4b2800e-1798-4a1f-b4a2-e870e907eb2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.630 182939 DEBUG nova.compute.manager [req-14d22466-be39-4f1b-96f3-183b0c015b4a req-a4736664-c561-42bb-9f5c-5dea7c0dbb61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] No waiting events found dispatching network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:29:43 compute-0 nova_compute[182935]: 2026-01-22 00:29:43.630 182939 WARNING nova.compute.manager [req-14d22466-be39-4f1b-96f3-183b0c015b4a req-a4736664-c561-42bb-9f5c-5dea7c0dbb61 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Received unexpected event network-vif-plugged-06e8eaa8-d435-4dcd-af5c-959e8d49754d for instance with vm_state deleted and task_state None.
Jan 22 00:29:44 compute-0 nova_compute[182935]: 2026-01-22 00:29:44.339 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:46 compute-0 nova_compute[182935]: 2026-01-22 00:29:46.462 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:46 compute-0 podman[241024]: 2026-01-22 00:29:46.694196703 +0000 UTC m=+0.066532772 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Jan 22 00:29:46 compute-0 podman[241025]: 2026-01-22 00:29:46.696013096 +0000 UTC m=+0.065582628 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:29:47 compute-0 nova_compute[182935]: 2026-01-22 00:29:47.062 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:47 compute-0 nova_compute[182935]: 2026-01-22 00:29:47.141 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:49 compute-0 nova_compute[182935]: 2026-01-22 00:29:49.340 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:51 compute-0 nova_compute[182935]: 2026-01-22 00:29:51.464 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:51 compute-0 nova_compute[182935]: 2026-01-22 00:29:51.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:51 compute-0 nova_compute[182935]: 2026-01-22 00:29:51.814 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:51 compute-0 nova_compute[182935]: 2026-01-22 00:29:51.815 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:51 compute-0 nova_compute[182935]: 2026-01-22 00:29:51.815 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:51 compute-0 nova_compute[182935]: 2026-01-22 00:29:51.815 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:29:51 compute-0 nova_compute[182935]: 2026-01-22 00:29:51.967 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:29:51 compute-0 nova_compute[182935]: 2026-01-22 00:29:51.968 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5720MB free_disk=73.11896896362305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:29:51 compute-0 nova_compute[182935]: 2026-01-22 00:29:51.968 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:29:51 compute-0 nova_compute[182935]: 2026-01-22 00:29:51.968 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:29:52 compute-0 nova_compute[182935]: 2026-01-22 00:29:52.072 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:29:52 compute-0 nova_compute[182935]: 2026-01-22 00:29:52.073 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:29:52 compute-0 nova_compute[182935]: 2026-01-22 00:29:52.155 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:29:52 compute-0 nova_compute[182935]: 2026-01-22 00:29:52.190 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:29:52 compute-0 nova_compute[182935]: 2026-01-22 00:29:52.219 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:29:52 compute-0 nova_compute[182935]: 2026-01-22 00:29:52.219 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:29:54 compute-0 nova_compute[182935]: 2026-01-22 00:29:54.341 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:56 compute-0 nova_compute[182935]: 2026-01-22 00:29:56.220 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:56 compute-0 nova_compute[182935]: 2026-01-22 00:29:56.221 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:29:56 compute-0 nova_compute[182935]: 2026-01-22 00:29:56.434 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041781.433183, c4b2800e-1798-4a1f-b4a2-e870e907eb2a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:29:56 compute-0 nova_compute[182935]: 2026-01-22 00:29:56.435 182939 INFO nova.compute.manager [-] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] VM Stopped (Lifecycle Event)
Jan 22 00:29:56 compute-0 nova_compute[182935]: 2026-01-22 00:29:56.507 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:29:56 compute-0 nova_compute[182935]: 2026-01-22 00:29:56.649 182939 DEBUG nova.compute.manager [None req-2e7b780a-9a6b-438e-90be-544e45a76181 - - - - - -] [instance: c4b2800e-1798-4a1f-b4a2-e870e907eb2a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:29:56 compute-0 nova_compute[182935]: 2026-01-22 00:29:56.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:56 compute-0 nova_compute[182935]: 2026-01-22 00:29:56.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:29:56 compute-0 nova_compute[182935]: 2026-01-22 00:29:56.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:29:57 compute-0 nova_compute[182935]: 2026-01-22 00:29:57.254 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:29:58 compute-0 nova_compute[182935]: 2026-01-22 00:29:58.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:29:59 compute-0 nova_compute[182935]: 2026-01-22 00:29:59.343 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:01 compute-0 nova_compute[182935]: 2026-01-22 00:30:01.548 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:01 compute-0 nova_compute[182935]: 2026-01-22 00:30:01.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:02 compute-0 nova_compute[182935]: 2026-01-22 00:30:02.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:02 compute-0 podman[241066]: 2026-01-22 00:30:02.872961957 +0000 UTC m=+0.050177348 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:30:02 compute-0 podman[241068]: 2026-01-22 00:30:02.903760538 +0000 UTC m=+0.077139367 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:30:02 compute-0 podman[241067]: 2026-01-22 00:30:02.906678158 +0000 UTC m=+0.081038830 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:30:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:03.226 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:03.226 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:03.226 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:04 compute-0 nova_compute[182935]: 2026-01-22 00:30:04.345 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:05 compute-0 nova_compute[182935]: 2026-01-22 00:30:05.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:06 compute-0 nova_compute[182935]: 2026-01-22 00:30:06.551 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:07 compute-0 nova_compute[182935]: 2026-01-22 00:30:07.787 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:08 compute-0 nova_compute[182935]: 2026-01-22 00:30:08.737 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:08 compute-0 nova_compute[182935]: 2026-01-22 00:30:08.738 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:08 compute-0 nova_compute[182935]: 2026-01-22 00:30:08.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:08 compute-0 nova_compute[182935]: 2026-01-22 00:30:08.887 182939 DEBUG nova.compute.manager [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.042 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.042 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.049 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.049 182939 INFO nova.compute.claims [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.347 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.363 182939 DEBUG nova.compute.provider_tree [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.385 182939 DEBUG nova.scheduler.client.report [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.803 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.804 182939 DEBUG nova.compute.manager [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.943 182939 DEBUG nova.compute.manager [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.943 182939 DEBUG nova.network.neutron [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.959 182939 INFO nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:30:09 compute-0 nova_compute[182935]: 2026-01-22 00:30:09.976 182939 DEBUG nova.compute.manager [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.085 182939 DEBUG nova.compute.manager [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.086 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.086 182939 INFO nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Creating image(s)
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.087 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "/var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.087 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.088 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "/var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.098 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.175 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.177 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.177 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.189 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.245 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.246 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.279 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.280 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.281 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.336 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.337 182939 DEBUG nova.virt.disk.api [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Checking if we can resize image /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.337 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.394 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.395 182939 DEBUG nova.virt.disk.api [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Cannot resize image /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.396 182939 DEBUG nova.objects.instance [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'migration_context' on Instance uuid f6ca3701-93a3-4e03-999f-1bdb7a95d933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.493 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.494 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Ensure instance console log exists: /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.494 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.495 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.495 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.714 182939 DEBUG nova.policy [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:30:10 compute-0 nova_compute[182935]: 2026-01-22 00:30:10.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:11 compute-0 nova_compute[182935]: 2026-01-22 00:30:11.553 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:11 compute-0 podman[241154]: 2026-01-22 00:30:11.713557373 +0000 UTC m=+0.083117190 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 00:30:12 compute-0 nova_compute[182935]: 2026-01-22 00:30:12.348 182939 DEBUG nova.network.neutron [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Successfully created port: 195536c8-52ab-4d51-b46e-5095d097728e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:30:13 compute-0 nova_compute[182935]: 2026-01-22 00:30:13.071 182939 DEBUG nova.network.neutron [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Successfully updated port: 195536c8-52ab-4d51-b46e-5095d097728e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:30:13 compute-0 nova_compute[182935]: 2026-01-22 00:30:13.097 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:30:13 compute-0 nova_compute[182935]: 2026-01-22 00:30:13.097 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:30:13 compute-0 nova_compute[182935]: 2026-01-22 00:30:13.098 182939 DEBUG nova.network.neutron [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:30:13 compute-0 nova_compute[182935]: 2026-01-22 00:30:13.198 182939 DEBUG nova.compute.manager [req-ef9d2fbd-18ad-4b2d-8da9-743adcea3302 req-d501c86a-74e3-476d-aeb4-ee77f07daf50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-changed-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:13 compute-0 nova_compute[182935]: 2026-01-22 00:30:13.199 182939 DEBUG nova.compute.manager [req-ef9d2fbd-18ad-4b2d-8da9-743adcea3302 req-d501c86a-74e3-476d-aeb4-ee77f07daf50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Refreshing instance network info cache due to event network-changed-195536c8-52ab-4d51-b46e-5095d097728e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:30:13 compute-0 nova_compute[182935]: 2026-01-22 00:30:13.199 182939 DEBUG oslo_concurrency.lockutils [req-ef9d2fbd-18ad-4b2d-8da9-743adcea3302 req-d501c86a-74e3-476d-aeb4-ee77f07daf50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:30:13 compute-0 nova_compute[182935]: 2026-01-22 00:30:13.686 182939 DEBUG nova.network.neutron [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.348 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.685 182939 DEBUG nova.network.neutron [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Updating instance_info_cache with network_info: [{"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.706 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.707 182939 DEBUG nova.compute.manager [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Instance network_info: |[{"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.707 182939 DEBUG oslo_concurrency.lockutils [req-ef9d2fbd-18ad-4b2d-8da9-743adcea3302 req-d501c86a-74e3-476d-aeb4-ee77f07daf50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.707 182939 DEBUG nova.network.neutron [req-ef9d2fbd-18ad-4b2d-8da9-743adcea3302 req-d501c86a-74e3-476d-aeb4-ee77f07daf50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Refreshing network info cache for port 195536c8-52ab-4d51-b46e-5095d097728e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.709 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Start _get_guest_xml network_info=[{"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.714 182939 WARNING nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.718 182939 DEBUG nova.virt.libvirt.host [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.718 182939 DEBUG nova.virt.libvirt.host [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.724 182939 DEBUG nova.virt.libvirt.host [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.725 182939 DEBUG nova.virt.libvirt.host [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.726 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.726 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.726 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.726 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.727 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.727 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.727 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.727 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.727 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.727 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.727 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.728 182939 DEBUG nova.virt.hardware [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.731 182939 DEBUG nova.virt.libvirt.vif [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:30:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1402697012',display_name='tempest-TestNetworkAdvancedServerOps-server-1402697012',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1402697012',id=168,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBSw81OruvzkVRarS+CU3t8/T+uR28raj1bWORjbeWaVlQSCjmt5S8ac1rdgwBP3Lkgrq7bMpc6sdfr/rqNs5mLih3Et/pOLQTSFRciDH7qDEioqcWvyyJnrprDf9EeD3Q==',key_name='tempest-TestNetworkAdvancedServerOps-858203030',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-fz0brlil',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:30:10Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=f6ca3701-93a3-4e03-999f-1bdb7a95d933,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.732 182939 DEBUG nova.network.os_vif_util [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.732 182939 DEBUG nova.network.os_vif_util [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:c5:d4,bridge_name='br-int',has_traffic_filtering=True,id=195536c8-52ab-4d51-b46e-5095d097728e,network=Network(9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195536c8-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.733 182939 DEBUG nova.objects.instance [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid f6ca3701-93a3-4e03-999f-1bdb7a95d933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.747 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:30:14 compute-0 nova_compute[182935]:   <uuid>f6ca3701-93a3-4e03-999f-1bdb7a95d933</uuid>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   <name>instance-000000a8</name>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1402697012</nova:name>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:30:14</nova:creationTime>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:30:14 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:30:14 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:30:14 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:30:14 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:30:14 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:30:14 compute-0 nova_compute[182935]:         <nova:user uuid="635cc2f351c344dc8e2b1264080dbafb">tempest-TestNetworkAdvancedServerOps-587955072-project-member</nova:user>
Jan 22 00:30:14 compute-0 nova_compute[182935]:         <nova:project uuid="adb1305c8f874f2684e845e88fd95ffe">tempest-TestNetworkAdvancedServerOps-587955072</nova:project>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:30:14 compute-0 nova_compute[182935]:         <nova:port uuid="195536c8-52ab-4d51-b46e-5095d097728e">
Jan 22 00:30:14 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <system>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <entry name="serial">f6ca3701-93a3-4e03-999f-1bdb7a95d933</entry>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <entry name="uuid">f6ca3701-93a3-4e03-999f-1bdb7a95d933</entry>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     </system>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   <os>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   </os>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   <features>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   </features>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.config"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:3a:c5:d4"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <target dev="tap195536c8-52"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/console.log" append="off"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <video>
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     </video>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:30:14 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:30:14 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:30:14 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:30:14 compute-0 nova_compute[182935]: </domain>
Jan 22 00:30:14 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.748 182939 DEBUG nova.compute.manager [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Preparing to wait for external event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.748 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.748 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.748 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.748 182939 DEBUG nova.virt.libvirt.vif [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:30:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1402697012',display_name='tempest-TestNetworkAdvancedServerOps-server-1402697012',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1402697012',id=168,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBSw81OruvzkVRarS+CU3t8/T+uR28raj1bWORjbeWaVlQSCjmt5S8ac1rdgwBP3Lkgrq7bMpc6sdfr/rqNs5mLih3Et/pOLQTSFRciDH7qDEioqcWvyyJnrprDf9EeD3Q==',key_name='tempest-TestNetworkAdvancedServerOps-858203030',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-fz0brlil',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:30:10Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=f6ca3701-93a3-4e03-999f-1bdb7a95d933,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.749 182939 DEBUG nova.network.os_vif_util [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.749 182939 DEBUG nova.network.os_vif_util [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:c5:d4,bridge_name='br-int',has_traffic_filtering=True,id=195536c8-52ab-4d51-b46e-5095d097728e,network=Network(9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195536c8-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.749 182939 DEBUG os_vif [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:c5:d4,bridge_name='br-int',has_traffic_filtering=True,id=195536c8-52ab-4d51-b46e-5095d097728e,network=Network(9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195536c8-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.750 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.750 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.750 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.753 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.753 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap195536c8-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.754 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap195536c8-52, col_values=(('external_ids', {'iface-id': '195536c8-52ab-4d51-b46e-5095d097728e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3a:c5:d4', 'vm-uuid': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:14 compute-0 NetworkManager[55139]: <info>  [1769041814.7561] manager: (tap195536c8-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.758 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.760 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.762 182939 INFO os_vif [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:c5:d4,bridge_name='br-int',has_traffic_filtering=True,id=195536c8-52ab-4d51-b46e-5095d097728e,network=Network(9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195536c8-52')
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.811 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.812 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.812 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] No VIF found with MAC fa:16:3e:3a:c5:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:30:14 compute-0 nova_compute[182935]: 2026-01-22 00:30:14.813 182939 INFO nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Using config drive
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.127 182939 INFO nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Creating config drive at /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.config
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.132 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2tl1e2z3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.261 182939 DEBUG oslo_concurrency.processutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2tl1e2z3" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:30:15 compute-0 kernel: tap195536c8-52: entered promiscuous mode
Jan 22 00:30:15 compute-0 NetworkManager[55139]: <info>  [1769041815.3288] manager: (tap195536c8-52): new Tun device (/org/freedesktop/NetworkManager/Devices/327)
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.331 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:15 compute-0 ovn_controller[95047]: 2026-01-22T00:30:15Z|00668|binding|INFO|Claiming lport 195536c8-52ab-4d51-b46e-5095d097728e for this chassis.
Jan 22 00:30:15 compute-0 ovn_controller[95047]: 2026-01-22T00:30:15Z|00669|binding|INFO|195536c8-52ab-4d51-b46e-5095d097728e: Claiming fa:16:3e:3a:c5:d4 10.100.0.5
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.336 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:15 compute-0 systemd-udevd[241189]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:30:15 compute-0 systemd-machined[154182]: New machine qemu-87-instance-000000a8.
Jan 22 00:30:15 compute-0 NetworkManager[55139]: <info>  [1769041815.3700] device (tap195536c8-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:30:15 compute-0 NetworkManager[55139]: <info>  [1769041815.3713] device (tap195536c8-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.387 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:15 compute-0 ovn_controller[95047]: 2026-01-22T00:30:15Z|00670|binding|INFO|Setting lport 195536c8-52ab-4d51-b46e-5095d097728e ovn-installed in OVS
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.393 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:15 compute-0 systemd[1]: Started Virtual Machine qemu-87-instance-000000a8.
Jan 22 00:30:15 compute-0 ovn_controller[95047]: 2026-01-22T00:30:15Z|00671|binding|INFO|Setting lport 195536c8-52ab-4d51-b46e-5095d097728e up in Southbound
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.413 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:c5:d4 10.100.0.5'], port_security=['fa:16:3e:3a:c5:d4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '82d1c837-3db3-422e-ad4c-90f973791238', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c12d587-6450-4baa-9602-df652658847f, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=195536c8-52ab-4d51-b46e-5095d097728e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.414 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 195536c8-52ab-4d51-b46e-5095d097728e in datapath 9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 bound to our chassis
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.415 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.426 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7688da82-dd3f-4777-97bc-f2d00b2e4e7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.427 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9bb9bb99-81 in ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.429 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9bb9bb99-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.429 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6998784a-f865-40c5-8e55-cfc122bde648]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.430 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[daa54ba0-5a26-4b74-a0e9-337fee675e1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.441 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d968d2-ba22-41c1-98ec-c2f981e001a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.466 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7a268aa3-7e72-48a1-89b4-28272e787b1d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.493 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[cca1c329-eac5-4dd1-83e4-c34af58296ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.499 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8987fb45-901e-482e-9064-34006413b863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 NetworkManager[55139]: <info>  [1769041815.5004] manager: (tap9bb9bb99-80): new Veth device (/org/freedesktop/NetworkManager/Devices/328)
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.529 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b0930e56-b0bd-46fd-b122-bfa236630e73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.533 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[ea9e72f1-23fb-4ec1-9169-19aaaeec0634]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 NetworkManager[55139]: <info>  [1769041815.5534] device (tap9bb9bb99-80): carrier: link connected
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.558 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[4af5ba02-53eb-4d00-9e05-d0ba6814aba8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.576 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4e2cec94-be25-4240-b572-81c63b3d97b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bb9bb99-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:1f:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 631829, 'reachable_time': 23116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241231, 'error': None, 'target': 'ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.589 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2ecc238a-a31e-4be1-9664-105346eab76b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedb:1f5e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 631829, 'tstamp': 631829}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241232, 'error': None, 'target': 'ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.606 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d7691771-12ef-4fc6-8580-ce0e7f3ef608]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bb9bb99-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:1f:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 631829, 'reachable_time': 23116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241234, 'error': None, 'target': 'ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.625 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041815.6248257, f6ca3701-93a3-4e03-999f-1bdb7a95d933 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.625 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] VM Started (Lifecycle Event)
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.632 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8f223646-7c54-496c-b4f4-8aa508a03e6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.667 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.670 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041815.6249974, f6ca3701-93a3-4e03-999f-1bdb7a95d933 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.670 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] VM Paused (Lifecycle Event)
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.680 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[539a5469-f43e-452f-84e2-e25672196fe7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.681 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bb9bb99-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.682 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.682 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bb9bb99-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:15 compute-0 kernel: tap9bb9bb99-80: entered promiscuous mode
Jan 22 00:30:15 compute-0 NetworkManager[55139]: <info>  [1769041815.6846] manager: (tap9bb9bb99-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/329)
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.684 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.687 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9bb9bb99-80, col_values=(('external_ids', {'iface-id': 'fc9607ca-e5d3-4a10-ba93-cc16b1df0b66'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.688 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:15 compute-0 ovn_controller[95047]: 2026-01-22T00:30:15Z|00672|binding|INFO|Releasing lport fc9607ca-e5d3-4a10-ba93-cc16b1df0b66 from this chassis (sb_readonly=0)
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.689 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.689 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.690 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f3191bc1-65b1-4d38-a2b5-4958d0c2dfea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.691 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58.pid.haproxy
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:30:15 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:15.691 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'env', 'PROCESS_TAG=haproxy-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.699 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.719 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.723 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.838 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.976 182939 DEBUG nova.compute.manager [req-bac15807-e525-404e-bdeb-3748c33ed587 req-1a85f433-e7f0-44a9-adc8-f910bba6ccda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.977 182939 DEBUG oslo_concurrency.lockutils [req-bac15807-e525-404e-bdeb-3748c33ed587 req-1a85f433-e7f0-44a9-adc8-f910bba6ccda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.978 182939 DEBUG oslo_concurrency.lockutils [req-bac15807-e525-404e-bdeb-3748c33ed587 req-1a85f433-e7f0-44a9-adc8-f910bba6ccda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.978 182939 DEBUG oslo_concurrency.lockutils [req-bac15807-e525-404e-bdeb-3748c33ed587 req-1a85f433-e7f0-44a9-adc8-f910bba6ccda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.979 182939 DEBUG nova.compute.manager [req-bac15807-e525-404e-bdeb-3748c33ed587 req-1a85f433-e7f0-44a9-adc8-f910bba6ccda 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Processing event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.980 182939 DEBUG nova.compute.manager [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.984 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041815.9838514, f6ca3701-93a3-4e03-999f-1bdb7a95d933 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.984 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] VM Resumed (Lifecycle Event)
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.986 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.988 182939 INFO nova.virt.libvirt.driver [-] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Instance spawned successfully.
Jan 22 00:30:15 compute-0 nova_compute[182935]: 2026-01-22 00:30:15.989 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.014 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.019 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.021 182939 DEBUG nova.network.neutron [req-ef9d2fbd-18ad-4b2d-8da9-743adcea3302 req-d501c86a-74e3-476d-aeb4-ee77f07daf50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Updated VIF entry in instance network info cache for port 195536c8-52ab-4d51-b46e-5095d097728e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.022 182939 DEBUG nova.network.neutron [req-ef9d2fbd-18ad-4b2d-8da9-743adcea3302 req-d501c86a-74e3-476d-aeb4-ee77f07daf50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Updating instance_info_cache with network_info: [{"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.024 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.024 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.025 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.025 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.025 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.026 182939 DEBUG nova.virt.libvirt.driver [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:30:16 compute-0 podman[241266]: 2026-01-22 00:30:16.026792832 +0000 UTC m=+0.046171552 container create 12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.059 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.059 182939 DEBUG oslo_concurrency.lockutils [req-ef9d2fbd-18ad-4b2d-8da9-743adcea3302 req-d501c86a-74e3-476d-aeb4-ee77f07daf50 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:30:16 compute-0 systemd[1]: Started libpod-conmon-12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1.scope.
Jan 22 00:30:16 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:30:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4ef95dab02a87cd840301cc68231d83cbb2a88c23cc790b7afc70f07e9c06be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:30:16 compute-0 podman[241266]: 2026-01-22 00:30:16.002085867 +0000 UTC m=+0.021464607 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:30:16 compute-0 podman[241266]: 2026-01-22 00:30:16.104442069 +0000 UTC m=+0.123820979 container init 12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.109 182939 INFO nova.compute.manager [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Took 6.02 seconds to spawn the instance on the hypervisor.
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.110 182939 DEBUG nova.compute.manager [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:30:16 compute-0 podman[241266]: 2026-01-22 00:30:16.111877928 +0000 UTC m=+0.131256648 container start 12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:30:16 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241281]: [NOTICE]   (241285) : New worker (241287) forked
Jan 22 00:30:16 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241281]: [NOTICE]   (241285) : Loading success.
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.191 182939 INFO nova.compute.manager [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Took 7.19 seconds to build instance.
Jan 22 00:30:16 compute-0 nova_compute[182935]: 2026-01-22 00:30:16.211 182939 DEBUG oslo_concurrency.lockutils [None req-2a79738f-e424-4c7d-92f5-572234a8e34c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:17 compute-0 podman[241296]: 2026-01-22 00:30:17.679111866 +0000 UTC m=+0.055710931 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:30:17 compute-0 podman[241297]: 2026-01-22 00:30:17.716069335 +0000 UTC m=+0.090245642 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 00:30:18 compute-0 nova_compute[182935]: 2026-01-22 00:30:18.065 182939 DEBUG nova.compute.manager [req-dfad2068-941c-408a-b3ba-a066422ce855 req-fec22a27-e57d-4ca4-85b9-5a99cf3ffe69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:18 compute-0 nova_compute[182935]: 2026-01-22 00:30:18.066 182939 DEBUG oslo_concurrency.lockutils [req-dfad2068-941c-408a-b3ba-a066422ce855 req-fec22a27-e57d-4ca4-85b9-5a99cf3ffe69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:18 compute-0 nova_compute[182935]: 2026-01-22 00:30:18.066 182939 DEBUG oslo_concurrency.lockutils [req-dfad2068-941c-408a-b3ba-a066422ce855 req-fec22a27-e57d-4ca4-85b9-5a99cf3ffe69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:18 compute-0 nova_compute[182935]: 2026-01-22 00:30:18.067 182939 DEBUG oslo_concurrency.lockutils [req-dfad2068-941c-408a-b3ba-a066422ce855 req-fec22a27-e57d-4ca4-85b9-5a99cf3ffe69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:18 compute-0 nova_compute[182935]: 2026-01-22 00:30:18.067 182939 DEBUG nova.compute.manager [req-dfad2068-941c-408a-b3ba-a066422ce855 req-fec22a27-e57d-4ca4-85b9-5a99cf3ffe69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] No waiting events found dispatching network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:30:18 compute-0 nova_compute[182935]: 2026-01-22 00:30:18.067 182939 WARNING nova.compute.manager [req-dfad2068-941c-408a-b3ba-a066422ce855 req-fec22a27-e57d-4ca4-85b9-5a99cf3ffe69 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received unexpected event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e for instance with vm_state active and task_state None.
Jan 22 00:30:19 compute-0 nova_compute[182935]: 2026-01-22 00:30:19.350 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:19 compute-0 nova_compute[182935]: 2026-01-22 00:30:19.790 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:21 compute-0 ovn_controller[95047]: 2026-01-22T00:30:21Z|00673|binding|INFO|Releasing lport fc9607ca-e5d3-4a10-ba93-cc16b1df0b66 from this chassis (sb_readonly=0)
Jan 22 00:30:21 compute-0 NetworkManager[55139]: <info>  [1769041821.2924] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/330)
Jan 22 00:30:21 compute-0 NetworkManager[55139]: <info>  [1769041821.2940] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Jan 22 00:30:21 compute-0 nova_compute[182935]: 2026-01-22 00:30:21.307 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:21 compute-0 ovn_controller[95047]: 2026-01-22T00:30:21Z|00674|binding|INFO|Releasing lport fc9607ca-e5d3-4a10-ba93-cc16b1df0b66 from this chassis (sb_readonly=0)
Jan 22 00:30:21 compute-0 nova_compute[182935]: 2026-01-22 00:30:21.321 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:21 compute-0 nova_compute[182935]: 2026-01-22 00:30:21.325 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:21 compute-0 sshd-session[241334]: Invalid user apache from 188.166.69.60 port 52348
Jan 22 00:30:21 compute-0 sshd-session[241334]: Connection closed by invalid user apache 188.166.69.60 port 52348 [preauth]
Jan 22 00:30:21 compute-0 nova_compute[182935]: 2026-01-22 00:30:21.933 182939 DEBUG nova.compute.manager [req-d0d28266-b3ef-44e0-9da4-4d3b3dd2f1bf req-d9975ff8-5d22-4d18-ad41-e68d6d5e70e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-changed-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:21 compute-0 nova_compute[182935]: 2026-01-22 00:30:21.934 182939 DEBUG nova.compute.manager [req-d0d28266-b3ef-44e0-9da4-4d3b3dd2f1bf req-d9975ff8-5d22-4d18-ad41-e68d6d5e70e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Refreshing instance network info cache due to event network-changed-195536c8-52ab-4d51-b46e-5095d097728e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:30:21 compute-0 nova_compute[182935]: 2026-01-22 00:30:21.934 182939 DEBUG oslo_concurrency.lockutils [req-d0d28266-b3ef-44e0-9da4-4d3b3dd2f1bf req-d9975ff8-5d22-4d18-ad41-e68d6d5e70e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:30:21 compute-0 nova_compute[182935]: 2026-01-22 00:30:21.934 182939 DEBUG oslo_concurrency.lockutils [req-d0d28266-b3ef-44e0-9da4-4d3b3dd2f1bf req-d9975ff8-5d22-4d18-ad41-e68d6d5e70e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:30:21 compute-0 nova_compute[182935]: 2026-01-22 00:30:21.934 182939 DEBUG nova.network.neutron [req-d0d28266-b3ef-44e0-9da4-4d3b3dd2f1bf req-d9975ff8-5d22-4d18-ad41-e68d6d5e70e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Refreshing network info cache for port 195536c8-52ab-4d51-b46e-5095d097728e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:30:23 compute-0 nova_compute[182935]: 2026-01-22 00:30:23.220 182939 DEBUG nova.network.neutron [req-d0d28266-b3ef-44e0-9da4-4d3b3dd2f1bf req-d9975ff8-5d22-4d18-ad41-e68d6d5e70e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Updated VIF entry in instance network info cache for port 195536c8-52ab-4d51-b46e-5095d097728e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:30:23 compute-0 nova_compute[182935]: 2026-01-22 00:30:23.221 182939 DEBUG nova.network.neutron [req-d0d28266-b3ef-44e0-9da4-4d3b3dd2f1bf req-d9975ff8-5d22-4d18-ad41-e68d6d5e70e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Updating instance_info_cache with network_info: [{"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:30:23 compute-0 nova_compute[182935]: 2026-01-22 00:30:23.241 182939 DEBUG oslo_concurrency.lockutils [req-d0d28266-b3ef-44e0-9da4-4d3b3dd2f1bf req-d9975ff8-5d22-4d18-ad41-e68d6d5e70e3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.322 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a8', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'adb1305c8f874f2684e845e88fd95ffe', 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'hostId': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.323 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.352 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.read.latency volume: 104515196 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.353 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.read.latency volume: 277278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fab64844-c546-4609-bc9b-a942d291ad07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 104515196, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-vda', 'timestamp': '2026-01-22T00:30:23.323510', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a3e2978-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': '8a27d5142c5828e3862ed5dc5f9f0cc0d8baa79bc6110ebdd9159dac4571e2ab'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 277278, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 
'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-sda', 'timestamp': '2026-01-22T00:30:23.323510', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a3e3b70-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': '8b1e46189188e8a16fcd7a8eedcf974a50ef409b8b5120f6558c2f36055cc9fc'}]}, 'timestamp': '2026-01-22 00:30:23.354117', '_unique_id': '9a6efac209c64b06ac6514d6fe33563c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.356 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.357 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.360 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f6ca3701-93a3-4e03-999f-1bdb7a95d933 / tap195536c8-52 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.360 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7f43698-19f7-42a2-a5cd-5f7bc98366a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-000000a8-f6ca3701-93a3-4e03-999f-1bdb7a95d933-tap195536c8-52', 'timestamp': '2026-01-22T00:30:23.357570', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'tap195536c8-52', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:c5:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap195536c8-52'}, 'message_id': '8a3f51d6-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.151840975, 'message_signature': 'bbf8c8215755ed7a8f251446fd8f1bcf09ee8b52266c99145c1e4ade42dfee71'}]}, 'timestamp': '2026-01-22 00:30:23.361233', '_unique_id': 'a0203080d15047ea9d875d34c04cb409'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.362 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.363 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.378 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.379 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance f6ca3701-93a3-4e03-999f-1bdb7a95d933: ceilometer.compute.pollsters.NoVolumeException
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.379 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.389 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.390 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '284ed3d1-2a6b-4efa-b7d7-e35ff99770c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-vda', 'timestamp': '2026-01-22T00:30:23.380068', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a43c770-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.174365228, 'message_signature': 'a8a905efeabaff6d6938a314fcafe594106aa62e110d405002a05fdd725c9f4e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-sda', 'timestamp': '2026-01-22T00:30:23.380068', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a43d454-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.174365228, 'message_signature': '3b67699cc72bf5ff70e7a77498a8271e000d5571e637553c9b786b6472a62a7c'}]}, 'timestamp': '2026-01-22 00:30:23.390742', '_unique_id': '6cf81a8f7ae04e0fbe035f054f18011f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.391 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.392 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.393 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.393 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a5730ea-44af-4329-b68a-f9f4e3a2d046', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-vda', 'timestamp': '2026-01-22T00:30:23.392999', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a443782-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.174365228, 'message_signature': '7a2144d1ee36ec76f5bd534bb45417537fd739ed1e5430f82cceeb906311c40d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 
'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-sda', 'timestamp': '2026-01-22T00:30:23.392999', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a4441b4-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.174365228, 'message_signature': '72869799ba8288aee68d9ace0414b075ae96849d7acabe6adbee2c8538e48c27'}]}, 'timestamp': '2026-01-22 00:30:23.393530', '_unique_id': '05087c0c665a44e9b67cb58edbdfcc44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.394 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.395 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.395 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ed26a4e-9ff6-4152-b09a-fb6a902cd1c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-vda', 'timestamp': '2026-01-22T00:30:23.395095', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a44893a-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': '0ed289c3c281c15951303ff6371583154f1ba79d18c663e5f5c90ac399f692ef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 
'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-sda', 'timestamp': '2026-01-22T00:30:23.395095', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a4492a4-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': 'f923718b3f7f59d94623eb0eaf8093a04b5acc1ccc2e63c6219643356c1956f7'}]}, 'timestamp': '2026-01-22 00:30:23.395595', '_unique_id': 'bceb3c3eaba746719bd3c879007911f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.396 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.397 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.397 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.397 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1402697012>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1402697012>]
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.397 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.397 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e545814-557b-4535-bb4b-d824971c4e60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-vda', 'timestamp': '2026-01-22T00:30:23.397842', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a44f5f0-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': 'dc0dde08f8b036bf1ffe5fbeda0760fbad8487fe89e31d5ab32afdf007cd6126'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-sda', 'timestamp': '2026-01-22T00:30:23.397842', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a45005e-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': '036434f35dab3002e9c80a43166280ccbc4f924cd85855fe1f79b2c8ced05e1d'}]}, 'timestamp': '2026-01-22 00:30:23.398416', '_unique_id': '6c986db14c814450a9ea5428738d4974'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.398 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.399 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7aa7470-0c7e-4538-b21c-a40761628ca2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-000000a8-f6ca3701-93a3-4e03-999f-1bdb7a95d933-tap195536c8-52', 'timestamp': '2026-01-22T00:30:23.400054', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'tap195536c8-52', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:c5:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap195536c8-52'}, 'message_id': '8a454b22-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.151840975, 'message_signature': 'f094a2e7f939455cb09b7e1d371b56a16920bc79b77b2e8428504e22cd46dcf3'}]}, 'timestamp': '2026-01-22 00:30:23.400335', '_unique_id': 'f1b3f76f65134c2c9a15099865b27d42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.400 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.401 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.401 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0d7d777-c887-41e1-b51d-45eabe5cd654', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-vda', 'timestamp': '2026-01-22T00:30:23.401851', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a459186-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': 'dd10a41cf58bcdc02d3bd006f09c2aa1c5a1893957e8dc5cd9575651822849b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-sda', 'timestamp': '2026-01-22T00:30:23.401851', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a459b9a-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': 'ef11cbf222c1ce43506e9b88cb22fc6e8996d4ab97da4f0d4f6745f59c587d20'}]}, 'timestamp': '2026-01-22 00:30:23.402380', '_unique_id': 'a032955d1d7b4fe2ba8ba7d5e7041443'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.402 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.404 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.404 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32184cdc-6424-4155-b99e-afec58671ebb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-000000a8-f6ca3701-93a3-4e03-999f-1bdb7a95d933-tap195536c8-52', 'timestamp': '2026-01-22T00:30:23.404499', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'tap195536c8-52', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:c5:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap195536c8-52'}, 'message_id': '8a45f8b0-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.151840975, 'message_signature': '96f46732b44bb9cbd85e51eb272a57723d8e9b0c992fbc23b260c8a3d43866ca'}]}, 'timestamp': '2026-01-22 00:30:23.404781', '_unique_id': 'c3ebf9dbd6874f9c8189268ffd8594d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.406 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.406 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0f45de3-9dcd-4702-88d9-d71f85954122', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-000000a8-f6ca3701-93a3-4e03-999f-1bdb7a95d933-tap195536c8-52', 'timestamp': '2026-01-22T00:30:23.406353', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'tap195536c8-52', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:c5:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap195536c8-52'}, 'message_id': '8a46419e-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.151840975, 'message_signature': '5ed8a7d12acef62983052b7f97a8cb1f3a472aefa179c840c37872ce91446712'}]}, 'timestamp': '2026-01-22 00:30:23.406666', '_unique_id': '5c9d8d6c5cf441399ee266c92d16d166'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.408 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.408 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.408 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0964ba7c-36e3-4482-a400-a85273e8fdaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-vda', 'timestamp': '2026-01-22T00:30:23.408215', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a468ac8-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.174365228, 'message_signature': '11939d01c843bbc91fd184d8139244150f4a8543caaef240bec4c3ce620e8930'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-sda', 'timestamp': '2026-01-22T00:30:23.408215', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a4695ae-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.174365228, 'message_signature': 'debeea199721dad1ee2170b8b94c67ab049225c2678b5d78cc82455e8b24f554'}]}, 'timestamp': '2026-01-22 00:30:23.408786', '_unique_id': '35df67dea5504cdbae24fec48aad1931'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.409 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.410 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.410 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.410 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd401166f-68d5-42f9-a7f2-5f8ef2250231', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-vda', 'timestamp': '2026-01-22T00:30:23.410599', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a46e7ac-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': '4630ec932c2e40f08320a7ebc98f9b999ff20ee04e985aeb3bb044ddc35b93d1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-sda', 'timestamp': '2026-01-22T00:30:23.410599', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a46f378-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': '41f7325be9989f018ed210ac9e6dd9126b0ae9937ae6879a801f8341a93f2a3f'}]}, 'timestamp': '2026-01-22 00:30:23.411183', '_unique_id': 'f3c6c89eef1e4873b0d249a07cd15046'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.412 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ad59ce8-79f7-46a4-bf3b-8f450df125ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-000000a8-f6ca3701-93a3-4e03-999f-1bdb7a95d933-tap195536c8-52', 'timestamp': '2026-01-22T00:30:23.412620', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'tap195536c8-52', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:c5:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap195536c8-52'}, 'message_id': '8a4735ae-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.151840975, 'message_signature': '25d57d28efa4991a62200fecb953e11c90afc0f42533e25faec4e80b8dce23a9'}]}, 'timestamp': '2026-01-22 00:30:23.412929', '_unique_id': '8776af9288f94654b294f69b94b0ac4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.414 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.414 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.414 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1402697012>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1402697012>]
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.414 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.414 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03f8c919-78e0-4750-8ad1-757246f7c8b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-000000a8-f6ca3701-93a3-4e03-999f-1bdb7a95d933-tap195536c8-52', 'timestamp': '2026-01-22T00:30:23.414857', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'tap195536c8-52', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:c5:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap195536c8-52'}, 'message_id': '8a479058-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.151840975, 'message_signature': 'b93b8d5bc0f2960c6be6f1b69e188d0162b5dde49cb59e486f0b34b07d51f0e6'}]}, 'timestamp': '2026-01-22 00:30:23.415212', '_unique_id': 'e52c60215c624cafad1e8c30205bbf9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.415 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.416 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.416 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6dfedf3d-44e3-4089-9286-3a7826d9037b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-000000a8-f6ca3701-93a3-4e03-999f-1bdb7a95d933-tap195536c8-52', 'timestamp': '2026-01-22T00:30:23.416687', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'tap195536c8-52', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:c5:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap195536c8-52'}, 'message_id': '8a47d4fa-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.151840975, 'message_signature': '5cd6f70a30140a43ff43bca0425ac69c783767a29370b1b098dbb855a2060a40'}]}, 'timestamp': '2026-01-22 00:30:23.416990', '_unique_id': 'f45562227ca342818447c2cd1887eae3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.418 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.418 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.418 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc4bc2e1-1c74-4019-8284-819de64e7d1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-vda', 'timestamp': '2026-01-22T00:30:23.418671', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8a4822de-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': 'c15ca80b616ac11335be39a44ef2b8406b2f57f4bbd4e52c80fe37434899f417'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933-sda', 'timestamp': '2026-01-22T00:30:23.418671', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8a482c84-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.117779186, 'message_signature': 'f23e449a1adab33a0edd7efe5624042813c42f98aded6e4bce5ed4c04a7b3d61'}]}, 'timestamp': '2026-01-22 00:30:23.419196', '_unique_id': 'f43869ed02c34fd8af69e6ca2d115167'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.420 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.420 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f079c0c-e68c-4ddf-ab75-68333e7698c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-000000a8-f6ca3701-93a3-4e03-999f-1bdb7a95d933-tap195536c8-52', 'timestamp': '2026-01-22T00:30:23.420683', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'tap195536c8-52', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:c5:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap195536c8-52'}, 'message_id': '8a487158-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.151840975, 'message_signature': '4005d5460e42bd233b438ca4f829ff379c4a32865b3eac75eb900562ba8cfe23'}]}, 'timestamp': '2026-01-22 00:30:23.421003', '_unique_id': '835c52a56147454b9b9d129dda280f60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.421 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.422 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.422 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4c900c0-d7ed-41a4-b05c-2f1e6588111c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-000000a8-f6ca3701-93a3-4e03-999f-1bdb7a95d933-tap195536c8-52', 'timestamp': '2026-01-22T00:30:23.422417', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'tap195536c8-52', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:c5:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap195536c8-52'}, 'message_id': '8a48b42e-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.151840975, 'message_signature': '399d90ea4deeacb152c2c2880cc27a427f64f565df8a4425ab22ef143f76b516'}]}, 'timestamp': '2026-01-22 00:30:23.422691', '_unique_id': '975e22f08e7740f681dba8a3780c3ada'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.424 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.424 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86948a82-f891-4a0e-83d3-1707a0339f39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'instance-000000a8-f6ca3701-93a3-4e03-999f-1bdb7a95d933-tap195536c8-52', 'timestamp': '2026-01-22T00:30:23.424138', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'tap195536c8-52', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3a:c5:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap195536c8-52'}, 'message_id': '8a48f7cc-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.151840975, 'message_signature': '365147a4dc088747b3587d6fca51f04f39f32ced3e6c6a2c93d9137e62e87bf2'}]}, 'timestamp': '2026-01-22 00:30:23.424414', '_unique_id': '7bd39b7ab5a1435c9eff11140a0a848f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 DEBUG ceilometer.compute.pollsters [-] f6ca3701-93a3-4e03-999f-1bdb7a95d933/cpu volume: 7130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3efa3a2b-e49d-407a-a2c0-678b3fd82fcf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7130000000, 'user_id': '635cc2f351c344dc8e2b1264080dbafb', 'user_name': None, 'project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'project_name': None, 'resource_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'timestamp': '2026-01-22T00:30:23.426132', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1402697012', 'name': 'instance-000000a8', 'instance_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'instance_type': 'm1.nano', 'host': '0f4a01e08a4f3a9d65c74028687a345bc4d7b81d590be3f51b8f3988', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8a494628-f729-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6326.172930044, 'message_signature': 'b253caae676ebca7080227b78f3d6f3c772c027175ce76754e9b9530c87facc8'}]}, 'timestamp': '2026-01-22 00:30:23.426413', '_unique_id': 'd1874f640f5e4b8d924c2d3a7fd1fe01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.426 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.427 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.427 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.427 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1402697012>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1402697012>]
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.428 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.428 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:30:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:30:23.428 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1402697012>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1402697012>]
Jan 22 00:30:24 compute-0 nova_compute[182935]: 2026-01-22 00:30:24.352 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:24 compute-0 nova_compute[182935]: 2026-01-22 00:30:24.833 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:29 compute-0 nova_compute[182935]: 2026-01-22 00:30:29.353 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:29 compute-0 ovn_controller[95047]: 2026-01-22T00:30:29Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3a:c5:d4 10.100.0.5
Jan 22 00:30:29 compute-0 ovn_controller[95047]: 2026-01-22T00:30:29Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3a:c5:d4 10.100.0.5
Jan 22 00:30:29 compute-0 nova_compute[182935]: 2026-01-22 00:30:29.873 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:33 compute-0 podman[241354]: 2026-01-22 00:30:33.69361243 +0000 UTC m=+0.049110363 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:30:33 compute-0 podman[241352]: 2026-01-22 00:30:33.693655121 +0000 UTC m=+0.055795483 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:30:33 compute-0 podman[241353]: 2026-01-22 00:30:33.768690735 +0000 UTC m=+0.127909197 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller)
Jan 22 00:30:34 compute-0 nova_compute[182935]: 2026-01-22 00:30:34.397 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:34 compute-0 sshd-session[241425]: Received disconnect from 91.224.92.54 port 15390:11:  [preauth]
Jan 22 00:30:34 compute-0 sshd-session[241425]: Disconnected from authenticating user root 91.224.92.54 port 15390 [preauth]
Jan 22 00:30:34 compute-0 nova_compute[182935]: 2026-01-22 00:30:34.876 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:35 compute-0 nova_compute[182935]: 2026-01-22 00:30:35.237 182939 INFO nova.compute.manager [None req-553ef3e8-b9d5-4bd2-9c3b-d82d8172973b 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Get console output
Jan 22 00:30:35 compute-0 nova_compute[182935]: 2026-01-22 00:30:35.243 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:30:35 compute-0 nova_compute[182935]: 2026-01-22 00:30:35.626 182939 DEBUG nova.objects.instance [None req-1397d6fa-586f-43c9-b90f-f8bab12b839c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid f6ca3701-93a3-4e03-999f-1bdb7a95d933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:30:35 compute-0 nova_compute[182935]: 2026-01-22 00:30:35.650 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041835.6498523, f6ca3701-93a3-4e03-999f-1bdb7a95d933 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:30:35 compute-0 nova_compute[182935]: 2026-01-22 00:30:35.650 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] VM Paused (Lifecycle Event)
Jan 22 00:30:35 compute-0 nova_compute[182935]: 2026-01-22 00:30:35.675 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:30:35 compute-0 nova_compute[182935]: 2026-01-22 00:30:35.678 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:30:35 compute-0 nova_compute[182935]: 2026-01-22 00:30:35.730 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 22 00:30:36 compute-0 kernel: tap195536c8-52 (unregistering): left promiscuous mode
Jan 22 00:30:36 compute-0 NetworkManager[55139]: <info>  [1769041836.8175] device (tap195536c8-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:30:36 compute-0 ovn_controller[95047]: 2026-01-22T00:30:36Z|00675|binding|INFO|Releasing lport 195536c8-52ab-4d51-b46e-5095d097728e from this chassis (sb_readonly=0)
Jan 22 00:30:36 compute-0 ovn_controller[95047]: 2026-01-22T00:30:36Z|00676|binding|INFO|Setting lport 195536c8-52ab-4d51-b46e-5095d097728e down in Southbound
Jan 22 00:30:36 compute-0 nova_compute[182935]: 2026-01-22 00:30:36.874 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:36 compute-0 ovn_controller[95047]: 2026-01-22T00:30:36Z|00677|binding|INFO|Removing iface tap195536c8-52 ovn-installed in OVS
Jan 22 00:30:36 compute-0 nova_compute[182935]: 2026-01-22 00:30:36.878 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:36.889 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:c5:d4 10.100.0.5'], port_security=['fa:16:3e:3a:c5:d4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '4', 'neutron:security_group_ids': '82d1c837-3db3-422e-ad4c-90f973791238', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.193'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c12d587-6450-4baa-9602-df652658847f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=195536c8-52ab-4d51-b46e-5095d097728e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:30:36 compute-0 nova_compute[182935]: 2026-01-22 00:30:36.890 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:36.891 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 195536c8-52ab-4d51-b46e-5095d097728e in datapath 9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 unbound from our chassis
Jan 22 00:30:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:36.893 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:30:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:36.895 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ea3373-cd3e-4f3c-ba0f-bb46cfc5b8bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:36 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:36.895 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 namespace which is not needed anymore
Jan 22 00:30:36 compute-0 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000a8.scope: Deactivated successfully.
Jan 22 00:30:36 compute-0 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000a8.scope: Consumed 13.205s CPU time.
Jan 22 00:30:36 compute-0 systemd-machined[154182]: Machine qemu-87-instance-000000a8 terminated.
Jan 22 00:30:37 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241281]: [NOTICE]   (241285) : haproxy version is 2.8.14-c23fe91
Jan 22 00:30:37 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241281]: [NOTICE]   (241285) : path to executable is /usr/sbin/haproxy
Jan 22 00:30:37 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241281]: [WARNING]  (241285) : Exiting Master process...
Jan 22 00:30:37 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241281]: [ALERT]    (241285) : Current worker (241287) exited with code 143 (Terminated)
Jan 22 00:30:37 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241281]: [WARNING]  (241285) : All workers exited. Exiting... (0)
Jan 22 00:30:37 compute-0 systemd[1]: libpod-12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1.scope: Deactivated successfully.
Jan 22 00:30:37 compute-0 nova_compute[182935]: 2026-01-22 00:30:37.050 182939 DEBUG nova.compute.manager [None req-1397d6fa-586f-43c9-b90f-f8bab12b839c 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:30:37 compute-0 podman[241456]: 2026-01-22 00:30:37.056622031 +0000 UTC m=+0.049691695 container died 12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 00:30:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1-userdata-shm.mount: Deactivated successfully.
Jan 22 00:30:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-a4ef95dab02a87cd840301cc68231d83cbb2a88c23cc790b7afc70f07e9c06be-merged.mount: Deactivated successfully.
Jan 22 00:30:37 compute-0 podman[241456]: 2026-01-22 00:30:37.092259079 +0000 UTC m=+0.085328743 container cleanup 12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:30:37 compute-0 systemd[1]: libpod-conmon-12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1.scope: Deactivated successfully.
Jan 22 00:30:37 compute-0 podman[241501]: 2026-01-22 00:30:37.151128085 +0000 UTC m=+0.039290726 container remove 12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 00:30:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:37.157 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[17130152-3bf4-493f-927c-c3266873a123]: (4, ('Thu Jan 22 12:30:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 (12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1)\n12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1\nThu Jan 22 12:30:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 (12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1)\n12cf04ad138b805c63892598264016fd1e7c704ab097b8964ac7f3fb36b3e1d1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:37.159 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dc241200-f136-47e6-ac89-264e932f56e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:37.160 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bb9bb99-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:37 compute-0 nova_compute[182935]: 2026-01-22 00:30:37.161 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:37 compute-0 kernel: tap9bb9bb99-80: left promiscuous mode
Jan 22 00:30:37 compute-0 nova_compute[182935]: 2026-01-22 00:30:37.180 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:37.183 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b412d0af-7cb0-41d1-842e-31681c334e47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:37.209 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[af78ae25-f970-477a-b059-ed58b97197a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:37.211 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[debfb137-952a-4aad-8fd0-5ad323bc26f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:37.235 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[23961103-b985-45ac-bd03-c6f4f0646e40]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 631822, 'reachable_time': 16020, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241520, 'error': None, 'target': 'ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:37.238 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:30:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:37.238 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[5cad059c-8da4-4847-aeba-ccc37ca8d225]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d9bb9bb99\x2d8cf7\x2d4ecd\x2d8fcb\x2df85c3ad9ea58.mount: Deactivated successfully.
Jan 22 00:30:37 compute-0 nova_compute[182935]: 2026-01-22 00:30:37.817 182939 DEBUG nova.compute.manager [req-c91d5b70-b92a-411d-916b-4f290d1a1ff1 req-d6dcb8a0-c313-4ab6-b15b-a8866a49bb76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-vif-unplugged-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:37 compute-0 nova_compute[182935]: 2026-01-22 00:30:37.818 182939 DEBUG oslo_concurrency.lockutils [req-c91d5b70-b92a-411d-916b-4f290d1a1ff1 req-d6dcb8a0-c313-4ab6-b15b-a8866a49bb76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:37 compute-0 nova_compute[182935]: 2026-01-22 00:30:37.819 182939 DEBUG oslo_concurrency.lockutils [req-c91d5b70-b92a-411d-916b-4f290d1a1ff1 req-d6dcb8a0-c313-4ab6-b15b-a8866a49bb76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:37 compute-0 nova_compute[182935]: 2026-01-22 00:30:37.819 182939 DEBUG oslo_concurrency.lockutils [req-c91d5b70-b92a-411d-916b-4f290d1a1ff1 req-d6dcb8a0-c313-4ab6-b15b-a8866a49bb76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:37 compute-0 nova_compute[182935]: 2026-01-22 00:30:37.820 182939 DEBUG nova.compute.manager [req-c91d5b70-b92a-411d-916b-4f290d1a1ff1 req-d6dcb8a0-c313-4ab6-b15b-a8866a49bb76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] No waiting events found dispatching network-vif-unplugged-195536c8-52ab-4d51-b46e-5095d097728e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:30:37 compute-0 nova_compute[182935]: 2026-01-22 00:30:37.820 182939 WARNING nova.compute.manager [req-c91d5b70-b92a-411d-916b-4f290d1a1ff1 req-d6dcb8a0-c313-4ab6-b15b-a8866a49bb76 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received unexpected event network-vif-unplugged-195536c8-52ab-4d51-b46e-5095d097728e for instance with vm_state suspended and task_state None.
Jan 22 00:30:39 compute-0 nova_compute[182935]: 2026-01-22 00:30:39.399 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:39 compute-0 nova_compute[182935]: 2026-01-22 00:30:39.877 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:39 compute-0 nova_compute[182935]: 2026-01-22 00:30:39.933 182939 DEBUG nova.compute.manager [req-9a356391-6df5-4b10-8678-0a4c05b2f55c req-19c5f8f0-083c-4257-9ea6-4908eb198c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:39 compute-0 nova_compute[182935]: 2026-01-22 00:30:39.934 182939 DEBUG oslo_concurrency.lockutils [req-9a356391-6df5-4b10-8678-0a4c05b2f55c req-19c5f8f0-083c-4257-9ea6-4908eb198c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:39 compute-0 nova_compute[182935]: 2026-01-22 00:30:39.934 182939 DEBUG oslo_concurrency.lockutils [req-9a356391-6df5-4b10-8678-0a4c05b2f55c req-19c5f8f0-083c-4257-9ea6-4908eb198c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:39 compute-0 nova_compute[182935]: 2026-01-22 00:30:39.934 182939 DEBUG oslo_concurrency.lockutils [req-9a356391-6df5-4b10-8678-0a4c05b2f55c req-19c5f8f0-083c-4257-9ea6-4908eb198c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:39 compute-0 nova_compute[182935]: 2026-01-22 00:30:39.934 182939 DEBUG nova.compute.manager [req-9a356391-6df5-4b10-8678-0a4c05b2f55c req-19c5f8f0-083c-4257-9ea6-4908eb198c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] No waiting events found dispatching network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:30:39 compute-0 nova_compute[182935]: 2026-01-22 00:30:39.934 182939 WARNING nova.compute.manager [req-9a356391-6df5-4b10-8678-0a4c05b2f55c req-19c5f8f0-083c-4257-9ea6-4908eb198c87 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received unexpected event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e for instance with vm_state suspended and task_state None.
Jan 22 00:30:40 compute-0 nova_compute[182935]: 2026-01-22 00:30:40.935 182939 INFO nova.compute.manager [None req-ad80a539-73f0-4c76-9508-ecc05f0241fb 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Get console output
Jan 22 00:30:41 compute-0 nova_compute[182935]: 2026-01-22 00:30:41.125 182939 INFO nova.compute.manager [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Resuming
Jan 22 00:30:41 compute-0 nova_compute[182935]: 2026-01-22 00:30:41.126 182939 DEBUG nova.objects.instance [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'flavor' on Instance uuid f6ca3701-93a3-4e03-999f-1bdb7a95d933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:30:41 compute-0 nova_compute[182935]: 2026-01-22 00:30:41.174 182939 DEBUG oslo_concurrency.lockutils [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:30:41 compute-0 nova_compute[182935]: 2026-01-22 00:30:41.174 182939 DEBUG oslo_concurrency.lockutils [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquired lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:30:41 compute-0 nova_compute[182935]: 2026-01-22 00:30:41.175 182939 DEBUG nova.network.neutron [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:30:42 compute-0 podman[241521]: 2026-01-22 00:30:42.674000328 +0000 UTC m=+0.042390460 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.196 182939 DEBUG nova.network.neutron [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Updating instance_info_cache with network_info: [{"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.235 182939 DEBUG oslo_concurrency.lockutils [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Releasing lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.239 182939 DEBUG nova.virt.libvirt.vif [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:30:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1402697012',display_name='tempest-TestNetworkAdvancedServerOps-server-1402697012',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1402697012',id=168,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBSw81OruvzkVRarS+CU3t8/T+uR28raj1bWORjbeWaVlQSCjmt5S8ac1rdgwBP3Lkgrq7bMpc6sdfr/rqNs5mLih3Et/pOLQTSFRciDH7qDEioqcWvyyJnrprDf9EeD3Q==',key_name='tempest-TestNetworkAdvancedServerOps-858203030',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:30:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-fz0brlil',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:30:37Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=f6ca3701-93a3-4e03-999f-1bdb7a95d933,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.240 182939 DEBUG nova.network.os_vif_util [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.240 182939 DEBUG nova.network.os_vif_util [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:c5:d4,bridge_name='br-int',has_traffic_filtering=True,id=195536c8-52ab-4d51-b46e-5095d097728e,network=Network(9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195536c8-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.241 182939 DEBUG os_vif [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:c5:d4,bridge_name='br-int',has_traffic_filtering=True,id=195536c8-52ab-4d51-b46e-5095d097728e,network=Network(9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195536c8-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.241 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.241 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.242 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.244 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.244 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap195536c8-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.244 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap195536c8-52, col_values=(('external_ids', {'iface-id': '195536c8-52ab-4d51-b46e-5095d097728e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3a:c5:d4', 'vm-uuid': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.245 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.245 182939 INFO os_vif [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:c5:d4,bridge_name='br-int',has_traffic_filtering=True,id=195536c8-52ab-4d51-b46e-5095d097728e,network=Network(9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195536c8-52')
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.264 182939 DEBUG nova.objects.instance [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'numa_topology' on Instance uuid f6ca3701-93a3-4e03-999f-1bdb7a95d933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:30:43 compute-0 kernel: tap195536c8-52: entered promiscuous mode
Jan 22 00:30:43 compute-0 NetworkManager[55139]: <info>  [1769041843.3451] manager: (tap195536c8-52): new Tun device (/org/freedesktop/NetworkManager/Devices/332)
Jan 22 00:30:43 compute-0 ovn_controller[95047]: 2026-01-22T00:30:43Z|00678|binding|INFO|Claiming lport 195536c8-52ab-4d51-b46e-5095d097728e for this chassis.
Jan 22 00:30:43 compute-0 ovn_controller[95047]: 2026-01-22T00:30:43Z|00679|binding|INFO|195536c8-52ab-4d51-b46e-5095d097728e: Claiming fa:16:3e:3a:c5:d4 10.100.0.5
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.347 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:43 compute-0 ovn_controller[95047]: 2026-01-22T00:30:43Z|00680|binding|INFO|Setting lport 195536c8-52ab-4d51-b46e-5095d097728e ovn-installed in OVS
Jan 22 00:30:43 compute-0 ovn_controller[95047]: 2026-01-22T00:30:43Z|00681|binding|INFO|Setting lport 195536c8-52ab-4d51-b46e-5095d097728e up in Southbound
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.360 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:c5:d4 10.100.0.5'], port_security=['fa:16:3e:3a:c5:d4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '5', 'neutron:security_group_ids': '82d1c837-3db3-422e-ad4c-90f973791238', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.193'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c12d587-6450-4baa-9602-df652658847f, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=195536c8-52ab-4d51-b46e-5095d097728e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.361 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 195536c8-52ab-4d51-b46e-5095d097728e in datapath 9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 bound to our chassis
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.361 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.362 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.364 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:43 compute-0 systemd-udevd[241554]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.374 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc2bb61-f4b9-4b09-9ed6-eb5e5fc5af44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.375 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9bb9bb99-81 in ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.377 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9bb9bb99-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.377 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[64f4e6bb-1fe9-42ab-b5f3-bf2adad1131d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.378 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2b83c65f-3127-4785-8883-b5f40a40843a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 NetworkManager[55139]: <info>  [1769041843.3810] device (tap195536c8-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:30:43 compute-0 NetworkManager[55139]: <info>  [1769041843.3816] device (tap195536c8-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.389 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[a938bbb8-ca79-4756-8ddb-516d67a7ea4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 systemd-machined[154182]: New machine qemu-88-instance-000000a8.
Jan 22 00:30:43 compute-0 systemd[1]: Started Virtual Machine qemu-88-instance-000000a8.
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.414 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[68430d19-3ff0-484e-99a6-c184c39ff20f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.443 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9f813741-1cd7-4b7f-93ba-a13bc95a446f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.448 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ab168ef6-4dba-4544-9d94-ac5dc2495f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 NetworkManager[55139]: <info>  [1769041843.4497] manager: (tap9bb9bb99-80): new Veth device (/org/freedesktop/NetworkManager/Devices/333)
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.480 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[1a7507f1-2e94-4a77-b8d7-4813d100d76d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.485 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[91a05366-ec72-40c2-bf18-d52e48b55d77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 NetworkManager[55139]: <info>  [1769041843.5050] device (tap9bb9bb99-80): carrier: link connected
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.512 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[51eda43a-7f03-4faa-bc8e-bd5110c1fabe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.528 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5e65aa13-4a37-4fbc-a0b3-990eb4b5b0a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bb9bb99-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:1f:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634624, 'reachable_time': 39370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241590, 'error': None, 'target': 'ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.544 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[573d4094-1781-4e5b-9f38-6ae3578649fb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedb:1f5e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 634624, 'tstamp': 634624}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241591, 'error': None, 'target': 'ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.558 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4395feb2-f5ed-449b-8594-1b99f6aa0897]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bb9bb99-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:1f:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634624, 'reachable_time': 39370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241592, 'error': None, 'target': 'ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.587 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b4301d08-3308-4a72-b6a4-b79d337404b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.592 182939 DEBUG nova.compute.manager [req-325660e2-34d2-4816-a005-9ec22bee2ccc req-84225a6f-f31a-4586-9e0e-4d23b2c7c79a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.593 182939 DEBUG oslo_concurrency.lockutils [req-325660e2-34d2-4816-a005-9ec22bee2ccc req-84225a6f-f31a-4586-9e0e-4d23b2c7c79a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.593 182939 DEBUG oslo_concurrency.lockutils [req-325660e2-34d2-4816-a005-9ec22bee2ccc req-84225a6f-f31a-4586-9e0e-4d23b2c7c79a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.594 182939 DEBUG oslo_concurrency.lockutils [req-325660e2-34d2-4816-a005-9ec22bee2ccc req-84225a6f-f31a-4586-9e0e-4d23b2c7c79a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.594 182939 DEBUG nova.compute.manager [req-325660e2-34d2-4816-a005-9ec22bee2ccc req-84225a6f-f31a-4586-9e0e-4d23b2c7c79a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] No waiting events found dispatching network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.594 182939 WARNING nova.compute.manager [req-325660e2-34d2-4816-a005-9ec22bee2ccc req-84225a6f-f31a-4586-9e0e-4d23b2c7c79a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received unexpected event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e for instance with vm_state suspended and task_state resuming.
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.644 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8ddbbdc6-4fb4-4cad-bc41-c5ac758018cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.645 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bb9bb99-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.646 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.646 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bb9bb99-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.647 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:43 compute-0 kernel: tap9bb9bb99-80: entered promiscuous mode
Jan 22 00:30:43 compute-0 NetworkManager[55139]: <info>  [1769041843.6485] manager: (tap9bb9bb99-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.650 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9bb9bb99-80, col_values=(('external_ids', {'iface-id': 'fc9607ca-e5d3-4a10-ba93-cc16b1df0b66'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:43 compute-0 ovn_controller[95047]: 2026-01-22T00:30:43Z|00682|binding|INFO|Releasing lport fc9607ca-e5d3-4a10-ba93-cc16b1df0b66 from this chassis (sb_readonly=0)
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.651 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:43 compute-0 nova_compute[182935]: 2026-01-22 00:30:43.662 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.664 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.664 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[53bc1e0f-3a58-479b-88f1-a5d80b36d921]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.665 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58.pid.haproxy
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:30:43 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:43.666 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'env', 'PROCESS_TAG=haproxy-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:30:44 compute-0 podman[241624]: 2026-01-22 00:30:44.00237266 +0000 UTC m=+0.045333691 container create 4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:30:44 compute-0 systemd[1]: Started libpod-conmon-4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80.scope.
Jan 22 00:30:44 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:30:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbb9ae7fbe59a441829a6f32345dd25cad26058006da4c69ee9ee351b61d7c85/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:30:44 compute-0 podman[241624]: 2026-01-22 00:30:43.977459751 +0000 UTC m=+0.020420792 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:30:44 compute-0 podman[241624]: 2026-01-22 00:30:44.080378477 +0000 UTC m=+0.123339498 container init 4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:30:44 compute-0 podman[241624]: 2026-01-22 00:30:44.086791711 +0000 UTC m=+0.129752732 container start 4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:30:44 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241639]: [NOTICE]   (241643) : New worker (241645) forked
Jan 22 00:30:44 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241639]: [NOTICE]   (241643) : Loading success.
Jan 22 00:30:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:44.115 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.116 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:44 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:44.144 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.399 182939 DEBUG nova.virt.libvirt.host [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Removed pending event for f6ca3701-93a3-4e03-999f-1bdb7a95d933 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.399 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041844.398873, f6ca3701-93a3-4e03-999f-1bdb7a95d933 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.400 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] VM Started (Lifecycle Event)
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.420 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.436 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.440 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769041844.4402328, f6ca3701-93a3-4e03-999f-1bdb7a95d933 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.440 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] VM Resumed (Lifecycle Event)
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.463 182939 DEBUG nova.compute.manager [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.463 182939 DEBUG nova.objects.instance [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'pci_devices' on Instance uuid f6ca3701-93a3-4e03-999f-1bdb7a95d933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.475 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.479 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.486 182939 INFO nova.virt.libvirt.driver [-] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Instance running successfully.
Jan 22 00:30:44 compute-0 virtqemud[182477]: argument unsupported: QEMU guest agent is not configured
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.489 182939 DEBUG nova.virt.libvirt.guest [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.489 182939 DEBUG nova.compute.manager [None req-019f8696-23db-415f-a3ab-64c3b9944e71 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.512 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 22 00:30:44 compute-0 nova_compute[182935]: 2026-01-22 00:30:44.879 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:45 compute-0 nova_compute[182935]: 2026-01-22 00:30:45.684 182939 DEBUG nova.compute.manager [req-42eafd5d-9def-43a8-bd1a-92e0cbf3278d req-1f4898fa-fc9f-4f6d-ad0f-649350b703cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:45 compute-0 nova_compute[182935]: 2026-01-22 00:30:45.685 182939 DEBUG oslo_concurrency.lockutils [req-42eafd5d-9def-43a8-bd1a-92e0cbf3278d req-1f4898fa-fc9f-4f6d-ad0f-649350b703cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:45 compute-0 nova_compute[182935]: 2026-01-22 00:30:45.685 182939 DEBUG oslo_concurrency.lockutils [req-42eafd5d-9def-43a8-bd1a-92e0cbf3278d req-1f4898fa-fc9f-4f6d-ad0f-649350b703cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:45 compute-0 nova_compute[182935]: 2026-01-22 00:30:45.685 182939 DEBUG oslo_concurrency.lockutils [req-42eafd5d-9def-43a8-bd1a-92e0cbf3278d req-1f4898fa-fc9f-4f6d-ad0f-649350b703cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:45 compute-0 nova_compute[182935]: 2026-01-22 00:30:45.686 182939 DEBUG nova.compute.manager [req-42eafd5d-9def-43a8-bd1a-92e0cbf3278d req-1f4898fa-fc9f-4f6d-ad0f-649350b703cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] No waiting events found dispatching network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:30:45 compute-0 nova_compute[182935]: 2026-01-22 00:30:45.686 182939 WARNING nova.compute.manager [req-42eafd5d-9def-43a8-bd1a-92e0cbf3278d req-1f4898fa-fc9f-4f6d-ad0f-649350b703cf 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received unexpected event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e for instance with vm_state active and task_state None.
Jan 22 00:30:48 compute-0 nova_compute[182935]: 2026-01-22 00:30:48.099 182939 INFO nova.compute.manager [None req-ae1ec4f8-2edd-4936-a289-6671fdcd4a90 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Get console output
Jan 22 00:30:48 compute-0 nova_compute[182935]: 2026-01-22 00:30:48.104 211648 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 22 00:30:48 compute-0 podman[241662]: 2026-01-22 00:30:48.703744814 +0000 UTC m=+0.055240059 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:30:48 compute-0 podman[241661]: 2026-01-22 00:30:48.713782525 +0000 UTC m=+0.070451175 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.151 182939 DEBUG nova.compute.manager [req-2786ca0b-ea40-4652-8fd5-d58d301b57ac req-b0cb9b2f-2c0d-4b04-a2ca-d96e967ec8a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-changed-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.151 182939 DEBUG nova.compute.manager [req-2786ca0b-ea40-4652-8fd5-d58d301b57ac req-b0cb9b2f-2c0d-4b04-a2ca-d96e967ec8a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Refreshing instance network info cache due to event network-changed-195536c8-52ab-4d51-b46e-5095d097728e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.152 182939 DEBUG oslo_concurrency.lockutils [req-2786ca0b-ea40-4652-8fd5-d58d301b57ac req-b0cb9b2f-2c0d-4b04-a2ca-d96e967ec8a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.152 182939 DEBUG oslo_concurrency.lockutils [req-2786ca0b-ea40-4652-8fd5-d58d301b57ac req-b0cb9b2f-2c0d-4b04-a2ca-d96e967ec8a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.152 182939 DEBUG nova.network.neutron [req-2786ca0b-ea40-4652-8fd5-d58d301b57ac req-b0cb9b2f-2c0d-4b04-a2ca-d96e967ec8a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Refreshing network info cache for port 195536c8-52ab-4d51-b46e-5095d097728e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.277 182939 DEBUG oslo_concurrency.lockutils [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.277 182939 DEBUG oslo_concurrency.lockutils [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.278 182939 DEBUG oslo_concurrency.lockutils [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.278 182939 DEBUG oslo_concurrency.lockutils [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.278 182939 DEBUG oslo_concurrency.lockutils [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.290 182939 INFO nova.compute.manager [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Terminating instance
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.302 182939 DEBUG nova.compute.manager [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:30:49 compute-0 kernel: tap195536c8-52 (unregistering): left promiscuous mode
Jan 22 00:30:49 compute-0 NetworkManager[55139]: <info>  [1769041849.3257] device (tap195536c8-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:30:49 compute-0 ovn_controller[95047]: 2026-01-22T00:30:49Z|00683|binding|INFO|Releasing lport 195536c8-52ab-4d51-b46e-5095d097728e from this chassis (sb_readonly=0)
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.332 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:49 compute-0 ovn_controller[95047]: 2026-01-22T00:30:49Z|00684|binding|INFO|Setting lport 195536c8-52ab-4d51-b46e-5095d097728e down in Southbound
Jan 22 00:30:49 compute-0 ovn_controller[95047]: 2026-01-22T00:30:49Z|00685|binding|INFO|Removing iface tap195536c8-52 ovn-installed in OVS
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.350 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:c5:d4 10.100.0.5'], port_security=['fa:16:3e:3a:c5:d4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f6ca3701-93a3-4e03-999f-1bdb7a95d933', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'adb1305c8f874f2684e845e88fd95ffe', 'neutron:revision_number': '6', 'neutron:security_group_ids': '82d1c837-3db3-422e-ad4c-90f973791238', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c12d587-6450-4baa-9602-df652658847f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=195536c8-52ab-4d51-b46e-5095d097728e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.350 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.352 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 195536c8-52ab-4d51-b46e-5095d097728e in datapath 9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 unbound from our chassis
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.353 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.355 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8ae0c70f-763d-4da2-913d-c14be7a9ce22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.355 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 namespace which is not needed anymore
Jan 22 00:30:49 compute-0 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000a8.scope: Deactivated successfully.
Jan 22 00:30:49 compute-0 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000a8.scope: Consumed 1.177s CPU time.
Jan 22 00:30:49 compute-0 systemd-machined[154182]: Machine qemu-88-instance-000000a8 terminated.
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.439 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:49 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241639]: [NOTICE]   (241643) : haproxy version is 2.8.14-c23fe91
Jan 22 00:30:49 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241639]: [NOTICE]   (241643) : path to executable is /usr/sbin/haproxy
Jan 22 00:30:49 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241639]: [WARNING]  (241643) : Exiting Master process...
Jan 22 00:30:49 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241639]: [ALERT]    (241643) : Current worker (241645) exited with code 143 (Terminated)
Jan 22 00:30:49 compute-0 neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58[241639]: [WARNING]  (241643) : All workers exited. Exiting... (0)
Jan 22 00:30:49 compute-0 systemd[1]: libpod-4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80.scope: Deactivated successfully.
Jan 22 00:30:49 compute-0 podman[241728]: 2026-01-22 00:30:49.495287223 +0000 UTC m=+0.049279576 container died 4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:30:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80-userdata-shm.mount: Deactivated successfully.
Jan 22 00:30:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-bbb9ae7fbe59a441829a6f32345dd25cad26058006da4c69ee9ee351b61d7c85-merged.mount: Deactivated successfully.
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.523 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:49 compute-0 podman[241728]: 2026-01-22 00:30:49.525221304 +0000 UTC m=+0.079213657 container cleanup 4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.527 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:49 compute-0 systemd[1]: libpod-conmon-4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80.scope: Deactivated successfully.
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.558 182939 INFO nova.virt.libvirt.driver [-] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Instance destroyed successfully.
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.559 182939 DEBUG nova.objects.instance [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lazy-loading 'resources' on Instance uuid f6ca3701-93a3-4e03-999f-1bdb7a95d933 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:30:49 compute-0 podman[241762]: 2026-01-22 00:30:49.584490569 +0000 UTC m=+0.039424839 container remove 4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.589 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[750d867e-1719-4ecb-b8af-4f91f94edcdc]: (4, ('Thu Jan 22 12:30:49 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 (4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80)\n4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80\nThu Jan 22 12:30:49 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 (4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80)\n4af4aa66b96f73d6044d00f0f8c0897f5aa44e487370bdbedd3c57745e11fe80\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.591 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[053d4228-83a1-4cc8-8cd7-d05501f4daa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.592 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bb9bb99-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.594 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:49 compute-0 kernel: tap9bb9bb99-80: left promiscuous mode
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.609 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.613 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cef57f94-4618-4fb7-890b-e05908903460]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.627 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[68ed0508-8b34-404e-ba7c-a37c56ec1883]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.628 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b36e5f25-32fe-4b09-b2de-31d847d81480]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.645 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b6373b6e-87d7-4d89-b41f-51a139904c0f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 634617, 'reachable_time': 35780, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241791, 'error': None, 'target': 'ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:49 compute-0 systemd[1]: run-netns-ovnmeta\x2d9bb9bb99\x2d8cf7\x2d4ecd\x2d8fcb\x2df85c3ad9ea58.mount: Deactivated successfully.
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.648 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:30:49 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:49.648 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[8a1c6766-4acd-4a49-a1a1-9c86d34aaf29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.650 182939 DEBUG nova.virt.libvirt.vif [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:30:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1402697012',display_name='tempest-TestNetworkAdvancedServerOps-server-1402697012',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1402697012',id=168,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBSw81OruvzkVRarS+CU3t8/T+uR28raj1bWORjbeWaVlQSCjmt5S8ac1rdgwBP3Lkgrq7bMpc6sdfr/rqNs5mLih3Et/pOLQTSFRciDH7qDEioqcWvyyJnrprDf9EeD3Q==',key_name='tempest-TestNetworkAdvancedServerOps-858203030',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:30:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='adb1305c8f874f2684e845e88fd95ffe',ramdisk_id='',reservation_id='r-fz0brlil',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-587955072',owner_user_name='tempest-TestNetworkAdvancedServerOps-587955072-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:30:44Z,user_data=None,user_id='635cc2f351c344dc8e2b1264080dbafb',uuid=f6ca3701-93a3-4e03-999f-1bdb7a95d933,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.650 182939 DEBUG nova.network.os_vif_util [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converting VIF {"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.651 182939 DEBUG nova.network.os_vif_util [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3a:c5:d4,bridge_name='br-int',has_traffic_filtering=True,id=195536c8-52ab-4d51-b46e-5095d097728e,network=Network(9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195536c8-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.651 182939 DEBUG os_vif [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:c5:d4,bridge_name='br-int',has_traffic_filtering=True,id=195536c8-52ab-4d51-b46e-5095d097728e,network=Network(9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195536c8-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.653 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.654 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap195536c8-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.655 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.656 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.658 182939 INFO os_vif [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3a:c5:d4,bridge_name='br-int',has_traffic_filtering=True,id=195536c8-52ab-4d51-b46e-5095d097728e,network=Network(9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap195536c8-52')
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.659 182939 INFO nova.virt.libvirt.driver [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Deleting instance files /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933_del
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.660 182939 INFO nova.virt.libvirt.driver [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Deletion of /var/lib/nova/instances/f6ca3701-93a3-4e03-999f-1bdb7a95d933_del complete
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.734 182939 INFO nova.compute.manager [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Took 0.43 seconds to destroy the instance on the hypervisor.
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.735 182939 DEBUG oslo.service.loopingcall [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.735 182939 DEBUG nova.compute.manager [-] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.736 182939 DEBUG nova.network.neutron [-] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.875 182939 DEBUG nova.compute.manager [req-27fcdd39-7baa-4f6e-b8f6-3966ee525f1b req-f2918a41-db4f-4f7f-b946-08328e59f6e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-vif-unplugged-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.875 182939 DEBUG oslo_concurrency.lockutils [req-27fcdd39-7baa-4f6e-b8f6-3966ee525f1b req-f2918a41-db4f-4f7f-b946-08328e59f6e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.876 182939 DEBUG oslo_concurrency.lockutils [req-27fcdd39-7baa-4f6e-b8f6-3966ee525f1b req-f2918a41-db4f-4f7f-b946-08328e59f6e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.876 182939 DEBUG oslo_concurrency.lockutils [req-27fcdd39-7baa-4f6e-b8f6-3966ee525f1b req-f2918a41-db4f-4f7f-b946-08328e59f6e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.876 182939 DEBUG nova.compute.manager [req-27fcdd39-7baa-4f6e-b8f6-3966ee525f1b req-f2918a41-db4f-4f7f-b946-08328e59f6e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] No waiting events found dispatching network-vif-unplugged-195536c8-52ab-4d51-b46e-5095d097728e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:30:49 compute-0 nova_compute[182935]: 2026-01-22 00:30:49.876 182939 DEBUG nova.compute.manager [req-27fcdd39-7baa-4f6e-b8f6-3966ee525f1b req-f2918a41-db4f-4f7f-b946-08328e59f6e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-vif-unplugged-195536c8-52ab-4d51-b46e-5095d097728e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.287 182939 DEBUG nova.network.neutron [-] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.300 182939 INFO nova.compute.manager [-] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Took 0.56 seconds to deallocate network for instance.
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.362 182939 DEBUG oslo_concurrency.lockutils [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.362 182939 DEBUG oslo_concurrency.lockutils [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.390 182939 DEBUG nova.scheduler.client.report [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.485 182939 DEBUG nova.scheduler.client.report [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.486 182939 DEBUG nova.compute.provider_tree [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.501 182939 DEBUG nova.scheduler.client.report [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.521 182939 DEBUG nova.scheduler.client.report [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.569 182939 DEBUG nova.compute.provider_tree [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.586 182939 DEBUG nova.scheduler.client.report [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.612 182939 DEBUG oslo_concurrency.lockutils [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.640 182939 INFO nova.scheduler.client.report [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Deleted allocations for instance f6ca3701-93a3-4e03-999f-1bdb7a95d933
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.752 182939 DEBUG oslo_concurrency.lockutils [None req-4a3feb67-cd71-4be2-8974-1b0aeb753f0f 635cc2f351c344dc8e2b1264080dbafb adb1305c8f874f2684e845e88fd95ffe - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.987 182939 DEBUG nova.network.neutron [req-2786ca0b-ea40-4652-8fd5-d58d301b57ac req-b0cb9b2f-2c0d-4b04-a2ca-d96e967ec8a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Updated VIF entry in instance network info cache for port 195536c8-52ab-4d51-b46e-5095d097728e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:30:50 compute-0 nova_compute[182935]: 2026-01-22 00:30:50.988 182939 DEBUG nova.network.neutron [req-2786ca0b-ea40-4652-8fd5-d58d301b57ac req-b0cb9b2f-2c0d-4b04-a2ca-d96e967ec8a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Updating instance_info_cache with network_info: [{"id": "195536c8-52ab-4d51-b46e-5095d097728e", "address": "fa:16:3e:3a:c5:d4", "network": {"id": "9bb9bb99-8cf7-4ecd-8fcb-f85c3ad9ea58", "bridge": "br-int", "label": "tempest-network-smoke--1831331879", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "adb1305c8f874f2684e845e88fd95ffe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap195536c8-52", "ovs_interfaceid": "195536c8-52ab-4d51-b46e-5095d097728e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.006 182939 DEBUG oslo_concurrency.lockutils [req-2786ca0b-ea40-4652-8fd5-d58d301b57ac req-b0cb9b2f-2c0d-4b04-a2ca-d96e967ec8a5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-f6ca3701-93a3-4e03-999f-1bdb7a95d933" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.813 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.813 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.814 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.814 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.953 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.954 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5685MB free_disk=73.11895370483398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.954 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.954 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.966 182939 DEBUG nova.compute.manager [req-e84a1ba6-86f1-40f8-8dc5-c11355088943 req-d98526e4-ae2a-4548-af64-e713b76b6ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.966 182939 DEBUG oslo_concurrency.lockutils [req-e84a1ba6-86f1-40f8-8dc5-c11355088943 req-d98526e4-ae2a-4548-af64-e713b76b6ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.967 182939 DEBUG oslo_concurrency.lockutils [req-e84a1ba6-86f1-40f8-8dc5-c11355088943 req-d98526e4-ae2a-4548-af64-e713b76b6ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.967 182939 DEBUG oslo_concurrency.lockutils [req-e84a1ba6-86f1-40f8-8dc5-c11355088943 req-d98526e4-ae2a-4548-af64-e713b76b6ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "f6ca3701-93a3-4e03-999f-1bdb7a95d933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.967 182939 DEBUG nova.compute.manager [req-e84a1ba6-86f1-40f8-8dc5-c11355088943 req-d98526e4-ae2a-4548-af64-e713b76b6ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] No waiting events found dispatching network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.967 182939 WARNING nova.compute.manager [req-e84a1ba6-86f1-40f8-8dc5-c11355088943 req-d98526e4-ae2a-4548-af64-e713b76b6ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received unexpected event network-vif-plugged-195536c8-52ab-4d51-b46e-5095d097728e for instance with vm_state deleted and task_state None.
Jan 22 00:30:51 compute-0 nova_compute[182935]: 2026-01-22 00:30:51.968 182939 DEBUG nova.compute.manager [req-e84a1ba6-86f1-40f8-8dc5-c11355088943 req-d98526e4-ae2a-4548-af64-e713b76b6ee2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Received event network-vif-deleted-195536c8-52ab-4d51-b46e-5095d097728e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:30:52 compute-0 nova_compute[182935]: 2026-01-22 00:30:52.020 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:30:52 compute-0 nova_compute[182935]: 2026-01-22 00:30:52.020 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:30:52 compute-0 nova_compute[182935]: 2026-01-22 00:30:52.043 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:30:52 compute-0 nova_compute[182935]: 2026-01-22 00:30:52.065 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:30:52 compute-0 nova_compute[182935]: 2026-01-22 00:30:52.093 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:30:52 compute-0 nova_compute[182935]: 2026-01-22 00:30:52.094 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:30:53 compute-0 nova_compute[182935]: 2026-01-22 00:30:53.461 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:53 compute-0 nova_compute[182935]: 2026-01-22 00:30:53.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:54 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:30:54.146 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:30:54 compute-0 nova_compute[182935]: 2026-01-22 00:30:54.478 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:54 compute-0 nova_compute[182935]: 2026-01-22 00:30:54.655 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:57 compute-0 nova_compute[182935]: 2026-01-22 00:30:57.094 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:57 compute-0 nova_compute[182935]: 2026-01-22 00:30:57.094 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:30:58 compute-0 nova_compute[182935]: 2026-01-22 00:30:58.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:30:58 compute-0 nova_compute[182935]: 2026-01-22 00:30:58.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:30:58 compute-0 nova_compute[182935]: 2026-01-22 00:30:58.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:30:58 compute-0 nova_compute[182935]: 2026-01-22 00:30:58.811 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:30:59 compute-0 nova_compute[182935]: 2026-01-22 00:30:59.480 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:30:59 compute-0 nova_compute[182935]: 2026-01-22 00:30:59.658 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:00 compute-0 nova_compute[182935]: 2026-01-22 00:31:00.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:31:03.227 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:31:03.228 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:31:03.228 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:03 compute-0 nova_compute[182935]: 2026-01-22 00:31:03.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:04 compute-0 nova_compute[182935]: 2026-01-22 00:31:04.482 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:04 compute-0 nova_compute[182935]: 2026-01-22 00:31:04.556 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769041849.5559585, f6ca3701-93a3-4e03-999f-1bdb7a95d933 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:31:04 compute-0 nova_compute[182935]: 2026-01-22 00:31:04.557 182939 INFO nova.compute.manager [-] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] VM Stopped (Lifecycle Event)
Jan 22 00:31:04 compute-0 nova_compute[182935]: 2026-01-22 00:31:04.585 182939 DEBUG nova.compute.manager [None req-67f19260-1616-47f5-ba04-5525bd9aa91a - - - - - -] [instance: f6ca3701-93a3-4e03-999f-1bdb7a95d933] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:31:04 compute-0 nova_compute[182935]: 2026-01-22 00:31:04.659 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:04 compute-0 podman[241796]: 2026-01-22 00:31:04.688654646 +0000 UTC m=+0.049824240 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:31:04 compute-0 podman[241794]: 2026-01-22 00:31:04.693116113 +0000 UTC m=+0.053542289 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:31:04 compute-0 podman[241795]: 2026-01-22 00:31:04.713834971 +0000 UTC m=+0.079150795 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:31:04 compute-0 nova_compute[182935]: 2026-01-22 00:31:04.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:05 compute-0 sshd-session[241866]: Invalid user apache from 188.166.69.60 port 58158
Jan 22 00:31:05 compute-0 sshd-session[241866]: Connection closed by invalid user apache 188.166.69.60 port 58158 [preauth]
Jan 22 00:31:07 compute-0 nova_compute[182935]: 2026-01-22 00:31:07.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:09 compute-0 nova_compute[182935]: 2026-01-22 00:31:09.516 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:09 compute-0 nova_compute[182935]: 2026-01-22 00:31:09.661 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:09 compute-0 nova_compute[182935]: 2026-01-22 00:31:09.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:10 compute-0 nova_compute[182935]: 2026-01-22 00:31:10.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:31:11.731 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:47:fc 10.100.0.2 2001:db8::f816:3eff:fe99:47fc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe99:47fc/64', 'neutron:device_id': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb52301b-689d-4e28-a6fb-c23352694dd4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b38c45f8-f983-4d04-9b7c-db4cbbad86b5) old=Port_Binding(mac=['fa:16:3e:99:47:fc 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-895033ac-5f91-4350-ad1a-b5c5d0ff13a2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:31:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:31:11.732 104408 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b38c45f8-f983-4d04-9b7c-db4cbbad86b5 in datapath 895033ac-5f91-4350-ad1a-b5c5d0ff13a2 updated
Jan 22 00:31:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:31:11.733 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 895033ac-5f91-4350-ad1a-b5c5d0ff13a2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:31:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:31:11.735 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[37a1bd96-63a6-4508-8bd9-379e8934b37b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:31:13 compute-0 podman[241868]: 2026-01-22 00:31:13.712819928 +0000 UTC m=+0.084648608 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 00:31:14 compute-0 nova_compute[182935]: 2026-01-22 00:31:14.564 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:14 compute-0 nova_compute[182935]: 2026-01-22 00:31:14.663 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:19 compute-0 nova_compute[182935]: 2026-01-22 00:31:19.575 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:19 compute-0 nova_compute[182935]: 2026-01-22 00:31:19.664 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:19 compute-0 podman[241887]: 2026-01-22 00:31:19.678692878 +0000 UTC m=+0.052588087 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vcs-type=git, io.openshift.expose-services=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Jan 22 00:31:19 compute-0 podman[241888]: 2026-01-22 00:31:19.685186613 +0000 UTC m=+0.058163560 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:31:24 compute-0 nova_compute[182935]: 2026-01-22 00:31:24.615 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:24 compute-0 nova_compute[182935]: 2026-01-22 00:31:24.667 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:29 compute-0 nova_compute[182935]: 2026-01-22 00:31:29.667 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:29 compute-0 nova_compute[182935]: 2026-01-22 00:31:29.669 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:34 compute-0 nova_compute[182935]: 2026-01-22 00:31:34.669 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:35 compute-0 podman[241929]: 2026-01-22 00:31:35.677243958 +0000 UTC m=+0.051903760 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:31:35 compute-0 podman[241931]: 2026-01-22 00:31:35.684263066 +0000 UTC m=+0.055311431 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:31:35 compute-0 podman[241930]: 2026-01-22 00:31:35.70562837 +0000 UTC m=+0.078736885 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 22 00:31:38 compute-0 ovn_controller[95047]: 2026-01-22T00:31:38Z|00686|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 22 00:31:39 compute-0 nova_compute[182935]: 2026-01-22 00:31:39.672 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:31:39 compute-0 nova_compute[182935]: 2026-01-22 00:31:39.674 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:31:39 compute-0 nova_compute[182935]: 2026-01-22 00:31:39.674 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:31:39 compute-0 nova_compute[182935]: 2026-01-22 00:31:39.675 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:31:39 compute-0 nova_compute[182935]: 2026-01-22 00:31:39.710 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:39 compute-0 nova_compute[182935]: 2026-01-22 00:31:39.711 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:31:44 compute-0 podman[241998]: 2026-01-22 00:31:44.669780249 +0000 UTC m=+0.044249626 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 22 00:31:44 compute-0 nova_compute[182935]: 2026-01-22 00:31:44.711 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:31:44 compute-0 nova_compute[182935]: 2026-01-22 00:31:44.713 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:31:44 compute-0 nova_compute[182935]: 2026-01-22 00:31:44.713 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:31:44 compute-0 nova_compute[182935]: 2026-01-22 00:31:44.714 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:31:44 compute-0 nova_compute[182935]: 2026-01-22 00:31:44.747 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:44 compute-0 nova_compute[182935]: 2026-01-22 00:31:44.748 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:31:48 compute-0 sshd-session[242018]: Invalid user apache from 188.166.69.60 port 39216
Jan 22 00:31:48 compute-0 sshd-session[242018]: Connection closed by invalid user apache 188.166.69.60 port 39216 [preauth]
Jan 22 00:31:49 compute-0 nova_compute[182935]: 2026-01-22 00:31:49.749 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:31:49 compute-0 nova_compute[182935]: 2026-01-22 00:31:49.750 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:49 compute-0 nova_compute[182935]: 2026-01-22 00:31:49.751 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:31:49 compute-0 nova_compute[182935]: 2026-01-22 00:31:49.751 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:31:49 compute-0 nova_compute[182935]: 2026-01-22 00:31:49.751 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:31:49 compute-0 nova_compute[182935]: 2026-01-22 00:31:49.752 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:31:50 compute-0 podman[242020]: 2026-01-22 00:31:50.686361178 +0000 UTC m=+0.058692433 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:31:50 compute-0 podman[242021]: 2026-01-22 00:31:50.691639185 +0000 UTC m=+0.060529388 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:31:51 compute-0 nova_compute[182935]: 2026-01-22 00:31:51.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:51 compute-0 nova_compute[182935]: 2026-01-22 00:31:51.822 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:51 compute-0 nova_compute[182935]: 2026-01-22 00:31:51.822 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:51 compute-0 nova_compute[182935]: 2026-01-22 00:31:51.822 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:51 compute-0 nova_compute[182935]: 2026-01-22 00:31:51.822 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:31:51 compute-0 nova_compute[182935]: 2026-01-22 00:31:51.981 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:31:51 compute-0 nova_compute[182935]: 2026-01-22 00:31:51.983 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5715MB free_disk=73.11894989013672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:31:51 compute-0 nova_compute[182935]: 2026-01-22 00:31:51.984 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:31:51 compute-0 nova_compute[182935]: 2026-01-22 00:31:51.984 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:31:52 compute-0 nova_compute[182935]: 2026-01-22 00:31:52.052 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:31:52 compute-0 nova_compute[182935]: 2026-01-22 00:31:52.053 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:31:52 compute-0 nova_compute[182935]: 2026-01-22 00:31:52.074 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:31:52 compute-0 nova_compute[182935]: 2026-01-22 00:31:52.090 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:31:52 compute-0 nova_compute[182935]: 2026-01-22 00:31:52.091 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:31:52 compute-0 nova_compute[182935]: 2026-01-22 00:31:52.091 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:31:54 compute-0 nova_compute[182935]: 2026-01-22 00:31:54.752 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:58 compute-0 nova_compute[182935]: 2026-01-22 00:31:58.092 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:31:58 compute-0 nova_compute[182935]: 2026-01-22 00:31:58.093 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:31:59 compute-0 nova_compute[182935]: 2026-01-22 00:31:59.752 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:31:59 compute-0 nova_compute[182935]: 2026-01-22 00:31:59.755 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:00 compute-0 nova_compute[182935]: 2026-01-22 00:32:00.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:00 compute-0 nova_compute[182935]: 2026-01-22 00:32:00.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:32:00 compute-0 nova_compute[182935]: 2026-01-22 00:32:00.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:32:00 compute-0 nova_compute[182935]: 2026-01-22 00:32:00.885 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:32:00 compute-0 nova_compute[182935]: 2026-01-22 00:32:00.886 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:32:03.228 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:32:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:32:03.229 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:32:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:32:03.229 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:32:03 compute-0 nova_compute[182935]: 2026-01-22 00:32:03.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:04 compute-0 nova_compute[182935]: 2026-01-22 00:32:04.754 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:04 compute-0 nova_compute[182935]: 2026-01-22 00:32:04.756 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:06 compute-0 podman[242061]: 2026-01-22 00:32:06.70958952 +0000 UTC m=+0.069570585 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:32:06 compute-0 podman[242063]: 2026-01-22 00:32:06.717869339 +0000 UTC m=+0.078262444 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:32:06 compute-0 podman[242062]: 2026-01-22 00:32:06.787776841 +0000 UTC m=+0.150796729 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 22 00:32:06 compute-0 nova_compute[182935]: 2026-01-22 00:32:06.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:08 compute-0 nova_compute[182935]: 2026-01-22 00:32:08.802 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:09 compute-0 nova_compute[182935]: 2026-01-22 00:32:09.756 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:10 compute-0 nova_compute[182935]: 2026-01-22 00:32:10.147 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:32:10.148 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:32:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:32:10.148 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:32:10 compute-0 nova_compute[182935]: 2026-01-22 00:32:10.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:10 compute-0 nova_compute[182935]: 2026-01-22 00:32:10.816 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:11 compute-0 nova_compute[182935]: 2026-01-22 00:32:11.809 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:14 compute-0 nova_compute[182935]: 2026-01-22 00:32:14.758 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:14 compute-0 nova_compute[182935]: 2026-01-22 00:32:14.759 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:15 compute-0 podman[242135]: 2026-01-22 00:32:15.665796029 +0000 UTC m=+0.045324592 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 00:32:19 compute-0 nova_compute[182935]: 2026-01-22 00:32:19.760 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:19 compute-0 nova_compute[182935]: 2026-01-22 00:32:19.762 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:32:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:32:20.151 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:32:21 compute-0 podman[242154]: 2026-01-22 00:32:21.69872196 +0000 UTC m=+0.071049509 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:32:21 compute-0 podman[242155]: 2026-01-22 00:32:21.704161481 +0000 UTC m=+0.072917924 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:32:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:32:24 compute-0 nova_compute[182935]: 2026-01-22 00:32:24.763 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:32:29 compute-0 nova_compute[182935]: 2026-01-22 00:32:29.765 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:32:29 compute-0 nova_compute[182935]: 2026-01-22 00:32:29.767 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:32:29 compute-0 nova_compute[182935]: 2026-01-22 00:32:29.767 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:32:29 compute-0 nova_compute[182935]: 2026-01-22 00:32:29.767 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:32:29 compute-0 nova_compute[182935]: 2026-01-22 00:32:29.806 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:29 compute-0 nova_compute[182935]: 2026-01-22 00:32:29.806 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:32:31 compute-0 sshd-session[242195]: Invalid user apache from 188.166.69.60 port 51792
Jan 22 00:32:31 compute-0 sshd-session[242195]: Connection closed by invalid user apache 188.166.69.60 port 51792 [preauth]
Jan 22 00:32:34 compute-0 nova_compute[182935]: 2026-01-22 00:32:34.807 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:34 compute-0 nova_compute[182935]: 2026-01-22 00:32:34.809 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:37 compute-0 podman[242197]: 2026-01-22 00:32:37.676150552 +0000 UTC m=+0.050905447 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:32:37 compute-0 podman[242199]: 2026-01-22 00:32:37.686897499 +0000 UTC m=+0.058345223 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:32:37 compute-0 podman[242198]: 2026-01-22 00:32:37.70977904 +0000 UTC m=+0.084495094 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 00:32:39 compute-0 nova_compute[182935]: 2026-01-22 00:32:39.810 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:32:39 compute-0 nova_compute[182935]: 2026-01-22 00:32:39.812 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:32:39 compute-0 nova_compute[182935]: 2026-01-22 00:32:39.812 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:32:39 compute-0 nova_compute[182935]: 2026-01-22 00:32:39.812 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:32:39 compute-0 nova_compute[182935]: 2026-01-22 00:32:39.850 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:39 compute-0 nova_compute[182935]: 2026-01-22 00:32:39.851 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:32:44 compute-0 nova_compute[182935]: 2026-01-22 00:32:44.852 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:44 compute-0 nova_compute[182935]: 2026-01-22 00:32:44.854 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:32:46 compute-0 podman[242268]: 2026-01-22 00:32:46.722057314 +0000 UTC m=+0.091671236 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 00:32:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:32:47.232 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:32:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:32:47.233 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:32:47 compute-0 nova_compute[182935]: 2026-01-22 00:32:47.232 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:49 compute-0 nova_compute[182935]: 2026-01-22 00:32:49.854 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:32:52 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:32:52.235 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:32:52 compute-0 podman[242287]: 2026-01-22 00:32:52.682419981 +0000 UTC m=+0.057477583 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 00:32:52 compute-0 podman[242288]: 2026-01-22 00:32:52.686591721 +0000 UTC m=+0.055538666 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:32:52 compute-0 nova_compute[182935]: 2026-01-22 00:32:52.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:32:52 compute-0 nova_compute[182935]: 2026-01-22 00:32:52.821 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:32:52 compute-0 nova_compute[182935]: 2026-01-22 00:32:52.821 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:32:52 compute-0 nova_compute[182935]: 2026-01-22 00:32:52.821 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:32:52 compute-0 nova_compute[182935]: 2026-01-22 00:32:52.822 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:32:52 compute-0 nova_compute[182935]: 2026-01-22 00:32:52.962 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:32:52 compute-0 nova_compute[182935]: 2026-01-22 00:32:52.964 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5725MB free_disk=73.11896514892578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:32:52 compute-0 nova_compute[182935]: 2026-01-22 00:32:52.964 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:32:52 compute-0 nova_compute[182935]: 2026-01-22 00:32:52.964 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:32:53 compute-0 nova_compute[182935]: 2026-01-22 00:32:53.028 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:32:53 compute-0 nova_compute[182935]: 2026-01-22 00:32:53.028 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:32:53 compute-0 nova_compute[182935]: 2026-01-22 00:32:53.049 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:32:53 compute-0 nova_compute[182935]: 2026-01-22 00:32:53.064 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:32:53 compute-0 nova_compute[182935]: 2026-01-22 00:32:53.066 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:32:53 compute-0 nova_compute[182935]: 2026-01-22 00:32:53.066 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:32:54 compute-0 nova_compute[182935]: 2026-01-22 00:32:54.856 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:32:59 compute-0 nova_compute[182935]: 2026-01-22 00:32:59.858 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:00 compute-0 nova_compute[182935]: 2026-01-22 00:33:00.066 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:00 compute-0 nova_compute[182935]: 2026-01-22 00:33:00.067 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:33:01 compute-0 nova_compute[182935]: 2026-01-22 00:33:01.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:01 compute-0 nova_compute[182935]: 2026-01-22 00:33:01.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:33:01 compute-0 nova_compute[182935]: 2026-01-22 00:33:01.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:33:01 compute-0 nova_compute[182935]: 2026-01-22 00:33:01.814 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:33:02 compute-0 nova_compute[182935]: 2026-01-22 00:33:02.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:03.230 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:03.230 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:03.231 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:03 compute-0 nova_compute[182935]: 2026-01-22 00:33:03.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:04 compute-0 nova_compute[182935]: 2026-01-22 00:33:04.860 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:33:07 compute-0 nova_compute[182935]: 2026-01-22 00:33:07.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:08 compute-0 podman[242329]: 2026-01-22 00:33:08.698585936 +0000 UTC m=+0.060656010 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:33:08 compute-0 podman[242331]: 2026-01-22 00:33:08.70084965 +0000 UTC m=+0.059796679 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:33:08 compute-0 podman[242330]: 2026-01-22 00:33:08.739602523 +0000 UTC m=+0.095466228 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:33:08 compute-0 nova_compute[182935]: 2026-01-22 00:33:08.810 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:09 compute-0 nova_compute[182935]: 2026-01-22 00:33:09.862 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:33:10 compute-0 nova_compute[182935]: 2026-01-22 00:33:10.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:11 compute-0 nova_compute[182935]: 2026-01-22 00:33:11.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:13 compute-0 nova_compute[182935]: 2026-01-22 00:33:13.241 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:13 compute-0 sshd-session[242397]: Invalid user apache from 188.166.69.60 port 33982
Jan 22 00:33:13 compute-0 sshd-session[242397]: Connection closed by invalid user apache 188.166.69.60 port 33982 [preauth]
Jan 22 00:33:14 compute-0 nova_compute[182935]: 2026-01-22 00:33:14.863 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:17 compute-0 podman[242399]: 2026-01-22 00:33:17.667056329 +0000 UTC m=+0.045429005 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:33:17 compute-0 nova_compute[182935]: 2026-01-22 00:33:17.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:17 compute-0 nova_compute[182935]: 2026-01-22 00:33:17.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:33:18 compute-0 sshd-session[242420]: Received disconnect from 38.67.240.124 port 20459:11:  [preauth]
Jan 22 00:33:18 compute-0 sshd-session[242420]: Disconnected from authenticating user root 38.67.240.124 port 20459 [preauth]
Jan 22 00:33:19 compute-0 nova_compute[182935]: 2026-01-22 00:33:19.864 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:23 compute-0 nova_compute[182935]: 2026-01-22 00:33:23.688 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:23 compute-0 nova_compute[182935]: 2026-01-22 00:33:23.688 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:23 compute-0 nova_compute[182935]: 2026-01-22 00:33:23.706 182939 DEBUG nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:33:23 compute-0 podman[242422]: 2026-01-22 00:33:23.709921961 +0000 UTC m=+0.074590476 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:33:23 compute-0 podman[242423]: 2026-01-22 00:33:23.725568257 +0000 UTC m=+0.082353232 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:33:23 compute-0 nova_compute[182935]: 2026-01-22 00:33:23.815 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:23 compute-0 nova_compute[182935]: 2026-01-22 00:33:23.816 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:23 compute-0 nova_compute[182935]: 2026-01-22 00:33:23.823 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:33:23 compute-0 nova_compute[182935]: 2026-01-22 00:33:23.823 182939 INFO nova.compute.claims [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.053 182939 DEBUG nova.compute.provider_tree [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.070 182939 DEBUG nova.scheduler.client.report [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.093 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.093 182939 DEBUG nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.150 182939 DEBUG nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.151 182939 DEBUG nova.network.neutron [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.172 182939 INFO nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.190 182939 DEBUG nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.324 182939 DEBUG nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.325 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.325 182939 INFO nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Creating image(s)
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.326 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.327 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.327 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.339 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.400 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.401 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.402 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.414 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.473 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.475 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.510 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.511 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.512 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.565 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.566 182939 DEBUG nova.virt.disk.api [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.566 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.622 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.623 182939 DEBUG nova.virt.disk.api [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.623 182939 DEBUG nova.objects.instance [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.639 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.640 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Ensure instance console log exists: /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.640 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.640 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.641 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.865 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:24 compute-0 nova_compute[182935]: 2026-01-22 00:33:24.889 182939 DEBUG nova.policy [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:33:26 compute-0 nova_compute[182935]: 2026-01-22 00:33:26.611 182939 DEBUG nova.network.neutron [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Successfully created port: 6a92d070-97be-4ea4-a051-f8b9882b8b9c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:33:27 compute-0 nova_compute[182935]: 2026-01-22 00:33:27.206 182939 DEBUG nova.network.neutron [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Successfully created port: 0091268d-dee8-4a48-8f05-e20c0db2ec29 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:33:28 compute-0 nova_compute[182935]: 2026-01-22 00:33:28.295 182939 DEBUG nova.network.neutron [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Successfully updated port: 6a92d070-97be-4ea4-a051-f8b9882b8b9c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:33:28 compute-0 nova_compute[182935]: 2026-01-22 00:33:28.533 182939 DEBUG nova.compute.manager [req-633f5e59-c203-4dd5-a499-85813e6231e6 req-5301d21c-d785-4752-acf9-8717ed942f7f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-changed-6a92d070-97be-4ea4-a051-f8b9882b8b9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:33:28 compute-0 nova_compute[182935]: 2026-01-22 00:33:28.534 182939 DEBUG nova.compute.manager [req-633f5e59-c203-4dd5-a499-85813e6231e6 req-5301d21c-d785-4752-acf9-8717ed942f7f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Refreshing instance network info cache due to event network-changed-6a92d070-97be-4ea4-a051-f8b9882b8b9c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:33:28 compute-0 nova_compute[182935]: 2026-01-22 00:33:28.534 182939 DEBUG oslo_concurrency.lockutils [req-633f5e59-c203-4dd5-a499-85813e6231e6 req-5301d21c-d785-4752-acf9-8717ed942f7f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:33:28 compute-0 nova_compute[182935]: 2026-01-22 00:33:28.534 182939 DEBUG oslo_concurrency.lockutils [req-633f5e59-c203-4dd5-a499-85813e6231e6 req-5301d21c-d785-4752-acf9-8717ed942f7f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:33:28 compute-0 nova_compute[182935]: 2026-01-22 00:33:28.535 182939 DEBUG nova.network.neutron [req-633f5e59-c203-4dd5-a499-85813e6231e6 req-5301d21c-d785-4752-acf9-8717ed942f7f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Refreshing network info cache for port 6a92d070-97be-4ea4-a051-f8b9882b8b9c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:33:28 compute-0 nova_compute[182935]: 2026-01-22 00:33:28.862 182939 DEBUG nova.network.neutron [req-633f5e59-c203-4dd5-a499-85813e6231e6 req-5301d21c-d785-4752-acf9-8717ed942f7f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.218 182939 DEBUG nova.network.neutron [req-633f5e59-c203-4dd5-a499-85813e6231e6 req-5301d21c-d785-4752-acf9-8717ed942f7f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.245 182939 DEBUG oslo_concurrency.lockutils [req-633f5e59-c203-4dd5-a499-85813e6231e6 req-5301d21c-d785-4752-acf9-8717ed942f7f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.410 182939 DEBUG nova.network.neutron [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Successfully updated port: 0091268d-dee8-4a48-8f05-e20c0db2ec29 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.425 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.426 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.426 182939 DEBUG nova.network.neutron [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.868 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.870 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.870 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.870 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.883 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.884 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:33:29 compute-0 nova_compute[182935]: 2026-01-22 00:33:29.926 182939 DEBUG nova.network.neutron [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:33:30 compute-0 nova_compute[182935]: 2026-01-22 00:33:30.622 182939 DEBUG nova.compute.manager [req-d95b0a88-637f-48f6-8b99-2ffd61410ff9 req-2d662cdc-8b03-4bd8-8c24-aff5c1c7fe9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-changed-0091268d-dee8-4a48-8f05-e20c0db2ec29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:33:30 compute-0 nova_compute[182935]: 2026-01-22 00:33:30.623 182939 DEBUG nova.compute.manager [req-d95b0a88-637f-48f6-8b99-2ffd61410ff9 req-2d662cdc-8b03-4bd8-8c24-aff5c1c7fe9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Refreshing instance network info cache due to event network-changed-0091268d-dee8-4a48-8f05-e20c0db2ec29. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:33:30 compute-0 nova_compute[182935]: 2026-01-22 00:33:30.624 182939 DEBUG oslo_concurrency.lockutils [req-d95b0a88-637f-48f6-8b99-2ffd61410ff9 req-2d662cdc-8b03-4bd8-8c24-aff5c1c7fe9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.908 182939 DEBUG nova.network.neutron [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updating instance_info_cache with network_info: [{"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.964 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.965 182939 DEBUG nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Instance network_info: |[{"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.965 182939 DEBUG oslo_concurrency.lockutils [req-d95b0a88-637f-48f6-8b99-2ffd61410ff9 req-2d662cdc-8b03-4bd8-8c24-aff5c1c7fe9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.966 182939 DEBUG nova.network.neutron [req-d95b0a88-637f-48f6-8b99-2ffd61410ff9 req-2d662cdc-8b03-4bd8-8c24-aff5c1c7fe9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Refreshing network info cache for port 0091268d-dee8-4a48-8f05-e20c0db2ec29 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.969 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Start _get_guest_xml network_info=[{"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.974 182939 WARNING nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.979 182939 DEBUG nova.virt.libvirt.host [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.980 182939 DEBUG nova.virt.libvirt.host [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.982 182939 DEBUG nova.virt.libvirt.host [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.983 182939 DEBUG nova.virt.libvirt.host [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.984 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.984 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.985 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.985 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.985 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.986 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.986 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.986 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.986 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.987 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.987 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.987 182939 DEBUG nova.virt.hardware [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.992 182939 DEBUG nova.virt.libvirt.vif [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:33:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1289953885',display_name='tempest-TestGettingAddress-server-1289953885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1289953885',id=171,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-pn756rnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:33:24Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=7a9023a9-d882-4cdd-aa9e-ee59c7795d38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.992 182939 DEBUG nova.network.os_vif_util [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.993 182939 DEBUG nova.network.os_vif_util [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:8b:c5,bridge_name='br-int',has_traffic_filtering=True,id=6a92d070-97be-4ea4-a051-f8b9882b8b9c,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a92d070-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.994 182939 DEBUG nova.virt.libvirt.vif [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:33:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1289953885',display_name='tempest-TestGettingAddress-server-1289953885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1289953885',id=171,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-pn756rnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:33:24Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=7a9023a9-d882-4cdd-aa9e-ee59c7795d38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.995 182939 DEBUG nova.network.os_vif_util [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.995 182939 DEBUG nova.network.os_vif_util [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5c:f9:6c,bridge_name='br-int',has_traffic_filtering=True,id=0091268d-dee8-4a48-8f05-e20c0db2ec29,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0091268d-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:33:31 compute-0 nova_compute[182935]: 2026-01-22 00:33:31.996 182939 DEBUG nova.objects.instance [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.009 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:33:32 compute-0 nova_compute[182935]:   <uuid>7a9023a9-d882-4cdd-aa9e-ee59c7795d38</uuid>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   <name>instance-000000ab</name>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <nova:name>tempest-TestGettingAddress-server-1289953885</nova:name>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:33:31</nova:creationTime>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:33:32 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:33:32 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:33:32 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:33:32 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:33:32 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:33:32 compute-0 nova_compute[182935]:         <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 22 00:33:32 compute-0 nova_compute[182935]:         <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:33:32 compute-0 nova_compute[182935]:         <nova:port uuid="6a92d070-97be-4ea4-a051-f8b9882b8b9c">
Jan 22 00:33:32 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:33:32 compute-0 nova_compute[182935]:         <nova:port uuid="0091268d-dee8-4a48-8f05-e20c0db2ec29">
Jan 22 00:33:32 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe5c:f96c" ipVersion="6"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <system>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <entry name="serial">7a9023a9-d882-4cdd-aa9e-ee59c7795d38</entry>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <entry name="uuid">7a9023a9-d882-4cdd-aa9e-ee59c7795d38</entry>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     </system>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   <os>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   </os>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   <features>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   </features>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.config"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:95:8b:c5"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <target dev="tap6a92d070-97"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:5c:f9:6c"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <target dev="tap0091268d-de"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/console.log" append="off"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <video>
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     </video>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:33:32 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:33:32 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:33:32 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:33:32 compute-0 nova_compute[182935]: </domain>
Jan 22 00:33:32 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.010 182939 DEBUG nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Preparing to wait for external event network-vif-plugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.011 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.011 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.011 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.011 182939 DEBUG nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Preparing to wait for external event network-vif-plugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.012 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.012 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.012 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.013 182939 DEBUG nova.virt.libvirt.vif [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:33:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1289953885',display_name='tempest-TestGettingAddress-server-1289953885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1289953885',id=171,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-pn756rnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:33:24Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=7a9023a9-d882-4cdd-aa9e-ee59c7795d38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.013 182939 DEBUG nova.network.os_vif_util [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.014 182939 DEBUG nova.network.os_vif_util [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:8b:c5,bridge_name='br-int',has_traffic_filtering=True,id=6a92d070-97be-4ea4-a051-f8b9882b8b9c,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a92d070-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.014 182939 DEBUG os_vif [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:8b:c5,bridge_name='br-int',has_traffic_filtering=True,id=6a92d070-97be-4ea4-a051-f8b9882b8b9c,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a92d070-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.015 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.015 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.015 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.020 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.020 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a92d070-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.020 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6a92d070-97, col_values=(('external_ids', {'iface-id': '6a92d070-97be-4ea4-a051-f8b9882b8b9c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:8b:c5', 'vm-uuid': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.022 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 NetworkManager[55139]: <info>  [1769042012.0240] manager: (tap6a92d070-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.024 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.032 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.034 182939 INFO os_vif [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:8b:c5,bridge_name='br-int',has_traffic_filtering=True,id=6a92d070-97be-4ea4-a051-f8b9882b8b9c,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a92d070-97')
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.035 182939 DEBUG nova.virt.libvirt.vif [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:33:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1289953885',display_name='tempest-TestGettingAddress-server-1289953885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1289953885',id=171,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-pn756rnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:33:24Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=7a9023a9-d882-4cdd-aa9e-ee59c7795d38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.035 182939 DEBUG nova.network.os_vif_util [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.036 182939 DEBUG nova.network.os_vif_util [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5c:f9:6c,bridge_name='br-int',has_traffic_filtering=True,id=0091268d-dee8-4a48-8f05-e20c0db2ec29,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0091268d-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.036 182939 DEBUG os_vif [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:f9:6c,bridge_name='br-int',has_traffic_filtering=True,id=0091268d-dee8-4a48-8f05-e20c0db2ec29,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0091268d-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.037 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.037 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.038 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.040 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.040 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0091268d-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.040 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0091268d-de, col_values=(('external_ids', {'iface-id': '0091268d-dee8-4a48-8f05-e20c0db2ec29', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5c:f9:6c', 'vm-uuid': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:32 compute-0 NetworkManager[55139]: <info>  [1769042012.0434] manager: (tap0091268d-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/336)
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.042 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.045 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.048 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.048 182939 INFO os_vif [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:f9:6c,bridge_name='br-int',has_traffic_filtering=True,id=0091268d-dee8-4a48-8f05-e20c0db2ec29,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0091268d-de')
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.095 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.096 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.096 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:95:8b:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.096 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:5c:f9:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.097 182939 INFO nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Using config drive
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.423 182939 INFO nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Creating config drive at /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.config
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.428 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb_jmjc5m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.557 182939 DEBUG oslo_concurrency.processutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb_jmjc5m" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:33:32 compute-0 kernel: tap6a92d070-97: entered promiscuous mode
Jan 22 00:33:32 compute-0 NetworkManager[55139]: <info>  [1769042012.6306] manager: (tap6a92d070-97): new Tun device (/org/freedesktop/NetworkManager/Devices/337)
Jan 22 00:33:32 compute-0 ovn_controller[95047]: 2026-01-22T00:33:32Z|00687|binding|INFO|Claiming lport 6a92d070-97be-4ea4-a051-f8b9882b8b9c for this chassis.
Jan 22 00:33:32 compute-0 ovn_controller[95047]: 2026-01-22T00:33:32Z|00688|binding|INFO|6a92d070-97be-4ea4-a051-f8b9882b8b9c: Claiming fa:16:3e:95:8b:c5 10.100.0.12
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.643 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 NetworkManager[55139]: <info>  [1769042012.6514] manager: (tap0091268d-de): new Tun device (/org/freedesktop/NetworkManager/Devices/338)
Jan 22 00:33:32 compute-0 kernel: tap0091268d-de: entered promiscuous mode
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.654 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 ovn_controller[95047]: 2026-01-22T00:33:32Z|00689|if_status|INFO|Not updating pb chassis for 0091268d-dee8-4a48-8f05-e20c0db2ec29 now as sb is readonly
Jan 22 00:33:32 compute-0 ovn_controller[95047]: 2026-01-22T00:33:32Z|00690|binding|INFO|Claiming lport 0091268d-dee8-4a48-8f05-e20c0db2ec29 for this chassis.
Jan 22 00:33:32 compute-0 ovn_controller[95047]: 2026-01-22T00:33:32Z|00691|binding|INFO|0091268d-dee8-4a48-8f05-e20c0db2ec29: Claiming fa:16:3e:5c:f9:6c 2001:db8::f816:3eff:fe5c:f96c
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.662 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:8b:c5 10.100.0.12'], port_security=['fa:16:3e:95:8b:c5 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c32b591-bafa-4089-9793-ef7884c86bda', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e757454-ac82-49d6-9905-0e379d7d274f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e2ff2e46-2af8-49fc-9f01-6a639111eeb4, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=6a92d070-97be-4ea4-a051-f8b9882b8b9c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.663 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 6a92d070-97be-4ea4-a051-f8b9882b8b9c in datapath 2c32b591-bafa-4089-9793-ef7884c86bda bound to our chassis
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.663 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2c32b591-bafa-4089-9793-ef7884c86bda
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.666 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5c:f9:6c 2001:db8::f816:3eff:fe5c:f96c'], port_security=['fa:16:3e:5c:f9:6c 2001:db8::f816:3eff:fe5c:f96c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe5c:f96c/64', 'neutron:device_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2e757454-ac82-49d6-9905-0e379d7d274f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27a761c8-dbca-47a8-b596-d7db8b087bd0, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=0091268d-dee8-4a48-8f05-e20c0db2ec29) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:33:32 compute-0 systemd-udevd[242501]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:33:32 compute-0 systemd-udevd[242500]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.676 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b87efbd5-a5b3-4c73-9138-22dbb61d1d86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.677 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2c32b591-b1 in ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:33:32 compute-0 NetworkManager[55139]: <info>  [1769042012.6798] device (tap6a92d070-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.680 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2c32b591-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.680 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7e57e248-2b64-447d-a248-6bad67b14a8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.682 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5f52a9e9-fe25-4863-b8a4-f902becbf20b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:32 compute-0 NetworkManager[55139]: <info>  [1769042012.6838] device (tap6a92d070-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:33:32 compute-0 NetworkManager[55139]: <info>  [1769042012.6850] device (tap0091268d-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:33:32 compute-0 NetworkManager[55139]: <info>  [1769042012.6863] device (tap0091268d-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.697 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[91fd684a-9984-42a9-b1ce-449b8cbb8901]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:32 compute-0 systemd-machined[154182]: New machine qemu-89-instance-000000ab.
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.724 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[95436951-b574-45a1-be45-8ae88186cfab]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:32 compute-0 systemd[1]: Started Virtual Machine qemu-89-instance-000000ab.
Jan 22 00:33:32 compute-0 ovn_controller[95047]: 2026-01-22T00:33:32Z|00692|binding|INFO|Setting lport 6a92d070-97be-4ea4-a051-f8b9882b8b9c ovn-installed in OVS
Jan 22 00:33:32 compute-0 ovn_controller[95047]: 2026-01-22T00:33:32Z|00693|binding|INFO|Setting lport 6a92d070-97be-4ea4-a051-f8b9882b8b9c up in Southbound
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.725 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.732 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 ovn_controller[95047]: 2026-01-22T00:33:32Z|00694|binding|INFO|Setting lport 0091268d-dee8-4a48-8f05-e20c0db2ec29 ovn-installed in OVS
Jan 22 00:33:32 compute-0 ovn_controller[95047]: 2026-01-22T00:33:32Z|00695|binding|INFO|Setting lport 0091268d-dee8-4a48-8f05-e20c0db2ec29 up in Southbound
Jan 22 00:33:32 compute-0 nova_compute[182935]: 2026-01-22 00:33:32.743 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.757 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[8af9d1bd-a0e1-4520-aa4e-ba9cf01b7e71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.762 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f21899e8-e50c-4ab3-808a-aad1b42e4dea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:32 compute-0 NetworkManager[55139]: <info>  [1769042012.7649] manager: (tap2c32b591-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/339)
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.796 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[e81f1d5a-96f8-4d9c-bd7e-efa84f2d7230]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.800 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[0ce7601a-e19b-4bcb-b6a5-73c1767f6597]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:32 compute-0 NetworkManager[55139]: <info>  [1769042012.8244] device (tap2c32b591-b0): carrier: link connected
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.830 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[4350f1cf-4370-4126-8308-dff955c70776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.849 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[902b375d-4146-48f6-968b-10a3b8281c70]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c32b591-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:9c:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651556, 'reachable_time': 20996, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242537, 'error': None, 'target': 'ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:32.866 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a41c20d0-e4aa-41fe-ad85-db5d9f1518de]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe93:9c46'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 651556, 'tstamp': 651556}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242538, 'error': None, 'target': 'ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.007 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8f8bb9d9-e381-4e06-9014-0758897c848d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2c32b591-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:9c:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651556, 'reachable_time': 20996, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242539, 'error': None, 'target': 'ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.044 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c7f41e63-9a2f-45fa-b707-999a9d1e772c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.081 182939 DEBUG nova.compute.manager [req-cd817cb5-f1e9-43d8-bc20-32534075dbd9 req-a10ebbbf-21b5-4d84-a61d-5a80a4fcbfdd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-plugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.082 182939 DEBUG oslo_concurrency.lockutils [req-cd817cb5-f1e9-43d8-bc20-32534075dbd9 req-a10ebbbf-21b5-4d84-a61d-5a80a4fcbfdd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.083 182939 DEBUG oslo_concurrency.lockutils [req-cd817cb5-f1e9-43d8-bc20-32534075dbd9 req-a10ebbbf-21b5-4d84-a61d-5a80a4fcbfdd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.083 182939 DEBUG oslo_concurrency.lockutils [req-cd817cb5-f1e9-43d8-bc20-32534075dbd9 req-a10ebbbf-21b5-4d84-a61d-5a80a4fcbfdd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.083 182939 DEBUG nova.compute.manager [req-cd817cb5-f1e9-43d8-bc20-32534075dbd9 req-a10ebbbf-21b5-4d84-a61d-5a80a4fcbfdd 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Processing event network-vif-plugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.115 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e9347f7a-2c79-4d46-a86e-52ddee11de94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.116 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c32b591-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.116 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.116 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c32b591-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.116 182939 DEBUG nova.compute.manager [req-3f8203b4-b81e-4240-b5e2-8f6971066da2 req-04d14256-84b2-420c-835e-1bbab03ea31a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-plugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.117 182939 DEBUG oslo_concurrency.lockutils [req-3f8203b4-b81e-4240-b5e2-8f6971066da2 req-04d14256-84b2-420c-835e-1bbab03ea31a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.117 182939 DEBUG oslo_concurrency.lockutils [req-3f8203b4-b81e-4240-b5e2-8f6971066da2 req-04d14256-84b2-420c-835e-1bbab03ea31a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.118 182939 DEBUG oslo_concurrency.lockutils [req-3f8203b4-b81e-4240-b5e2-8f6971066da2 req-04d14256-84b2-420c-835e-1bbab03ea31a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.118 182939 DEBUG nova.compute.manager [req-3f8203b4-b81e-4240-b5e2-8f6971066da2 req-04d14256-84b2-420c-835e-1bbab03ea31a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Processing event network-vif-plugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:33:33 compute-0 kernel: tap2c32b591-b0: entered promiscuous mode
Jan 22 00:33:33 compute-0 NetworkManager[55139]: <info>  [1769042013.1190] manager: (tap2c32b591-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.118 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.126 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2c32b591-b0, col_values=(('external_ids', {'iface-id': 'e38b1907-f53c-4547-8cf1-c0fe946413c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.127 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:33 compute-0 ovn_controller[95047]: 2026-01-22T00:33:33Z|00696|binding|INFO|Releasing lport e38b1907-f53c-4547-8cf1-c0fe946413c0 from this chassis (sb_readonly=0)
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.128 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.141 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2c32b591-bafa-4089-9793-ef7884c86bda.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2c32b591-bafa-4089-9793-ef7884c86bda.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.142 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.142 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7282552c-4c15-41a4-9415-211ab843d11d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.143 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-2c32b591-bafa-4089-9793-ef7884c86bda
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/2c32b591-bafa-4089-9793-ef7884c86bda.pid.haproxy
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 2c32b591-bafa-4089-9793-ef7884c86bda
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.143 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda', 'env', 'PROCESS_TAG=haproxy-2c32b591-bafa-4089-9793-ef7884c86bda', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2c32b591-bafa-4089-9793-ef7884c86bda.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.360 182939 DEBUG nova.network.neutron [req-d95b0a88-637f-48f6-8b99-2ffd61410ff9 req-2d662cdc-8b03-4bd8-8c24-aff5c1c7fe9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updated VIF entry in instance network info cache for port 0091268d-dee8-4a48-8f05-e20c0db2ec29. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.360 182939 DEBUG nova.network.neutron [req-d95b0a88-637f-48f6-8b99-2ffd61410ff9 req-2d662cdc-8b03-4bd8-8c24-aff5c1c7fe9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updating instance_info_cache with network_info: [{"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.377 182939 DEBUG oslo_concurrency.lockutils [req-d95b0a88-637f-48f6-8b99-2ffd61410ff9 req-2d662cdc-8b03-4bd8-8c24-aff5c1c7fe9e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.475 182939 DEBUG nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.477 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042013.4750917, 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.477 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] VM Started (Lifecycle Event)
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.481 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.489 182939 INFO nova.virt.libvirt.driver [-] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Instance spawned successfully.
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.491 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.506 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.510 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.521 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.522 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.523 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:33:33 compute-0 podman[242578]: 2026-01-22 00:33:33.524312501 +0000 UTC m=+0.062375951 container create f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.524 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.525 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.525 182939 DEBUG nova.virt.libvirt.driver [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.530 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.530 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042013.475303, 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.531 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] VM Paused (Lifecycle Event)
Jan 22 00:33:33 compute-0 systemd[1]: Started libpod-conmon-f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113.scope.
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.575 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.579 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042013.4790306, 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.579 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] VM Resumed (Lifecycle Event)
Jan 22 00:33:33 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:33:33 compute-0 podman[242578]: 2026-01-22 00:33:33.498356797 +0000 UTC m=+0.036420267 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:33:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c501dfbce809a9358f6dbf042bb227a8a16fea2b1d416c08317ce668d72e259/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:33:33 compute-0 podman[242578]: 2026-01-22 00:33:33.601769634 +0000 UTC m=+0.139833114 container init f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.603 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.606 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:33:33 compute-0 podman[242578]: 2026-01-22 00:33:33.607690807 +0000 UTC m=+0.145754257 container start f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.613 182939 INFO nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Took 9.29 seconds to spawn the instance on the hypervisor.
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.614 182939 DEBUG nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.630 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:33:33 compute-0 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[242594]: [NOTICE]   (242598) : New worker (242600) forked
Jan 22 00:33:33 compute-0 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[242594]: [NOTICE]   (242598) : Loading success.
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.664 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 0091268d-dee8-4a48-8f05-e20c0db2ec29 in datapath ac047d42-8ff5-4760-85b5-73b5e4be7fc9 unbound from our chassis
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.666 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac047d42-8ff5-4760-85b5-73b5e4be7fc9
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.701 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[22ca7f3d-d8ab-483b-b897-0056c78bd764]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.703 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapac047d42-81 in ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.704 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapac047d42-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.705 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fc436518-392d-4357-ae31-65c0afdc9fd9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.706 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[91a011f0-98dd-4f3b-9480-d359c8d6aadd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.718 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[be3b7e29-4eab-40cc-84da-73aaaec00f49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.719 182939 INFO nova.compute.manager [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Took 9.94 seconds to build instance.
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.733 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[37685cd3-bb5d-4810-99ad-89bb057d744c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.746 182939 DEBUG oslo_concurrency.lockutils [None req-d4cdbcff-db67-495e-b1bd-2e45316077a6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.763 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f87b5d06-132d-4f3d-9d6f-0e5c5c8babab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 systemd-udevd[242529]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:33:33 compute-0 NetworkManager[55139]: <info>  [1769042013.7707] manager: (tapac047d42-80): new Veth device (/org/freedesktop/NetworkManager/Devices/341)
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.773 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[559929dd-3e20-43d7-9356-54fb90f17362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.806 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[15e0fde4-a762-4b36-85ac-a8f9b359a40f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.809 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7d7d27-9328-43d5-ac82-8ade3776de81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 NetworkManager[55139]: <info>  [1769042013.8378] device (tapac047d42-80): carrier: link connected
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.846 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[6a255a58-74d2-4b4e-9b4e-e88f7eccfa96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.868 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cb96a50c-50d4-411b-8996-3a504356c31f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac047d42-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:b1:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651657, 'reachable_time': 34663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242619, 'error': None, 'target': 'ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.886 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d29cb28f-4202-4748-bcac-8195879e1a4d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5f:b174'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 651657, 'tstamp': 651657}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242620, 'error': None, 'target': 'ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.902 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[966d9f9e-5783-4956-86d3-90abe204ec3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac047d42-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5f:b1:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651657, 'reachable_time': 34663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242621, 'error': None, 'target': 'ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.936 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[653592eb-7efe-46eb-a8e7-632c8274446a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.971 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f153c4ed-0b3f-4ba7-ad5a-0e0fc142c431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.973 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac047d42-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.973 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.973 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac047d42-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.975 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:33 compute-0 NetworkManager[55139]: <info>  [1769042013.9760] manager: (tapac047d42-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Jan 22 00:33:33 compute-0 kernel: tapac047d42-80: entered promiscuous mode
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.982 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac047d42-80, col_values=(('external_ids', {'iface-id': 'eacecdac-1525-4e22-9343-339f328bc180'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.983 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:33 compute-0 ovn_controller[95047]: 2026-01-22T00:33:33Z|00697|binding|INFO|Releasing lport eacecdac-1525-4e22-9343-339f328bc180 from this chassis (sb_readonly=0)
Jan 22 00:33:33 compute-0 nova_compute[182935]: 2026-01-22 00:33:33.994 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.995 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ac047d42-8ff5-4760-85b5-73b5e4be7fc9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ac047d42-8ff5-4760-85b5-73b5e4be7fc9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.996 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f0530b-39e4-4891-80bc-1d91a1f75109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.997 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-ac047d42-8ff5-4760-85b5-73b5e4be7fc9
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/ac047d42-8ff5-4760-85b5-73b5e4be7fc9.pid.haproxy
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID ac047d42-8ff5-4760-85b5-73b5e4be7fc9
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:33:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:33:33.998 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'env', 'PROCESS_TAG=haproxy-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ac047d42-8ff5-4760-85b5-73b5e4be7fc9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:33:34 compute-0 podman[242651]: 2026-01-22 00:33:34.441514552 +0000 UTC m=+0.056346156 container create 34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 22 00:33:34 compute-0 systemd[1]: Started libpod-conmon-34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98.scope.
Jan 22 00:33:34 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:33:34 compute-0 podman[242651]: 2026-01-22 00:33:34.414264127 +0000 UTC m=+0.029095691 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:33:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00dbd36dd705000553f3aec4c108d2aa8c9bda53c1948d507f0c885e51041f60/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:33:34 compute-0 podman[242651]: 2026-01-22 00:33:34.520646846 +0000 UTC m=+0.135478460 container init 34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:33:34 compute-0 podman[242651]: 2026-01-22 00:33:34.526158059 +0000 UTC m=+0.140989623 container start 34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 00:33:34 compute-0 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[242667]: [NOTICE]   (242671) : New worker (242673) forked
Jan 22 00:33:34 compute-0 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[242667]: [NOTICE]   (242671) : Loading success.
Jan 22 00:33:34 compute-0 nova_compute[182935]: 2026-01-22 00:33:34.884 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.177 182939 DEBUG nova.compute.manager [req-f7532a07-3606-4e27-b068-ccec50630848 req-465eee9f-5b15-4fd2-8bec-01a30888bd11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-plugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.178 182939 DEBUG oslo_concurrency.lockutils [req-f7532a07-3606-4e27-b068-ccec50630848 req-465eee9f-5b15-4fd2-8bec-01a30888bd11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.179 182939 DEBUG oslo_concurrency.lockutils [req-f7532a07-3606-4e27-b068-ccec50630848 req-465eee9f-5b15-4fd2-8bec-01a30888bd11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.179 182939 DEBUG oslo_concurrency.lockutils [req-f7532a07-3606-4e27-b068-ccec50630848 req-465eee9f-5b15-4fd2-8bec-01a30888bd11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.180 182939 DEBUG nova.compute.manager [req-f7532a07-3606-4e27-b068-ccec50630848 req-465eee9f-5b15-4fd2-8bec-01a30888bd11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] No waiting events found dispatching network-vif-plugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.180 182939 WARNING nova.compute.manager [req-f7532a07-3606-4e27-b068-ccec50630848 req-465eee9f-5b15-4fd2-8bec-01a30888bd11 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received unexpected event network-vif-plugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 for instance with vm_state active and task_state None.
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.289 182939 DEBUG nova.compute.manager [req-4af7fe9d-f661-4d52-a0ac-a7e5689bfd63 req-b448d19b-213a-4598-ad7d-729a6778b341 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-plugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.289 182939 DEBUG oslo_concurrency.lockutils [req-4af7fe9d-f661-4d52-a0ac-a7e5689bfd63 req-b448d19b-213a-4598-ad7d-729a6778b341 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.290 182939 DEBUG oslo_concurrency.lockutils [req-4af7fe9d-f661-4d52-a0ac-a7e5689bfd63 req-b448d19b-213a-4598-ad7d-729a6778b341 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.291 182939 DEBUG oslo_concurrency.lockutils [req-4af7fe9d-f661-4d52-a0ac-a7e5689bfd63 req-b448d19b-213a-4598-ad7d-729a6778b341 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.291 182939 DEBUG nova.compute.manager [req-4af7fe9d-f661-4d52-a0ac-a7e5689bfd63 req-b448d19b-213a-4598-ad7d-729a6778b341 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] No waiting events found dispatching network-vif-plugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:33:35 compute-0 nova_compute[182935]: 2026-01-22 00:33:35.291 182939 WARNING nova.compute.manager [req-4af7fe9d-f661-4d52-a0ac-a7e5689bfd63 req-b448d19b-213a-4598-ad7d-729a6778b341 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received unexpected event network-vif-plugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c for instance with vm_state active and task_state None.
Jan 22 00:33:37 compute-0 nova_compute[182935]: 2026-01-22 00:33:37.043 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:37 compute-0 ovn_controller[95047]: 2026-01-22T00:33:37Z|00698|binding|INFO|Releasing lport eacecdac-1525-4e22-9343-339f328bc180 from this chassis (sb_readonly=0)
Jan 22 00:33:37 compute-0 ovn_controller[95047]: 2026-01-22T00:33:37Z|00699|binding|INFO|Releasing lport e38b1907-f53c-4547-8cf1-c0fe946413c0 from this chassis (sb_readonly=0)
Jan 22 00:33:37 compute-0 nova_compute[182935]: 2026-01-22 00:33:37.968 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:37 compute-0 NetworkManager[55139]: <info>  [1769042017.9719] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/343)
Jan 22 00:33:37 compute-0 NetworkManager[55139]: <info>  [1769042017.9741] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/344)
Jan 22 00:33:38 compute-0 ovn_controller[95047]: 2026-01-22T00:33:38Z|00700|binding|INFO|Releasing lport eacecdac-1525-4e22-9343-339f328bc180 from this chassis (sb_readonly=0)
Jan 22 00:33:38 compute-0 ovn_controller[95047]: 2026-01-22T00:33:38Z|00701|binding|INFO|Releasing lport e38b1907-f53c-4547-8cf1-c0fe946413c0 from this chassis (sb_readonly=0)
Jan 22 00:33:38 compute-0 nova_compute[182935]: 2026-01-22 00:33:38.028 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:38 compute-0 nova_compute[182935]: 2026-01-22 00:33:38.042 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:38 compute-0 nova_compute[182935]: 2026-01-22 00:33:38.387 182939 DEBUG nova.compute.manager [req-eebde356-9153-44f8-b305-f4745f589d05 req-6a4cacc9-2f8c-42a1-8da0-87e0b7cf52d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-changed-6a92d070-97be-4ea4-a051-f8b9882b8b9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:33:38 compute-0 nova_compute[182935]: 2026-01-22 00:33:38.388 182939 DEBUG nova.compute.manager [req-eebde356-9153-44f8-b305-f4745f589d05 req-6a4cacc9-2f8c-42a1-8da0-87e0b7cf52d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Refreshing instance network info cache due to event network-changed-6a92d070-97be-4ea4-a051-f8b9882b8b9c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:33:38 compute-0 nova_compute[182935]: 2026-01-22 00:33:38.389 182939 DEBUG oslo_concurrency.lockutils [req-eebde356-9153-44f8-b305-f4745f589d05 req-6a4cacc9-2f8c-42a1-8da0-87e0b7cf52d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:33:38 compute-0 nova_compute[182935]: 2026-01-22 00:33:38.390 182939 DEBUG oslo_concurrency.lockutils [req-eebde356-9153-44f8-b305-f4745f589d05 req-6a4cacc9-2f8c-42a1-8da0-87e0b7cf52d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:33:38 compute-0 nova_compute[182935]: 2026-01-22 00:33:38.390 182939 DEBUG nova.network.neutron [req-eebde356-9153-44f8-b305-f4745f589d05 req-6a4cacc9-2f8c-42a1-8da0-87e0b7cf52d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Refreshing network info cache for port 6a92d070-97be-4ea4-a051-f8b9882b8b9c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:33:38 compute-0 nova_compute[182935]: 2026-01-22 00:33:38.815 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:38 compute-0 nova_compute[182935]: 2026-01-22 00:33:38.815 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:33:38 compute-0 nova_compute[182935]: 2026-01-22 00:33:38.834 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:33:39 compute-0 podman[242683]: 2026-01-22 00:33:39.682547697 +0000 UTC m=+0.054172173 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:33:39 compute-0 podman[242685]: 2026-01-22 00:33:39.696622686 +0000 UTC m=+0.054225405 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:33:39 compute-0 nova_compute[182935]: 2026-01-22 00:33:39.697 182939 DEBUG nova.network.neutron [req-eebde356-9153-44f8-b305-f4745f589d05 req-6a4cacc9-2f8c-42a1-8da0-87e0b7cf52d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updated VIF entry in instance network info cache for port 6a92d070-97be-4ea4-a051-f8b9882b8b9c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:33:39 compute-0 nova_compute[182935]: 2026-01-22 00:33:39.698 182939 DEBUG nova.network.neutron [req-eebde356-9153-44f8-b305-f4745f589d05 req-6a4cacc9-2f8c-42a1-8da0-87e0b7cf52d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updating instance_info_cache with network_info: [{"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:33:39 compute-0 nova_compute[182935]: 2026-01-22 00:33:39.718 182939 DEBUG oslo_concurrency.lockutils [req-eebde356-9153-44f8-b305-f4745f589d05 req-6a4cacc9-2f8c-42a1-8da0-87e0b7cf52d4 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:33:39 compute-0 podman[242684]: 2026-01-22 00:33:39.79904584 +0000 UTC m=+0.162367717 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 22 00:33:39 compute-0 nova_compute[182935]: 2026-01-22 00:33:39.886 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:42 compute-0 nova_compute[182935]: 2026-01-22 00:33:42.046 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:44 compute-0 nova_compute[182935]: 2026-01-22 00:33:44.926 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:46 compute-0 ovn_controller[95047]: 2026-01-22T00:33:46Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:8b:c5 10.100.0.12
Jan 22 00:33:46 compute-0 ovn_controller[95047]: 2026-01-22T00:33:46Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:8b:c5 10.100.0.12
Jan 22 00:33:47 compute-0 nova_compute[182935]: 2026-01-22 00:33:47.052 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:48 compute-0 podman[242768]: 2026-01-22 00:33:48.722717954 +0000 UTC m=+0.073880918 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 22 00:33:49 compute-0 nova_compute[182935]: 2026-01-22 00:33:49.928 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:52 compute-0 nova_compute[182935]: 2026-01-22 00:33:52.056 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:53 compute-0 nova_compute[182935]: 2026-01-22 00:33:53.813 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:33:53 compute-0 nova_compute[182935]: 2026-01-22 00:33:53.950 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:53 compute-0 nova_compute[182935]: 2026-01-22 00:33:53.950 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:53 compute-0 nova_compute[182935]: 2026-01-22 00:33:53.951 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:53 compute-0 nova_compute[182935]: 2026-01-22 00:33:53.951 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:33:54 compute-0 podman[242791]: 2026-01-22 00:33:54.057786751 +0000 UTC m=+0.060785843 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:33:54 compute-0 podman[242790]: 2026-01-22 00:33:54.078177372 +0000 UTC m=+0.075890787 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.6, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.083 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:33:54 compute-0 sshd-session[242787]: Invalid user apache from 188.166.69.60 port 38088
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.148 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.150 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.209 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:33:54 compute-0 sshd-session[242787]: Connection closed by invalid user apache 188.166.69.60 port 38088 [preauth]
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.375 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.376 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5477MB free_disk=73.08989715576172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.376 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.377 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.478 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.478 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.478 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.528 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.572 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.642 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.642 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:33:54 compute-0 nova_compute[182935]: 2026-01-22 00:33:54.963 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:57 compute-0 nova_compute[182935]: 2026-01-22 00:33:57.059 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:33:59 compute-0 nova_compute[182935]: 2026-01-22 00:33:59.965 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:01 compute-0 nova_compute[182935]: 2026-01-22 00:34:01.622 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:01 compute-0 nova_compute[182935]: 2026-01-22 00:34:01.623 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:34:02 compute-0 nova_compute[182935]: 2026-01-22 00:34:02.107 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:03.231 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:03.231 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:03.233 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:03 compute-0 nova_compute[182935]: 2026-01-22 00:34:03.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:03 compute-0 nova_compute[182935]: 2026-01-22 00:34:03.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:34:03 compute-0 nova_compute[182935]: 2026-01-22 00:34:03.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:34:04 compute-0 nova_compute[182935]: 2026-01-22 00:34:04.070 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:34:04 compute-0 nova_compute[182935]: 2026-01-22 00:34:04.071 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:34:04 compute-0 nova_compute[182935]: 2026-01-22 00:34:04.071 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:34:04 compute-0 nova_compute[182935]: 2026-01-22 00:34:04.072 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:34:04 compute-0 nova_compute[182935]: 2026-01-22 00:34:04.967 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:07 compute-0 nova_compute[182935]: 2026-01-22 00:34:07.164 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:08 compute-0 nova_compute[182935]: 2026-01-22 00:34:08.413 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updating instance_info_cache with network_info: [{"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:34:08 compute-0 nova_compute[182935]: 2026-01-22 00:34:08.448 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:34:08 compute-0 nova_compute[182935]: 2026-01-22 00:34:08.448 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:34:08 compute-0 nova_compute[182935]: 2026-01-22 00:34:08.449 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:08 compute-0 nova_compute[182935]: 2026-01-22 00:34:08.449 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:09 compute-0 nova_compute[182935]: 2026-01-22 00:34:09.969 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:10 compute-0 podman[242840]: 2026-01-22 00:34:10.682035601 +0000 UTC m=+0.050410714 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:34:10 compute-0 podman[242838]: 2026-01-22 00:34:10.688954877 +0000 UTC m=+0.065910676 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:34:10 compute-0 podman[242839]: 2026-01-22 00:34:10.745674801 +0000 UTC m=+0.119241639 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:34:10 compute-0 nova_compute[182935]: 2026-01-22 00:34:10.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:11 compute-0 nova_compute[182935]: 2026-01-22 00:34:11.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:11 compute-0 nova_compute[182935]: 2026-01-22 00:34:11.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:11 compute-0 nova_compute[182935]: 2026-01-22 00:34:11.962 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:11.963 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:34:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:11.964 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:34:12 compute-0 nova_compute[182935]: 2026-01-22 00:34:12.167 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:13 compute-0 nova_compute[182935]: 2026-01-22 00:34:13.787 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:14 compute-0 nova_compute[182935]: 2026-01-22 00:34:14.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:14 compute-0 nova_compute[182935]: 2026-01-22 00:34:14.970 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:17 compute-0 nova_compute[182935]: 2026-01-22 00:34:17.169 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:18 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:18.965 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:19 compute-0 podman[242907]: 2026-01-22 00:34:19.692670367 +0000 UTC m=+0.062477793 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 00:34:19 compute-0 nova_compute[182935]: 2026-01-22 00:34:19.971 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:22 compute-0 nova_compute[182935]: 2026-01-22 00:34:22.172 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.323 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'name': 'tempest-TestGettingAddress-server-1289953885', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000ab', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '837db8748d074b3c9179b47d30e7a1d4', 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'hostId': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.324 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.328 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 / tap6a92d070-97 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.329 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 / tap0091268d-de inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.329 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.330 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01a103a6-8a0b-4ced-943f-33e75ef10ba3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap6a92d070-97', 'timestamp': '2026-01-22T00:34:23.325168', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap6a92d070-97', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a92d070-97'}, 'message_id': '1947a644-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': '6af24409938d9b94c7a25ff7e98f39d5b85057a61d7fb251acf1fa4fb789b895'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap0091268d-de', 'timestamp': '2026-01-22T00:34:23.325168', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap0091268d-de', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5c:f9:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0091268d-de'}, 'message_id': '1947b95e-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': 'b5416cd8d52d50e497dc72bf7155ce10e43e2f32c9657384ba12986e51067927'}]}, 'timestamp': '2026-01-22 00:34:23.330523', '_unique_id': 'ae001bb29c6a4132a1a349f45fac8ec5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.333 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.334 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.359 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.write.requests volume: 316 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.359 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f103144d-c6d4-4218-b650-3aaaeb2fe32c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 316, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-vda', 'timestamp': '2026-01-22T00:34:23.334151', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '194c3150-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': '367d544f375b85f35c6e05e12981586d8aeeda0fcbe03a87f83b882cd701bf1a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-sda', 'timestamp': '2026-01-22T00:34:23.334151', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '194c3e3e-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': '568aa8fef41287b6104c1c74ff49fd12c750d20a62e6f40e3d79426eec163d48'}]}, 'timestamp': '2026-01-22 00:34:23.360073', '_unique_id': '2659c11d836a48d789662ec5f10fce44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.361 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.362 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.362 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.362 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df5a2f32-e24d-491d-9545-c80848471839', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-vda', 'timestamp': '2026-01-22T00:34:23.362245', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '194c9ece-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': '2872bbfaa33c2b1df7419d52b75fd2278728d75e7a7f0594d9c1f06195f52135'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-sda', 'timestamp': '2026-01-22T00:34:23.362245', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '194caa40-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': '8e410dcd0c7447a31c96d5f870e3b7d5fd2a58b68c6b58727d66c9b27948f059'}]}, 'timestamp': '2026-01-22 00:34:23.362876', '_unique_id': '1c8476b579634cdfac0590c2663b9bad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.363 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.364 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.364 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.364 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '206a2d30-973c-4190-bbb6-63ba2e466ab9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap6a92d070-97', 'timestamp': '2026-01-22T00:34:23.364229', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap6a92d070-97', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a92d070-97'}, 'message_id': '194ceb36-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': '9b01c004796c3fb853895cdb26229d2a08c5a0ef5da5288f577c6b86446bb763'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap0091268d-de', 'timestamp': '2026-01-22T00:34:23.364229', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap0091268d-de', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5c:f9:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0091268d-de'}, 'message_id': '194cf518-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': 'a72c6824b18636bbae6896e898e175db517710515bdb3841c52dab8e8c4a1d33'}]}, 'timestamp': '2026-01-22 00:34:23.364758', '_unique_id': 'c6b080dd41214dcb89a0140c6930aad6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.365 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.366 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.366 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.366 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.incoming.bytes volume: 740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '261f9bc0-992e-4beb-94dc-fc7dc0cbac99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap6a92d070-97', 'timestamp': '2026-01-22T00:34:23.366237', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap6a92d070-97', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a92d070-97'}, 'message_id': '194d3a00-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': '673d426f51abb8abb02f1ea91174d70f870b28566ecacd8071fd42eb795a6858'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 740, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap0091268d-de', 'timestamp': '2026-01-22T00:34:23.366237', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap0091268d-de', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5c:f9:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0091268d-de'}, 'message_id': '194d4536-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': '53d481b6ab9ea2dec908ffce3b62d31ca3a9e88278a9064c02d2a543659559e5'}]}, 'timestamp': '2026-01-22 00:34:23.366867', '_unique_id': 'bcef2d81b0fa42ebbd5e371afc34e583'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.367 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.368 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.381 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/memory.usage volume: 43.80078125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bcc3cbe-f44c-47f8-96b0-14150448e3c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.80078125, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'timestamp': '2026-01-22T00:34:23.368463', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '194f863e-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.175271438, 'message_signature': '86aa47b635ec8474c22f760134f8a5dd276639a2b4910bf52bbc3d34749f3a45'}]}, 'timestamp': '2026-01-22 00:34:23.381633', '_unique_id': '889b60854d1540ef9b7d287877c81e25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.382 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.383 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.383 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.383 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be90f180-fb10-45ec-889d-05854fad078c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap6a92d070-97', 'timestamp': '2026-01-22T00:34:23.383231', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap6a92d070-97', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a92d070-97'}, 'message_id': '194fd76a-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': 'a9b02b937946d5b64fba65ad87b4682ae3717b20060fca81466c9400ef32230e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap0091268d-de', 'timestamp': '2026-01-22T00:34:23.383231', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap0091268d-de', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5c:f9:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0091268d-de'}, 'message_id': '194fe03e-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': 'c39c19653975591805801e2a9488d19ea93662d705b069c5aa95aafab81026e5'}]}, 'timestamp': '2026-01-22 00:34:23.383883', '_unique_id': 'f2e6ac63fc2b4ee9a8a6582e0be46985'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.384 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.385 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.385 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '084e377a-e454-4a31-953d-0740d5a445bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap6a92d070-97', 'timestamp': '2026-01-22T00:34:23.385035', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap6a92d070-97', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a92d070-97'}, 'message_id': '19501694-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': 'f542b090ddd543ce78b86e7f59cb01942dd337ae6846cf22b9c36c5f7572170e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap0091268d-de', 'timestamp': '2026-01-22T00:34:23.385035', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap0091268d-de', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5c:f9:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0091268d-de'}, 'message_id': '19501ed2-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': '1a9825ad1a77d5f9b7035ba2562e41c1b3df09a9eaea3b66439feb65619fe76f'}]}, 'timestamp': '2026-01-22 00:34:23.385474', '_unique_id': '5c60cd556c8a44478ed05fc745649ebd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.386 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.387 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.outgoing.bytes volume: 2618 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3523093e-63c6-4ab3-ad71-f329121b49b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap6a92d070-97', 'timestamp': '2026-01-22T00:34:23.386912', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap6a92d070-97', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a92d070-97'}, 'message_id': '195060fe-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': 'd6e6aa39e8b85b9ca7669d4a19a72a6badcc5258ed2579ac4cf8eb086d0ebb79'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2618, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap0091268d-de', 'timestamp': '2026-01-22T00:34:23.386912', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap0091268d-de', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5c:f9:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0091268d-de'}, 'message_id': '195069dc-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': '2c65b63841a064f5335b107e9dd37402f44bd7ff6a23923ce4d8505db24c4725'}]}, 'timestamp': '2026-01-22 00:34:23.387423', '_unique_id': '56d5c83047b544d89ea6b666e2a21eef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.388 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.389 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.397 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.398 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd83207fd-412d-4ad9-9ca3-7c680f9fe255', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-vda', 'timestamp': '2026-01-22T00:34:23.389095', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '195206d4-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.183360552, 'message_signature': 'ab35c0374bb38f9809bd0e6f63350db5bd1fb5a1256183e4c8790454e41c0028'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-sda', 'timestamp': '2026-01-22T00:34:23.389095', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '19521110-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.183360552, 'message_signature': '0254a5dfc3f3c37f083f4b69761d20def2f99564fce27fc1c5c684c80817f2a1'}]}, 'timestamp': '2026-01-22 00:34:23.398225', '_unique_id': '1ced7454e0834d64bf54e134b5715d1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.399 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5be54819-131d-4f28-8d76-77855a953165', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap6a92d070-97', 'timestamp': '2026-01-22T00:34:23.399788', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap6a92d070-97', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a92d070-97'}, 'message_id': '1952581e-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': '94604479242b4f4a21fd098aeab568b030f398a5706aaead0f5d123bed98247e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap0091268d-de', 'timestamp': '2026-01-22T00:34:23.399788', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap0091268d-de', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5c:f9:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0091268d-de'}, 'message_id': '1952608e-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': 'e0576339ec8b415179d4b3bc3a59f8af40124433d6f653ad7056435e1dd1a0a0'}]}, 'timestamp': '2026-01-22 00:34:23.400324', '_unique_id': '831be1e3904c4fff900211231f01ae1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.400 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.401 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.401 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.402 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72b225ca-1174-4131-bff4-e36220775cf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap6a92d070-97', 'timestamp': '2026-01-22T00:34:23.401775', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap6a92d070-97', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a92d070-97'}, 'message_id': '1952a6e8-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': 'b3156682c49d55630c7ed1ff2798495d808291f885751caebc684be1d3484d31'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap0091268d-de', 'timestamp': '2026-01-22T00:34:23.401775', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap0091268d-de', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5c:f9:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0091268d-de'}, 'message_id': '1952b250-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': '0bf4e62bd8fb2cadb749aa3736c7a0f5d97b7f49abecf1acfc7bfbde136a88c6'}]}, 'timestamp': '2026-01-22 00:34:23.402391', '_unique_id': '96ebe792a41a4b15b124daa7fc9c3ede'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.403 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.read.bytes volume: 30530048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.404 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '750234d8-6fa4-4efa-9995-8bdf4bcaf793', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30530048, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-vda', 'timestamp': '2026-01-22T00:34:23.403843', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1952f6a2-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': '1bdc438687893f5a35027908062bdac0fbb140c46434e97b3749b3bc824c5ee9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-sda', 'timestamp': '2026-01-22T00:34:23.403843', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '195301ba-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': '7455ee6e1e72c0f1203091d7fed88ff0f7b2ef0dec4def6c97e6edcce5821601'}]}, 'timestamp': '2026-01-22 00:34:23.404412', '_unique_id': 'a6963527286b4dde858c81394ae61fa2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.405 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1289953885>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1289953885>]
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.406 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.406 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.406 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1289953885>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1289953885>]
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.406 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.406 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.406 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d87bf3e-5403-412b-ac42-a4f3250b9d31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-vda', 'timestamp': '2026-01-22T00:34:23.406498', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '19535cd2-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.183360552, 'message_signature': 'd193b2be749263365bb701e0284461d37d32c15d883a646dbd9fe5d260644887'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-sda', 'timestamp': '2026-01-22T00:34:23.406498', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '19536538-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.183360552, 'message_signature': 'd7ec07df4ebc06bb6c02498bdd0fb8a8c76a82d941cdcd6dd351250ed6518ae8'}]}, 'timestamp': '2026-01-22 00:34:23.406924', '_unique_id': '7850651ee18342deb1f18f6b2214d84d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.407 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.outgoing.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc76e1cd-3623-49ca-ae7f-78637896d62a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap6a92d070-97', 'timestamp': '2026-01-22T00:34:23.408018', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap6a92d070-97', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a92d070-97'}, 'message_id': '19539878-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': 'eb1d4c9f484b67620d6b16891becdd13f7bdb09d3beb66e3c677274824dc1284'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap0091268d-de', 'timestamp': '2026-01-22T00:34:23.408018', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap0091268d-de', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5c:f9:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0091268d-de'}, 'message_id': '1953a16a-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': 'cf9d0ba751df53589a14749e90cd9591888983d311bffc59f4a80195ed341e77'}]}, 'timestamp': '2026-01-22 00:34:23.408469', '_unique_id': 'e02fc97459b64859b5c3ff0f331fe0d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.409 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.409 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.409 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1289953885>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1289953885>]
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.409 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.409 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/cpu volume: 12000000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c00ee092-4b0d-4689-b56c-d1be679c5213', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12000000000, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'timestamp': '2026-01-22T00:34:23.409897', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '1953e24c-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.175271438, 'message_signature': '7aeab3d62cc0efb6f0957abba99e0ad58eb35404332ed149c39f38c7a67b679a'}]}, 'timestamp': '2026-01-22 00:34:23.410135', '_unique_id': '43b219953dac433da482577bb2bce8df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.410 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.411 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.411 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.411 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1289953885>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1289953885>]
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.411 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.411 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.412 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ceaa7975-ae58-4451-aa0b-f5eb57860668', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap6a92d070-97', 'timestamp': '2026-01-22T00:34:23.411784', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap6a92d070-97', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:95:8b:c5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6a92d070-97'}, 'message_id': '19542e0a-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': '56c7d40678b95d6bf339485bb1b58dba102f52d19034105ffd37b06287854086'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000ab-7a9023a9-d882-4cdd-aa9e-ee59c7795d38-tap0091268d-de', 'timestamp': '2026-01-22T00:34:23.411784', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'tap0091268d-de', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5c:f9:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0091268d-de'}, 'message_id': '19543a94-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.119451856, 'message_signature': '3dbf08d6fd6ee5de5e950a295251279ba5d0c732609d3d14ca648559b9161027'}]}, 'timestamp': '2026-01-22 00:34:23.412417', '_unique_id': '61705f58c2a3462dbd524e61943a5313'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.413 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.read.requests volume: 1098 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d40a3a8-7ef4-4334-a01f-cfa448d67e65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1098, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-vda', 'timestamp': '2026-01-22T00:34:23.413747', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1954793c-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': 'e0e4c36ba72691a387a8b81049cd311203ea86c2dca308543bb83a1f4cd894c4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': 
None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-sda', 'timestamp': '2026-01-22T00:34:23.413747', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '19548134-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': 'b97a3d6b80ebf6f4b5ed9e645fb1a4c678e29c8eaeecb0513cd5638389f63a14'}]}, 'timestamp': '2026-01-22 00:34:23.414194', '_unique_id': '213060e24d1f4e12bf79520ff1d752c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.415 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.415 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.write.latency volume: 2677252128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.415 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86aa01c5-72ad-4522-a3c4-800b9c368e58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2677252128, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-vda', 'timestamp': '2026-01-22T00:34:23.415330', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1954b712-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': 'f285d7e808aa6020e2283dc8207ae2f88e3d80edb739cbc1f0f3e976ed4381f2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-sda', 'timestamp': '2026-01-22T00:34:23.415330', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1954c108-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': '3f448396d028dec90c342f008b09ab504a4b37ded085006e2ad7256eb1f994de'}]}, 'timestamp': '2026-01-22 00:34:23.415858', '_unique_id': '10fb3b6ea22b414db93493dd4ea46686'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.416 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.417 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.417 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.417 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ede9af8-2638-44ea-bf28-2f95ebef82ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-vda', 'timestamp': '2026-01-22T00:34:23.417120', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1954fc04-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.183360552, 'message_signature': '2251148bb2263ebe47a18d67e1557fc6c7b46376bb1d8d1ea61ea7cd1ef3eaf8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'7a9023a9-d882-4cdd-aa9e-ee59c7795d38-sda', 'timestamp': '2026-01-22T00:34:23.417120', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '195503a2-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.183360552, 'message_signature': 'f826845980ea273ebf48303d9c1f3cdf46e088f656c23fa5931eb3de45ddaf4d'}]}, 'timestamp': '2026-01-22 00:34:23.417530', '_unique_id': '3ca97ca083e94df3a35505176d1e4148'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.read.latency volume: 164374722 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.418 12 DEBUG ceilometer.compute.pollsters [-] 7a9023a9-d882-4cdd-aa9e-ee59c7795d38/disk.device.read.latency volume: 18766151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '399e9feb-a52e-4fcb-89c1-906afe3b51f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 164374722, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-vda', 'timestamp': '2026-01-22T00:34:23.418652', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '195537be-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': '6b435ff3660d67f98d4cb27e310ba9c84c91fd583bd3c21b5b839650e03b1727'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18766151, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38-sda', 'timestamp': '2026-01-22T00:34:23.418652', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1289953885', 'name': 'instance-000000ab', 'instance_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '19554344-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6566.128429191, 'message_signature': 'da327180a9bc1d851836ba7bbe06db36b358f6ee7059d1ba0ada2bcf7086bbbf'}]}, 'timestamp': '2026-01-22 00:34:23.419197', '_unique_id': '2df46695cff14b9a835987fac05536d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:34:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:34:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:34:23 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 22 00:34:24 compute-0 podman[242928]: 2026-01-22 00:34:24.685175253 +0000 UTC m=+0.058478297 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Jan 22 00:34:24 compute-0 podman[242929]: 2026-01-22 00:34:24.689099708 +0000 UTC m=+0.058058108 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:34:25 compute-0 nova_compute[182935]: 2026-01-22 00:34:25.012 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:27 compute-0 nova_compute[182935]: 2026-01-22 00:34:27.174 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:30 compute-0 nova_compute[182935]: 2026-01-22 00:34:30.057 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:32 compute-0 nova_compute[182935]: 2026-01-22 00:34:32.208 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:35 compute-0 nova_compute[182935]: 2026-01-22 00:34:35.100 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:35 compute-0 sshd-session[242967]: Invalid user apache from 188.166.69.60 port 50694
Jan 22 00:34:35 compute-0 sshd-session[242967]: Connection closed by invalid user apache 188.166.69.60 port 50694 [preauth]
Jan 22 00:34:37 compute-0 nova_compute[182935]: 2026-01-22 00:34:37.210 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:40 compute-0 nova_compute[182935]: 2026-01-22 00:34:40.107 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:41 compute-0 podman[242969]: 2026-01-22 00:34:41.699705773 +0000 UTC m=+0.061669185 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:34:41 compute-0 podman[242971]: 2026-01-22 00:34:41.727967202 +0000 UTC m=+0.079623486 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:34:41 compute-0 podman[242970]: 2026-01-22 00:34:41.743377053 +0000 UTC m=+0.109399262 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:34:42 compute-0 nova_compute[182935]: 2026-01-22 00:34:42.212 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:45 compute-0 nova_compute[182935]: 2026-01-22 00:34:45.110 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.798 182939 DEBUG nova.compute.manager [req-0ecb878e-31df-4178-8360-6a4e1427f282 req-124113cc-de3f-4876-8fdc-f946cacd844b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-changed-6a92d070-97be-4ea4-a051-f8b9882b8b9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.798 182939 DEBUG nova.compute.manager [req-0ecb878e-31df-4178-8360-6a4e1427f282 req-124113cc-de3f-4876-8fdc-f946cacd844b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Refreshing instance network info cache due to event network-changed-6a92d070-97be-4ea4-a051-f8b9882b8b9c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.799 182939 DEBUG oslo_concurrency.lockutils [req-0ecb878e-31df-4178-8360-6a4e1427f282 req-124113cc-de3f-4876-8fdc-f946cacd844b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.799 182939 DEBUG oslo_concurrency.lockutils [req-0ecb878e-31df-4178-8360-6a4e1427f282 req-124113cc-de3f-4876-8fdc-f946cacd844b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.800 182939 DEBUG nova.network.neutron [req-0ecb878e-31df-4178-8360-6a4e1427f282 req-124113cc-de3f-4876-8fdc-f946cacd844b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Refreshing network info cache for port 6a92d070-97be-4ea4-a051-f8b9882b8b9c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.891 182939 DEBUG oslo_concurrency.lockutils [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.892 182939 DEBUG oslo_concurrency.lockutils [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.892 182939 DEBUG oslo_concurrency.lockutils [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.892 182939 DEBUG oslo_concurrency.lockutils [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.893 182939 DEBUG oslo_concurrency.lockutils [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.908 182939 INFO nova.compute.manager [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Terminating instance
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.923 182939 DEBUG nova.compute.manager [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:34:46 compute-0 kernel: tap6a92d070-97 (unregistering): left promiscuous mode
Jan 22 00:34:46 compute-0 NetworkManager[55139]: <info>  [1769042086.9505] device (tap6a92d070-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.962 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:46 compute-0 ovn_controller[95047]: 2026-01-22T00:34:46Z|00702|binding|INFO|Releasing lport 6a92d070-97be-4ea4-a051-f8b9882b8b9c from this chassis (sb_readonly=0)
Jan 22 00:34:46 compute-0 ovn_controller[95047]: 2026-01-22T00:34:46Z|00703|binding|INFO|Setting lport 6a92d070-97be-4ea4-a051-f8b9882b8b9c down in Southbound
Jan 22 00:34:46 compute-0 ovn_controller[95047]: 2026-01-22T00:34:46Z|00704|binding|INFO|Removing iface tap6a92d070-97 ovn-installed in OVS
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.965 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:46.972 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:8b:c5 10.100.0.12'], port_security=['fa:16:3e:95:8b:c5 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c32b591-bafa-4089-9793-ef7884c86bda', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e757454-ac82-49d6-9905-0e379d7d274f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e2ff2e46-2af8-49fc-9f01-6a639111eeb4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=6a92d070-97be-4ea4-a051-f8b9882b8b9c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:34:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:46.973 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 6a92d070-97be-4ea4-a051-f8b9882b8b9c in datapath 2c32b591-bafa-4089-9793-ef7884c86bda unbound from our chassis
Jan 22 00:34:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:46.974 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2c32b591-bafa-4089-9793-ef7884c86bda, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:34:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:46.975 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4c507d91-ca9b-4bc5-804d-5874635936bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:46.976 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda namespace which is not needed anymore
Jan 22 00:34:46 compute-0 nova_compute[182935]: 2026-01-22 00:34:46.977 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 kernel: tap0091268d-de (unregistering): left promiscuous mode
Jan 22 00:34:47 compute-0 NetworkManager[55139]: <info>  [1769042087.0052] device (tap0091268d-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.012 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 ovn_controller[95047]: 2026-01-22T00:34:47Z|00705|binding|INFO|Releasing lport 0091268d-dee8-4a48-8f05-e20c0db2ec29 from this chassis (sb_readonly=0)
Jan 22 00:34:47 compute-0 ovn_controller[95047]: 2026-01-22T00:34:47Z|00706|binding|INFO|Setting lport 0091268d-dee8-4a48-8f05-e20c0db2ec29 down in Southbound
Jan 22 00:34:47 compute-0 ovn_controller[95047]: 2026-01-22T00:34:47Z|00707|binding|INFO|Removing iface tap0091268d-de ovn-installed in OVS
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.014 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.021 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5c:f9:6c 2001:db8::f816:3eff:fe5c:f96c'], port_security=['fa:16:3e:5c:f9:6c 2001:db8::f816:3eff:fe5c:f96c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe5c:f96c/64', 'neutron:device_id': '7a9023a9-d882-4cdd-aa9e-ee59c7795d38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2e757454-ac82-49d6-9905-0e379d7d274f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27a761c8-dbca-47a8-b596-d7db8b087bd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=0091268d-dee8-4a48-8f05-e20c0db2ec29) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.031 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000ab.scope: Deactivated successfully.
Jan 22 00:34:47 compute-0 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000ab.scope: Consumed 15.964s CPU time.
Jan 22 00:34:47 compute-0 systemd-machined[154182]: Machine qemu-89-instance-000000ab terminated.
Jan 22 00:34:47 compute-0 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[242594]: [NOTICE]   (242598) : haproxy version is 2.8.14-c23fe91
Jan 22 00:34:47 compute-0 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[242594]: [NOTICE]   (242598) : path to executable is /usr/sbin/haproxy
Jan 22 00:34:47 compute-0 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[242594]: [WARNING]  (242598) : Exiting Master process...
Jan 22 00:34:47 compute-0 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[242594]: [WARNING]  (242598) : Exiting Master process...
Jan 22 00:34:47 compute-0 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[242594]: [ALERT]    (242598) : Current worker (242600) exited with code 143 (Terminated)
Jan 22 00:34:47 compute-0 neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda[242594]: [WARNING]  (242598) : All workers exited. Exiting... (0)
Jan 22 00:34:47 compute-0 systemd[1]: libpod-f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113.scope: Deactivated successfully.
Jan 22 00:34:47 compute-0 podman[243071]: 2026-01-22 00:34:47.127211892 +0000 UTC m=+0.053668061 container died f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:34:47 compute-0 NetworkManager[55139]: <info>  [1769042087.1485] manager: (tap6a92d070-97): new Tun device (/org/freedesktop/NetworkManager/Devices/345)
Jan 22 00:34:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113-userdata-shm.mount: Deactivated successfully.
Jan 22 00:34:47 compute-0 NetworkManager[55139]: <info>  [1769042087.1794] manager: (tap0091268d-de): new Tun device (/org/freedesktop/NetworkManager/Devices/346)
Jan 22 00:34:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-7c501dfbce809a9358f6dbf042bb227a8a16fea2b1d416c08317ce668d72e259-merged.mount: Deactivated successfully.
Jan 22 00:34:47 compute-0 podman[243071]: 2026-01-22 00:34:47.186478768 +0000 UTC m=+0.112934927 container cleanup f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 00:34:47 compute-0 systemd[1]: libpod-conmon-f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113.scope: Deactivated successfully.
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.213 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.227 182939 INFO nova.virt.libvirt.driver [-] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Instance destroyed successfully.
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.227 182939 DEBUG nova.objects.instance [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.249 182939 DEBUG nova.virt.libvirt.vif [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:33:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1289953885',display_name='tempest-TestGettingAddress-server-1289953885',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1289953885',id=171,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:33:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-pn756rnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:33:33Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=7a9023a9-d882-4cdd-aa9e-ee59c7795d38,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.250 182939 DEBUG nova.network.os_vif_util [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.251 182939 DEBUG nova.network.os_vif_util [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:8b:c5,bridge_name='br-int',has_traffic_filtering=True,id=6a92d070-97be-4ea4-a051-f8b9882b8b9c,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a92d070-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.251 182939 DEBUG os_vif [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:8b:c5,bridge_name='br-int',has_traffic_filtering=True,id=6a92d070-97be-4ea4-a051-f8b9882b8b9c,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a92d070-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.253 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.254 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a92d070-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.255 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.258 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.260 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 podman[243121]: 2026-01-22 00:34:47.260606712 +0000 UTC m=+0.046669345 container remove f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.262 182939 INFO os_vif [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:8b:c5,bridge_name='br-int',has_traffic_filtering=True,id=6a92d070-97be-4ea4-a051-f8b9882b8b9c,network=Network(2c32b591-bafa-4089-9793-ef7884c86bda),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6a92d070-97')
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.263 182939 DEBUG nova.virt.libvirt.vif [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:33:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1289953885',display_name='tempest-TestGettingAddress-server-1289953885',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1289953885',id=171,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBCnBNde9LnK+U+D5ENHV5Bhm30z07BTZ3lIWHvMukOxK8Jf8o76X7/cU7jqimLz8JbkUji6/91McZm7z1O1Yc3tDz3ZuSaz3KrVuj+NWjZEoVIpu6UJrwRH+k4I2kfmaw==',key_name='tempest-TestGettingAddress-1562921949',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:33:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-pn756rnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:33:33Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=7a9023a9-d882-4cdd-aa9e-ee59c7795d38,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.263 182939 DEBUG nova.network.os_vif_util [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.264 182939 DEBUG nova.network.os_vif_util [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5c:f9:6c,bridge_name='br-int',has_traffic_filtering=True,id=0091268d-dee8-4a48-8f05-e20c0db2ec29,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0091268d-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.264 182939 DEBUG os_vif [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5c:f9:6c,bridge_name='br-int',has_traffic_filtering=True,id=0091268d-dee8-4a48-8f05-e20c0db2ec29,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0091268d-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.265 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.266 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0091268d-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.266 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c1c25b-95b1-4074-9c81-7a5b3248937d]: (4, ('Thu Jan 22 12:34:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda (f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113)\nf052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113\nThu Jan 22 12:34:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda (f052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113)\nf052a1810be08adb71c5fa3b2e9d4c154dfa5025a35b159512e6b1fe03ba3113\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.267 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.269 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1f911de1-8d2f-41e1-ab2c-72873438c1af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.269 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.270 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c32b591-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.271 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 kernel: tap2c32b591-b0: left promiscuous mode
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.273 182939 INFO os_vif [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5c:f9:6c,bridge_name='br-int',has_traffic_filtering=True,id=0091268d-dee8-4a48-8f05-e20c0db2ec29,network=Network(ac047d42-8ff5-4760-85b5-73b5e4be7fc9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0091268d-de')
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.275 182939 INFO nova.virt.libvirt.driver [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Deleting instance files /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38_del
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.276 182939 INFO nova.virt.libvirt.driver [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Deletion of /var/lib/nova/instances/7a9023a9-d882-4cdd-aa9e-ee59c7795d38_del complete
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.288 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.289 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.292 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[38bdde4b-42a2-417f-9535-70d5cc100bfb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.311 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f752a725-1dbf-486f-b901-d07a81a12299]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.313 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b87f3450-84a1-40cd-be3b-e38142d5ba65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.332 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2c4518ae-5fea-41e7-b09f-6f214e112ecb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651549, 'reachable_time': 42177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243141, 'error': None, 'target': 'ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 systemd[1]: run-netns-ovnmeta\x2d2c32b591\x2dbafa\x2d4089\x2d9793\x2def7884c86bda.mount: Deactivated successfully.
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.338 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2c32b591-bafa-4089-9793-ef7884c86bda deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.338 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[3e436d27-6b83-4129-af8d-373e8555de56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.340 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 0091268d-dee8-4a48-8f05-e20c0db2ec29 in datapath ac047d42-8ff5-4760-85b5-73b5e4be7fc9 unbound from our chassis
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.341 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac047d42-8ff5-4760-85b5-73b5e4be7fc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.341 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5751c022-566d-4be3-8bf7-4d4be844caee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.342 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9 namespace which is not needed anymore
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.386 182939 INFO nova.compute.manager [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Took 0.46 seconds to destroy the instance on the hypervisor.
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.387 182939 DEBUG oslo.service.loopingcall [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.387 182939 DEBUG nova.compute.manager [-] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.387 182939 DEBUG nova.network.neutron [-] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.424 182939 DEBUG nova.compute.manager [req-ec751bc0-7be4-4b82-80a8-6a9c9bd3c8b4 req-da954448-3bbe-4a57-a625-472381ed37c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-unplugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.425 182939 DEBUG oslo_concurrency.lockutils [req-ec751bc0-7be4-4b82-80a8-6a9c9bd3c8b4 req-da954448-3bbe-4a57-a625-472381ed37c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.425 182939 DEBUG oslo_concurrency.lockutils [req-ec751bc0-7be4-4b82-80a8-6a9c9bd3c8b4 req-da954448-3bbe-4a57-a625-472381ed37c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.425 182939 DEBUG oslo_concurrency.lockutils [req-ec751bc0-7be4-4b82-80a8-6a9c9bd3c8b4 req-da954448-3bbe-4a57-a625-472381ed37c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.426 182939 DEBUG nova.compute.manager [req-ec751bc0-7be4-4b82-80a8-6a9c9bd3c8b4 req-da954448-3bbe-4a57-a625-472381ed37c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] No waiting events found dispatching network-vif-unplugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.426 182939 DEBUG nova.compute.manager [req-ec751bc0-7be4-4b82-80a8-6a9c9bd3c8b4 req-da954448-3bbe-4a57-a625-472381ed37c3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-unplugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:34:47 compute-0 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[242667]: [NOTICE]   (242671) : haproxy version is 2.8.14-c23fe91
Jan 22 00:34:47 compute-0 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[242667]: [NOTICE]   (242671) : path to executable is /usr/sbin/haproxy
Jan 22 00:34:47 compute-0 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[242667]: [WARNING]  (242671) : Exiting Master process...
Jan 22 00:34:47 compute-0 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[242667]: [ALERT]    (242671) : Current worker (242673) exited with code 143 (Terminated)
Jan 22 00:34:47 compute-0 neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9[242667]: [WARNING]  (242671) : All workers exited. Exiting... (0)
Jan 22 00:34:47 compute-0 systemd[1]: libpod-34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98.scope: Deactivated successfully.
Jan 22 00:34:47 compute-0 podman[243160]: 2026-01-22 00:34:47.50581826 +0000 UTC m=+0.049581485 container died 34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:34:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98-userdata-shm.mount: Deactivated successfully.
Jan 22 00:34:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-00dbd36dd705000553f3aec4c108d2aa8c9bda53c1948d507f0c885e51041f60-merged.mount: Deactivated successfully.
Jan 22 00:34:47 compute-0 podman[243160]: 2026-01-22 00:34:47.563474816 +0000 UTC m=+0.107238051 container cleanup 34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 00:34:47 compute-0 systemd[1]: libpod-conmon-34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98.scope: Deactivated successfully.
Jan 22 00:34:47 compute-0 podman[243191]: 2026-01-22 00:34:47.624629487 +0000 UTC m=+0.041527800 container remove 34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.630 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[574f6d8c-3707-4682-8914-8aaaf50784c3]: (4, ('Thu Jan 22 12:34:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9 (34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98)\n34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98\nThu Jan 22 12:34:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9 (34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98)\n34fcf5da0c7d3a0d12e2880b42c02c0898d3d642a5e427ada924f02ed2000f98\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.632 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6ecc7039-7287-49ba-a9b7-a5dc7276bad5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.633 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac047d42-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.636 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 kernel: tapac047d42-80: left promiscuous mode
Jan 22 00:34:47 compute-0 nova_compute[182935]: 2026-01-22 00:34:47.647 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.650 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[84c8bbc6-1a62-42fd-b645-b34f584f80b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.666 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[81ad965d-e19a-4e0d-bc7b-c31a927edc69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.667 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5c3034-a61c-43a6-a0f2-83484b6cc38f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.690 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dc88337a-78bc-459b-a108-e1ece7da4665]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651649, 'reachable_time': 19426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243206, 'error': None, 'target': 'ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.693 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ac047d42-8ff5-4760-85b5-73b5e4be7fc9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:34:47 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:34:47.693 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[c61a9438-7673-4e52-86fe-ea3f7d813e48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:34:48 compute-0 systemd[1]: run-netns-ovnmeta\x2dac047d42\x2d8ff5\x2d4760\x2d85b5\x2d73b5e4be7fc9.mount: Deactivated successfully.
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.928 182939 DEBUG nova.compute.manager [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-unplugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.928 182939 DEBUG oslo_concurrency.lockutils [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.929 182939 DEBUG oslo_concurrency.lockutils [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.929 182939 DEBUG oslo_concurrency.lockutils [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.929 182939 DEBUG nova.compute.manager [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] No waiting events found dispatching network-vif-unplugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.929 182939 DEBUG nova.compute.manager [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-unplugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.930 182939 DEBUG nova.compute.manager [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-plugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.930 182939 DEBUG oslo_concurrency.lockutils [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.930 182939 DEBUG oslo_concurrency.lockutils [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.930 182939 DEBUG oslo_concurrency.lockutils [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.930 182939 DEBUG nova.compute.manager [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] No waiting events found dispatching network-vif-plugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:34:48 compute-0 nova_compute[182935]: 2026-01-22 00:34:48.931 182939 WARNING nova.compute.manager [req-70c8c2cd-0ebc-4898-8808-82be1744b4c7 req-481ce4fe-d4af-4ff2-967a-36a02cbab715 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received unexpected event network-vif-plugged-6a92d070-97be-4ea4-a051-f8b9882b8b9c for instance with vm_state active and task_state deleting.
Jan 22 00:34:49 compute-0 nova_compute[182935]: 2026-01-22 00:34:49.530 182939 DEBUG nova.compute.manager [req-f1ccfa7d-36bf-4729-9725-b93a0536f974 req-838e98a4-84cb-4c4c-b13a-91dc8e1df58d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-plugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:49 compute-0 nova_compute[182935]: 2026-01-22 00:34:49.530 182939 DEBUG oslo_concurrency.lockutils [req-f1ccfa7d-36bf-4729-9725-b93a0536f974 req-838e98a4-84cb-4c4c-b13a-91dc8e1df58d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:49 compute-0 nova_compute[182935]: 2026-01-22 00:34:49.530 182939 DEBUG oslo_concurrency.lockutils [req-f1ccfa7d-36bf-4729-9725-b93a0536f974 req-838e98a4-84cb-4c4c-b13a-91dc8e1df58d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:49 compute-0 nova_compute[182935]: 2026-01-22 00:34:49.531 182939 DEBUG oslo_concurrency.lockutils [req-f1ccfa7d-36bf-4729-9725-b93a0536f974 req-838e98a4-84cb-4c4c-b13a-91dc8e1df58d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:49 compute-0 nova_compute[182935]: 2026-01-22 00:34:49.531 182939 DEBUG nova.compute.manager [req-f1ccfa7d-36bf-4729-9725-b93a0536f974 req-838e98a4-84cb-4c4c-b13a-91dc8e1df58d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] No waiting events found dispatching network-vif-plugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:34:49 compute-0 nova_compute[182935]: 2026-01-22 00:34:49.531 182939 WARNING nova.compute.manager [req-f1ccfa7d-36bf-4729-9725-b93a0536f974 req-838e98a4-84cb-4c4c-b13a-91dc8e1df58d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received unexpected event network-vif-plugged-0091268d-dee8-4a48-8f05-e20c0db2ec29 for instance with vm_state active and task_state deleting.
Jan 22 00:34:50 compute-0 nova_compute[182935]: 2026-01-22 00:34:50.112 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:50 compute-0 podman[243207]: 2026-01-22 00:34:50.684771552 +0000 UTC m=+0.053742053 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.287 182939 DEBUG nova.compute.manager [req-720756a1-7367-4d20-a2c8-1df9bc422e09 req-b2402971-acd1-4c8c-b460-022d08030aca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-deleted-6a92d070-97be-4ea4-a051-f8b9882b8b9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.287 182939 INFO nova.compute.manager [req-720756a1-7367-4d20-a2c8-1df9bc422e09 req-b2402971-acd1-4c8c-b460-022d08030aca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Neutron deleted interface 6a92d070-97be-4ea4-a051-f8b9882b8b9c; detaching it from the instance and deleting it from the info cache
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.287 182939 DEBUG nova.network.neutron [req-720756a1-7367-4d20-a2c8-1df9bc422e09 req-b2402971-acd1-4c8c-b460-022d08030aca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updating instance_info_cache with network_info: [{"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.298 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.324 182939 DEBUG nova.compute.manager [req-720756a1-7367-4d20-a2c8-1df9bc422e09 req-b2402971-acd1-4c8c-b460-022d08030aca 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Detach interface failed, port_id=6a92d070-97be-4ea4-a051-f8b9882b8b9c, reason: Instance 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.648 182939 DEBUG nova.network.neutron [-] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.667 182939 INFO nova.compute.manager [-] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Took 5.28 seconds to deallocate network for instance.
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.698 182939 DEBUG nova.network.neutron [req-0ecb878e-31df-4178-8360-6a4e1427f282 req-124113cc-de3f-4876-8fdc-f946cacd844b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updated VIF entry in instance network info cache for port 6a92d070-97be-4ea4-a051-f8b9882b8b9c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.699 182939 DEBUG nova.network.neutron [req-0ecb878e-31df-4178-8360-6a4e1427f282 req-124113cc-de3f-4876-8fdc-f946cacd844b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Updating instance_info_cache with network_info: [{"id": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "address": "fa:16:3e:95:8b:c5", "network": {"id": "2c32b591-bafa-4089-9793-ef7884c86bda", "bridge": "br-int", "label": "tempest-network-smoke--1415909396", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6a92d070-97", "ovs_interfaceid": "6a92d070-97be-4ea4-a051-f8b9882b8b9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "address": "fa:16:3e:5c:f9:6c", "network": {"id": "ac047d42-8ff5-4760-85b5-73b5e4be7fc9", "bridge": "br-int", "label": "tempest-network-smoke--1874258653", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe5c:f96c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0091268d-de", "ovs_interfaceid": "0091268d-dee8-4a48-8f05-e20c0db2ec29", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.733 182939 DEBUG oslo_concurrency.lockutils [req-0ecb878e-31df-4178-8360-6a4e1427f282 req-124113cc-de3f-4876-8fdc-f946cacd844b 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-7a9023a9-d882-4cdd-aa9e-ee59c7795d38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.761 182939 DEBUG oslo_concurrency.lockutils [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.762 182939 DEBUG oslo_concurrency.lockutils [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:52 compute-0 nova_compute[182935]: 2026-01-22 00:34:52.988 182939 DEBUG nova.compute.provider_tree [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:34:53 compute-0 nova_compute[182935]: 2026-01-22 00:34:53.008 182939 DEBUG nova.scheduler.client.report [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:34:53 compute-0 nova_compute[182935]: 2026-01-22 00:34:53.039 182939 DEBUG oslo_concurrency.lockutils [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:53 compute-0 nova_compute[182935]: 2026-01-22 00:34:53.061 182939 INFO nova.scheduler.client.report [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance 7a9023a9-d882-4cdd-aa9e-ee59c7795d38
Jan 22 00:34:53 compute-0 nova_compute[182935]: 2026-01-22 00:34:53.144 182939 DEBUG oslo_concurrency.lockutils [None req-4b32ef1f-d890-439a-b1e5-e5993b685504 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "7a9023a9-d882-4cdd-aa9e-ee59c7795d38" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:54 compute-0 nova_compute[182935]: 2026-01-22 00:34:54.378 182939 DEBUG nova.compute.manager [req-5db33a86-f730-476c-b57e-34c726e70891 req-ebc6034a-1ce5-4132-99db-a2bba44dc574 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Received event network-vif-deleted-0091268d-dee8-4a48-8f05-e20c0db2ec29 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:34:54 compute-0 nova_compute[182935]: 2026-01-22 00:34:54.379 182939 INFO nova.compute.manager [req-5db33a86-f730-476c-b57e-34c726e70891 req-ebc6034a-1ce5-4132-99db-a2bba44dc574 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Neutron deleted interface 0091268d-dee8-4a48-8f05-e20c0db2ec29; detaching it from the instance and deleting it from the info cache
Jan 22 00:34:54 compute-0 nova_compute[182935]: 2026-01-22 00:34:54.379 182939 DEBUG nova.network.neutron [req-5db33a86-f730-476c-b57e-34c726e70891 req-ebc6034a-1ce5-4132-99db-a2bba44dc574 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 22 00:34:54 compute-0 nova_compute[182935]: 2026-01-22 00:34:54.381 182939 DEBUG nova.compute.manager [req-5db33a86-f730-476c-b57e-34c726e70891 req-ebc6034a-1ce5-4132-99db-a2bba44dc574 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Detach interface failed, port_id=0091268d-dee8-4a48-8f05-e20c0db2ec29, reason: Instance 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:34:55 compute-0 nova_compute[182935]: 2026-01-22 00:34:55.114 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:34:55 compute-0 podman[243226]: 2026-01-22 00:34:55.684854151 +0000 UTC m=+0.058134279 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Jan 22 00:34:55 compute-0 podman[243227]: 2026-01-22 00:34:55.695552239 +0000 UTC m=+0.064420961 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 00:34:55 compute-0 nova_compute[182935]: 2026-01-22 00:34:55.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:34:55 compute-0 nova_compute[182935]: 2026-01-22 00:34:55.820 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:55 compute-0 nova_compute[182935]: 2026-01-22 00:34:55.821 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:55 compute-0 nova_compute[182935]: 2026-01-22 00:34:55.821 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:55 compute-0 nova_compute[182935]: 2026-01-22 00:34:55.822 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:34:55 compute-0 nova_compute[182935]: 2026-01-22 00:34:55.987 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:34:55 compute-0 nova_compute[182935]: 2026-01-22 00:34:55.988 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5694MB free_disk=73.11867141723633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:34:55 compute-0 nova_compute[182935]: 2026-01-22 00:34:55.988 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:34:55 compute-0 nova_compute[182935]: 2026-01-22 00:34:55.988 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:34:56 compute-0 nova_compute[182935]: 2026-01-22 00:34:56.050 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:34:56 compute-0 nova_compute[182935]: 2026-01-22 00:34:56.051 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:34:56 compute-0 nova_compute[182935]: 2026-01-22 00:34:56.088 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:34:56 compute-0 nova_compute[182935]: 2026-01-22 00:34:56.109 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:34:56 compute-0 nova_compute[182935]: 2026-01-22 00:34:56.128 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:34:56 compute-0 nova_compute[182935]: 2026-01-22 00:34:56.128 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:34:57 compute-0 nova_compute[182935]: 2026-01-22 00:34:57.301 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:00 compute-0 nova_compute[182935]: 2026-01-22 00:35:00.116 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:02 compute-0 nova_compute[182935]: 2026-01-22 00:35:02.129 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:02 compute-0 nova_compute[182935]: 2026-01-22 00:35:02.130 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:35:02 compute-0 nova_compute[182935]: 2026-01-22 00:35:02.225 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042087.223642, 7a9023a9-d882-4cdd-aa9e-ee59c7795d38 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:35:02 compute-0 nova_compute[182935]: 2026-01-22 00:35:02.226 182939 INFO nova.compute.manager [-] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] VM Stopped (Lifecycle Event)
Jan 22 00:35:02 compute-0 nova_compute[182935]: 2026-01-22 00:35:02.251 182939 DEBUG nova.compute.manager [None req-177d4b04-3398-46f3-a5ab-d141110e93a7 - - - - - -] [instance: 7a9023a9-d882-4cdd-aa9e-ee59c7795d38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:35:02 compute-0 nova_compute[182935]: 2026-01-22 00:35:02.303 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:02 compute-0 nova_compute[182935]: 2026-01-22 00:35:02.391 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:02 compute-0 nova_compute[182935]: 2026-01-22 00:35:02.485 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:35:03.232 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:35:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:35:03.233 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:35:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:35:03.233 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:35:03 compute-0 nova_compute[182935]: 2026-01-22 00:35:03.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:03 compute-0 nova_compute[182935]: 2026-01-22 00:35:03.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:35:03 compute-0 nova_compute[182935]: 2026-01-22 00:35:03.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:35:03 compute-0 nova_compute[182935]: 2026-01-22 00:35:03.808 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:35:04 compute-0 nova_compute[182935]: 2026-01-22 00:35:04.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:05 compute-0 nova_compute[182935]: 2026-01-22 00:35:05.121 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:06 compute-0 nova_compute[182935]: 2026-01-22 00:35:06.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:07 compute-0 nova_compute[182935]: 2026-01-22 00:35:07.306 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:10 compute-0 nova_compute[182935]: 2026-01-22 00:35:10.124 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:12 compute-0 nova_compute[182935]: 2026-01-22 00:35:12.309 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:12 compute-0 podman[243266]: 2026-01-22 00:35:12.729923645 +0000 UTC m=+0.102989818 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:35:12 compute-0 podman[243268]: 2026-01-22 00:35:12.754282032 +0000 UTC m=+0.120223553 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:35:12 compute-0 podman[243267]: 2026-01-22 00:35:12.770779268 +0000 UTC m=+0.142573600 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 22 00:35:12 compute-0 nova_compute[182935]: 2026-01-22 00:35:12.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:12 compute-0 nova_compute[182935]: 2026-01-22 00:35:12.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:12 compute-0 nova_compute[182935]: 2026-01-22 00:35:12.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:35:12.967 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:35:12 compute-0 nova_compute[182935]: 2026-01-22 00:35:12.967 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:35:12.968 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:35:15 compute-0 nova_compute[182935]: 2026-01-22 00:35:15.152 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:16 compute-0 nova_compute[182935]: 2026-01-22 00:35:16.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:17 compute-0 nova_compute[182935]: 2026-01-22 00:35:17.311 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:17 compute-0 sshd-session[243332]: Invalid user apache from 188.166.69.60 port 41878
Jan 22 00:35:18 compute-0 sshd-session[243332]: Connection closed by invalid user apache 188.166.69.60 port 41878 [preauth]
Jan 22 00:35:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:35:19.970 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:35:20 compute-0 nova_compute[182935]: 2026-01-22 00:35:20.156 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:21 compute-0 podman[243334]: 2026-01-22 00:35:21.701905392 +0000 UTC m=+0.075528650 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 00:35:22 compute-0 nova_compute[182935]: 2026-01-22 00:35:22.342 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:25 compute-0 nova_compute[182935]: 2026-01-22 00:35:25.157 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:26 compute-0 podman[243353]: 2026-01-22 00:35:26.694639555 +0000 UTC m=+0.068548339 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Jan 22 00:35:26 compute-0 podman[243354]: 2026-01-22 00:35:26.710950298 +0000 UTC m=+0.079274258 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Jan 22 00:35:27 compute-0 nova_compute[182935]: 2026-01-22 00:35:27.345 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:30 compute-0 nova_compute[182935]: 2026-01-22 00:35:30.159 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:35:30.181 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:32:e3 2001:db8:0:1:f816:3eff:fe17:32e3 2001:db8::f816:3eff:fe17:32e3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe17:32e3/64 2001:db8::f816:3eff:fe17:32e3/64', 'neutron:device_id': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d6cac94-5c44-44de-a872-7bf42948d910, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=44d0292d-d743-4a92-8996-3ae3a26c0afc) old=Port_Binding(mac=['fa:16:3e:17:32:e3 2001:db8::f816:3eff:fe17:32e3'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe17:32e3/64', 'neutron:device_id': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:35:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:35:30.182 104408 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 44d0292d-d743-4a92-8996-3ae3a26c0afc in datapath 041654ff-0c5d-4cd2-89f6-0863dbbf44a8 updated
Jan 22 00:35:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:35:30.183 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 041654ff-0c5d-4cd2-89f6-0863dbbf44a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:35:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:35:30.185 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c54bbc-cc41-42ee-99a0-f805aaa9ed83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:35:32 compute-0 nova_compute[182935]: 2026-01-22 00:35:32.347 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:35 compute-0 nova_compute[182935]: 2026-01-22 00:35:35.161 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:37 compute-0 nova_compute[182935]: 2026-01-22 00:35:37.349 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:40 compute-0 nova_compute[182935]: 2026-01-22 00:35:40.163 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:42 compute-0 nova_compute[182935]: 2026-01-22 00:35:42.351 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:43 compute-0 podman[243394]: 2026-01-22 00:35:43.690733311 +0000 UTC m=+0.056975122 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:35:43 compute-0 podman[243396]: 2026-01-22 00:35:43.695845854 +0000 UTC m=+0.059579854 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:35:43 compute-0 podman[243395]: 2026-01-22 00:35:43.745609061 +0000 UTC m=+0.112011966 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:35:45 compute-0 nova_compute[182935]: 2026-01-22 00:35:45.208 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:47 compute-0 nova_compute[182935]: 2026-01-22 00:35:47.353 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:48 compute-0 ovn_controller[95047]: 2026-01-22T00:35:48Z|00708|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 00:35:50 compute-0 nova_compute[182935]: 2026-01-22 00:35:50.213 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:52 compute-0 nova_compute[182935]: 2026-01-22 00:35:52.378 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:52 compute-0 podman[243464]: 2026-01-22 00:35:52.670859114 +0000 UTC m=+0.048969410 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:35:55 compute-0 nova_compute[182935]: 2026-01-22 00:35:55.217 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:55 compute-0 nova_compute[182935]: 2026-01-22 00:35:55.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:35:57 compute-0 nova_compute[182935]: 2026-01-22 00:35:57.380 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:35:57 compute-0 podman[243483]: 2026-01-22 00:35:57.693150297 +0000 UTC m=+0.070093037 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-type=git, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:35:57 compute-0 podman[243484]: 2026-01-22 00:35:57.701442967 +0000 UTC m=+0.074799451 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:36:00 compute-0 nova_compute[182935]: 2026-01-22 00:36:00.220 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:01 compute-0 sshd-session[243528]: Invalid user apache from 188.166.69.60 port 48410
Jan 22 00:36:01 compute-0 sshd-session[243528]: Connection closed by invalid user apache 188.166.69.60 port 48410 [preauth]
Jan 22 00:36:01 compute-0 nova_compute[182935]: 2026-01-22 00:36:01.580 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:01 compute-0 nova_compute[182935]: 2026-01-22 00:36:01.581 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:01 compute-0 nova_compute[182935]: 2026-01-22 00:36:01.581 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:01 compute-0 nova_compute[182935]: 2026-01-22 00:36:01.581 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:36:01 compute-0 nova_compute[182935]: 2026-01-22 00:36:01.776 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:36:01 compute-0 nova_compute[182935]: 2026-01-22 00:36:01.777 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5699MB free_disk=73.11869049072266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:36:01 compute-0 nova_compute[182935]: 2026-01-22 00:36:01.777 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:01 compute-0 nova_compute[182935]: 2026-01-22 00:36:01.778 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.382 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.432 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.432 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.525 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.620 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.620 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.651 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.678 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.699 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.755 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.756 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:36:02 compute-0 nova_compute[182935]: 2026-01-22 00:36:02.757 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:03.234 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:03.234 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:03.234 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:05 compute-0 nova_compute[182935]: 2026-01-22 00:36:05.222 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:07 compute-0 nova_compute[182935]: 2026-01-22 00:36:07.384 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:08 compute-0 nova_compute[182935]: 2026-01-22 00:36:08.756 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:08 compute-0 nova_compute[182935]: 2026-01-22 00:36:08.756 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:36:08 compute-0 nova_compute[182935]: 2026-01-22 00:36:08.756 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:36:09 compute-0 nova_compute[182935]: 2026-01-22 00:36:09.615 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:36:09 compute-0 nova_compute[182935]: 2026-01-22 00:36:09.616 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:09 compute-0 nova_compute[182935]: 2026-01-22 00:36:09.616 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:09 compute-0 nova_compute[182935]: 2026-01-22 00:36:09.617 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:09 compute-0 nova_compute[182935]: 2026-01-22 00:36:09.617 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:36:10 compute-0 nova_compute[182935]: 2026-01-22 00:36:10.223 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:12 compute-0 nova_compute[182935]: 2026-01-22 00:36:12.387 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:14 compute-0 podman[243531]: 2026-01-22 00:36:14.669721383 +0000 UTC m=+0.047809942 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:36:14 compute-0 podman[243533]: 2026-01-22 00:36:14.70704393 +0000 UTC m=+0.079944954 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:36:14 compute-0 podman[243532]: 2026-01-22 00:36:14.707266785 +0000 UTC m=+0.082926276 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:36:14 compute-0 nova_compute[182935]: 2026-01-22 00:36:14.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:14 compute-0 nova_compute[182935]: 2026-01-22 00:36:14.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:14 compute-0 nova_compute[182935]: 2026-01-22 00:36:14.821 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:14 compute-0 nova_compute[182935]: 2026-01-22 00:36:14.822 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:15 compute-0 nova_compute[182935]: 2026-01-22 00:36:15.225 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:17 compute-0 nova_compute[182935]: 2026-01-22 00:36:17.390 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:17 compute-0 nova_compute[182935]: 2026-01-22 00:36:17.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.447 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.448 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.475 182939 DEBUG nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.589 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.589 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.598 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.598 182939 INFO nova.compute.claims [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.775 182939 DEBUG nova.compute.provider_tree [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.797 182939 DEBUG nova.scheduler.client.report [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.825 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.826 182939 DEBUG nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.902 182939 DEBUG nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.902 182939 DEBUG nova.network.neutron [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.930 182939 INFO nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:36:18 compute-0 nova_compute[182935]: 2026-01-22 00:36:18.976 182939 DEBUG nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.110 182939 DEBUG nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.111 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.111 182939 INFO nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Creating image(s)
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.112 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.112 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.112 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.124 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.184 182939 DEBUG nova.policy [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.194 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.195 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.195 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.206 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.268 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.269 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.305 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.308 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.309 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.368 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.370 182939 DEBUG nova.virt.disk.api [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.370 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.429 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.430 182939 DEBUG nova.virt.disk.api [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.430 182939 DEBUG nova.objects.instance [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 3bc9ee69-8032-4370-a7b1-e1905436fac1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.449 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.450 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Ensure instance console log exists: /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.450 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.450 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:19 compute-0 nova_compute[182935]: 2026-01-22 00:36:19.451 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:20 compute-0 nova_compute[182935]: 2026-01-22 00:36:20.227 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:20.385 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:36:20 compute-0 nova_compute[182935]: 2026-01-22 00:36:20.386 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:20 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:20.387 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:36:20 compute-0 nova_compute[182935]: 2026-01-22 00:36:20.775 182939 DEBUG nova.network.neutron [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Successfully created port: e2c10e81-3919-45ac-acd4-8de925e499e6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:36:21 compute-0 nova_compute[182935]: 2026-01-22 00:36:21.599 182939 DEBUG nova.network.neutron [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Successfully created port: 10cd7f40-3848-4590-b23a-4e832bf2f2b4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:36:22 compute-0 nova_compute[182935]: 2026-01-22 00:36:22.440 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:22 compute-0 nova_compute[182935]: 2026-01-22 00:36:22.763 182939 DEBUG nova.network.neutron [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Successfully updated port: e2c10e81-3919-45ac-acd4-8de925e499e6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:36:22 compute-0 nova_compute[182935]: 2026-01-22 00:36:22.951 182939 DEBUG nova.compute.manager [req-ac952a3e-6d06-4c84-934b-482007e9d9b1 req-fed2df9c-a998-4ecd-a645-9a3eee8dca73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-changed-e2c10e81-3919-45ac-acd4-8de925e499e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:36:22 compute-0 nova_compute[182935]: 2026-01-22 00:36:22.951 182939 DEBUG nova.compute.manager [req-ac952a3e-6d06-4c84-934b-482007e9d9b1 req-fed2df9c-a998-4ecd-a645-9a3eee8dca73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Refreshing instance network info cache due to event network-changed-e2c10e81-3919-45ac-acd4-8de925e499e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:36:22 compute-0 nova_compute[182935]: 2026-01-22 00:36:22.951 182939 DEBUG oslo_concurrency.lockutils [req-ac952a3e-6d06-4c84-934b-482007e9d9b1 req-fed2df9c-a998-4ecd-a645-9a3eee8dca73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:36:22 compute-0 nova_compute[182935]: 2026-01-22 00:36:22.951 182939 DEBUG oslo_concurrency.lockutils [req-ac952a3e-6d06-4c84-934b-482007e9d9b1 req-fed2df9c-a998-4ecd-a645-9a3eee8dca73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:36:22 compute-0 nova_compute[182935]: 2026-01-22 00:36:22.952 182939 DEBUG nova.network.neutron [req-ac952a3e-6d06-4c84-934b-482007e9d9b1 req-fed2df9c-a998-4ecd-a645-9a3eee8dca73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Refreshing network info cache for port e2c10e81-3919-45ac-acd4-8de925e499e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:36:23 compute-0 nova_compute[182935]: 2026-01-22 00:36:23.178 182939 DEBUG nova.network.neutron [req-ac952a3e-6d06-4c84-934b-482007e9d9b1 req-fed2df9c-a998-4ecd-a645-9a3eee8dca73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:36:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:36:23 compute-0 nova_compute[182935]: 2026-01-22 00:36:23.529 182939 DEBUG nova.network.neutron [req-ac952a3e-6d06-4c84-934b-482007e9d9b1 req-fed2df9c-a998-4ecd-a645-9a3eee8dca73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:36:23 compute-0 nova_compute[182935]: 2026-01-22 00:36:23.548 182939 DEBUG oslo_concurrency.lockutils [req-ac952a3e-6d06-4c84-934b-482007e9d9b1 req-fed2df9c-a998-4ecd-a645-9a3eee8dca73 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:36:23 compute-0 podman[243617]: 2026-01-22 00:36:23.674068626 +0000 UTC m=+0.051045949 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:36:23 compute-0 nova_compute[182935]: 2026-01-22 00:36:23.732 182939 DEBUG nova.network.neutron [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Successfully updated port: 10cd7f40-3848-4590-b23a-4e832bf2f2b4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:36:23 compute-0 nova_compute[182935]: 2026-01-22 00:36:23.761 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:36:23 compute-0 nova_compute[182935]: 2026-01-22 00:36:23.761 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:36:23 compute-0 nova_compute[182935]: 2026-01-22 00:36:23.762 182939 DEBUG nova.network.neutron [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:36:25 compute-0 nova_compute[182935]: 2026-01-22 00:36:25.008 182939 DEBUG nova.network.neutron [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:36:25 compute-0 nova_compute[182935]: 2026-01-22 00:36:25.088 182939 DEBUG nova.compute.manager [req-04795ac6-a4ce-4fd4-b72b-1befff32c37f req-abcab4ea-a189-43f9-8746-18e223e0f112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-changed-10cd7f40-3848-4590-b23a-4e832bf2f2b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:36:25 compute-0 nova_compute[182935]: 2026-01-22 00:36:25.088 182939 DEBUG nova.compute.manager [req-04795ac6-a4ce-4fd4-b72b-1befff32c37f req-abcab4ea-a189-43f9-8746-18e223e0f112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Refreshing instance network info cache due to event network-changed-10cd7f40-3848-4590-b23a-4e832bf2f2b4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:36:25 compute-0 nova_compute[182935]: 2026-01-22 00:36:25.088 182939 DEBUG oslo_concurrency.lockutils [req-04795ac6-a4ce-4fd4-b72b-1befff32c37f req-abcab4ea-a189-43f9-8746-18e223e0f112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:36:25 compute-0 nova_compute[182935]: 2026-01-22 00:36:25.262 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:25.388 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:27 compute-0 nova_compute[182935]: 2026-01-22 00:36:27.442 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:28 compute-0 podman[243637]: 2026-01-22 00:36:28.680767394 +0000 UTC m=+0.051663954 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 00:36:28 compute-0 podman[243636]: 2026-01-22 00:36:28.680945879 +0000 UTC m=+0.054235316 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Jan 22 00:36:30 compute-0 nova_compute[182935]: 2026-01-22 00:36:30.264 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.045 182939 DEBUG nova.network.neutron [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Updating instance_info_cache with network_info: [{"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.080 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.080 182939 DEBUG nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Instance network_info: |[{"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.081 182939 DEBUG oslo_concurrency.lockutils [req-04795ac6-a4ce-4fd4-b72b-1befff32c37f req-abcab4ea-a189-43f9-8746-18e223e0f112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.081 182939 DEBUG nova.network.neutron [req-04795ac6-a4ce-4fd4-b72b-1befff32c37f req-abcab4ea-a189-43f9-8746-18e223e0f112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Refreshing network info cache for port 10cd7f40-3848-4590-b23a-4e832bf2f2b4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.085 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Start _get_guest_xml network_info=[{"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.092 182939 WARNING nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.103 182939 DEBUG nova.virt.libvirt.host [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.104 182939 DEBUG nova.virt.libvirt.host [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.118 182939 DEBUG nova.virt.libvirt.host [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.119 182939 DEBUG nova.virt.libvirt.host [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.120 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.121 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.121 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.122 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.122 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.122 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.122 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.123 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.123 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.123 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.123 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.124 182939 DEBUG nova.virt.hardware [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.128 182939 DEBUG nova.virt.libvirt.vif [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:36:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1496462229',display_name='tempest-TestGettingAddress-server-1496462229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1496462229',id=174,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-axqaz2jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:36:19Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3bc9ee69-8032-4370-a7b1-e1905436fac1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.129 182939 DEBUG nova.network.os_vif_util [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.130 182939 DEBUG nova.network.os_vif_util [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:8e:82,bridge_name='br-int',has_traffic_filtering=True,id=e2c10e81-3919-45ac-acd4-8de925e499e6,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2c10e81-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.130 182939 DEBUG nova.virt.libvirt.vif [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:36:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1496462229',display_name='tempest-TestGettingAddress-server-1496462229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1496462229',id=174,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-axqaz2jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:36:19Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3bc9ee69-8032-4370-a7b1-e1905436fac1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.131 182939 DEBUG nova.network.os_vif_util [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.132 182939 DEBUG nova.network.os_vif_util [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:f5:0e,bridge_name='br-int',has_traffic_filtering=True,id=10cd7f40-3848-4590-b23a-4e832bf2f2b4,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10cd7f40-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.132 182939 DEBUG nova.objects.instance [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3bc9ee69-8032-4370-a7b1-e1905436fac1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.152 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:36:31 compute-0 nova_compute[182935]:   <uuid>3bc9ee69-8032-4370-a7b1-e1905436fac1</uuid>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   <name>instance-000000ae</name>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <nova:name>tempest-TestGettingAddress-server-1496462229</nova:name>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:36:31</nova:creationTime>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:36:31 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:36:31 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:36:31 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:36:31 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:36:31 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:36:31 compute-0 nova_compute[182935]:         <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 22 00:36:31 compute-0 nova_compute[182935]:         <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:36:31 compute-0 nova_compute[182935]:         <nova:port uuid="e2c10e81-3919-45ac-acd4-8de925e499e6">
Jan 22 00:36:31 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:36:31 compute-0 nova_compute[182935]:         <nova:port uuid="10cd7f40-3848-4590-b23a-4e832bf2f2b4">
Jan 22 00:36:31 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe50:f50e" ipVersion="6"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe50:f50e" ipVersion="6"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <system>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <entry name="serial">3bc9ee69-8032-4370-a7b1-e1905436fac1</entry>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <entry name="uuid">3bc9ee69-8032-4370-a7b1-e1905436fac1</entry>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     </system>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   <os>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   </os>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   <features>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   </features>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk.config"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:62:8e:82"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <target dev="tape2c10e81-39"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:50:f5:0e"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <target dev="tap10cd7f40-38"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/console.log" append="off"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <video>
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     </video>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:36:31 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:36:31 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:36:31 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:36:31 compute-0 nova_compute[182935]: </domain>
Jan 22 00:36:31 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.154 182939 DEBUG nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Preparing to wait for external event network-vif-plugged-e2c10e81-3919-45ac-acd4-8de925e499e6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.155 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.155 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.155 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.155 182939 DEBUG nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Preparing to wait for external event network-vif-plugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.155 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.155 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.156 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.156 182939 DEBUG nova.virt.libvirt.vif [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:36:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1496462229',display_name='tempest-TestGettingAddress-server-1496462229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1496462229',id=174,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-axqaz2jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:36:19Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3bc9ee69-8032-4370-a7b1-e1905436fac1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.156 182939 DEBUG nova.network.os_vif_util [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.157 182939 DEBUG nova.network.os_vif_util [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:8e:82,bridge_name='br-int',has_traffic_filtering=True,id=e2c10e81-3919-45ac-acd4-8de925e499e6,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2c10e81-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.157 182939 DEBUG os_vif [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:8e:82,bridge_name='br-int',has_traffic_filtering=True,id=e2c10e81-3919-45ac-acd4-8de925e499e6,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2c10e81-39') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.158 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.158 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.159 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.163 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.163 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape2c10e81-39, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.163 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape2c10e81-39, col_values=(('external_ids', {'iface-id': 'e2c10e81-3919-45ac-acd4-8de925e499e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:8e:82', 'vm-uuid': '3bc9ee69-8032-4370-a7b1-e1905436fac1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:31 compute-0 NetworkManager[55139]: <info>  [1769042191.1659] manager: (tape2c10e81-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/347)
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.169 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.175 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.177 182939 INFO os_vif [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:8e:82,bridge_name='br-int',has_traffic_filtering=True,id=e2c10e81-3919-45ac-acd4-8de925e499e6,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2c10e81-39')
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.179 182939 DEBUG nova.virt.libvirt.vif [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:36:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1496462229',display_name='tempest-TestGettingAddress-server-1496462229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1496462229',id=174,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-axqaz2jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:36:19Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3bc9ee69-8032-4370-a7b1-e1905436fac1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.179 182939 DEBUG nova.network.os_vif_util [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.181 182939 DEBUG nova.network.os_vif_util [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:f5:0e,bridge_name='br-int',has_traffic_filtering=True,id=10cd7f40-3848-4590-b23a-4e832bf2f2b4,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10cd7f40-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.182 182939 DEBUG os_vif [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:f5:0e,bridge_name='br-int',has_traffic_filtering=True,id=10cd7f40-3848-4590-b23a-4e832bf2f2b4,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10cd7f40-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.182 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.183 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.183 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.186 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.187 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap10cd7f40-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.187 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap10cd7f40-38, col_values=(('external_ids', {'iface-id': '10cd7f40-3848-4590-b23a-4e832bf2f2b4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:f5:0e', 'vm-uuid': '3bc9ee69-8032-4370-a7b1-e1905436fac1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.189 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:31 compute-0 NetworkManager[55139]: <info>  [1769042191.1903] manager: (tap10cd7f40-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/348)
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.193 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.198 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.200 182939 INFO os_vif [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:f5:0e,bridge_name='br-int',has_traffic_filtering=True,id=10cd7f40-3848-4590-b23a-4e832bf2f2b4,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10cd7f40-38')
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.272 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.273 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.274 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:62:8e:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.274 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:50:f5:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:36:31 compute-0 nova_compute[182935]: 2026-01-22 00:36:31.275 182939 INFO nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Using config drive
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.022 182939 INFO nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Creating config drive at /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk.config
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.028 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcg49lnda execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.156 182939 DEBUG oslo_concurrency.processutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcg49lnda" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:36:32 compute-0 kernel: tape2c10e81-39: entered promiscuous mode
Jan 22 00:36:32 compute-0 NetworkManager[55139]: <info>  [1769042192.2185] manager: (tape2c10e81-39): new Tun device (/org/freedesktop/NetworkManager/Devices/349)
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.275 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 ovn_controller[95047]: 2026-01-22T00:36:32Z|00709|binding|INFO|Claiming lport e2c10e81-3919-45ac-acd4-8de925e499e6 for this chassis.
Jan 22 00:36:32 compute-0 ovn_controller[95047]: 2026-01-22T00:36:32Z|00710|binding|INFO|e2c10e81-3919-45ac-acd4-8de925e499e6: Claiming fa:16:3e:62:8e:82 10.100.0.8
Jan 22 00:36:32 compute-0 NetworkManager[55139]: <info>  [1769042192.2833] manager: (tap10cd7f40-38): new Tun device (/org/freedesktop/NetworkManager/Devices/350)
Jan 22 00:36:32 compute-0 kernel: tap10cd7f40-38: entered promiscuous mode
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.285 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.288 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 ovn_controller[95047]: 2026-01-22T00:36:32Z|00711|if_status|INFO|Not updating pb chassis for 10cd7f40-3848-4590-b23a-4e832bf2f2b4 now as sb is readonly
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.290 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 NetworkManager[55139]: <info>  [1769042192.2938] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Jan 22 00:36:32 compute-0 NetworkManager[55139]: <info>  [1769042192.2947] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/352)
Jan 22 00:36:32 compute-0 systemd-udevd[243702]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:36:32 compute-0 systemd-udevd[243703]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:36:32 compute-0 NetworkManager[55139]: <info>  [1769042192.3155] device (tap10cd7f40-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:36:32 compute-0 NetworkManager[55139]: <info>  [1769042192.3165] device (tape2c10e81-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:36:32 compute-0 NetworkManager[55139]: <info>  [1769042192.3174] device (tap10cd7f40-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:36:32 compute-0 NetworkManager[55139]: <info>  [1769042192.3179] device (tape2c10e81-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.318 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:8e:82 10.100.0.8'], port_security=['fa:16:3e:62:8e:82 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3bc9ee69-8032-4370-a7b1-e1905436fac1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a02fb4eb-eda5-4559-8b41-ffe0af33e841', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9c9c72d4-43bc-43b5-af16-0875792fba89, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=e2c10e81-3919-45ac-acd4-8de925e499e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.319 104408 INFO neutron.agent.ovn.metadata.agent [-] Port e2c10e81-3919-45ac-acd4-8de925e499e6 in datapath 9d3a0d92-0a01-43e0-bbe5-a677082b8f1b bound to our chassis
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.320 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9d3a0d92-0a01-43e0-bbe5-a677082b8f1b
Jan 22 00:36:32 compute-0 systemd-machined[154182]: New machine qemu-90-instance-000000ae.
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.336 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[82227afc-cc7f-40cc-938f-8848ab70c044]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.337 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9d3a0d92-01 in ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.341 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9d3a0d92-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.342 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[45c60fd1-37df-4020-abb8-9c45f5afbf47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.342 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb726ed-df52-48bb-8aa5-028f77b62c32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.356 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1c25b9-c50b-4e21-8feb-b5ea9beb298e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 systemd[1]: Started Virtual Machine qemu-90-instance-000000ae.
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.376 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.375 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bc7fb58e-40f8-4a03-9673-3f6a6d8ff0c6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_controller[95047]: 2026-01-22T00:36:32Z|00712|binding|INFO|Claiming lport 10cd7f40-3848-4590-b23a-4e832bf2f2b4 for this chassis.
Jan 22 00:36:32 compute-0 ovn_controller[95047]: 2026-01-22T00:36:32Z|00713|binding|INFO|10cd7f40-3848-4590-b23a-4e832bf2f2b4: Claiming fa:16:3e:50:f5:0e 2001:db8:0:1:f816:3eff:fe50:f50e 2001:db8::f816:3eff:fe50:f50e
Jan 22 00:36:32 compute-0 ovn_controller[95047]: 2026-01-22T00:36:32Z|00714|binding|INFO|Setting lport e2c10e81-3919-45ac-acd4-8de925e499e6 ovn-installed in OVS
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.388 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 ovn_controller[95047]: 2026-01-22T00:36:32Z|00715|binding|INFO|Setting lport e2c10e81-3919-45ac-acd4-8de925e499e6 up in Southbound
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.408 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:f5:0e 2001:db8:0:1:f816:3eff:fe50:f50e 2001:db8::f816:3eff:fe50:f50e'], port_security=['fa:16:3e:50:f5:0e 2001:db8:0:1:f816:3eff:fe50:f50e 2001:db8::f816:3eff:fe50:f50e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe50:f50e/64 2001:db8::f816:3eff:fe50:f50e/64', 'neutron:device_id': '3bc9ee69-8032-4370-a7b1-e1905436fac1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a02fb4eb-eda5-4559-8b41-ffe0af33e841', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d6cac94-5c44-44de-a872-7bf42948d910, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=10cd7f40-3848-4590-b23a-4e832bf2f2b4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:36:32 compute-0 ovn_controller[95047]: 2026-01-22T00:36:32Z|00716|binding|INFO|Setting lport 10cd7f40-3848-4590-b23a-4e832bf2f2b4 ovn-installed in OVS
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.410 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.411 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[059f5115-3601-49cf-8696-4c6315fcfc4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 NetworkManager[55139]: <info>  [1769042192.4194] manager: (tap9d3a0d92-00): new Veth device (/org/freedesktop/NetworkManager/Devices/353)
Jan 22 00:36:32 compute-0 ovn_controller[95047]: 2026-01-22T00:36:32Z|00717|binding|INFO|Setting lport 10cd7f40-3848-4590-b23a-4e832bf2f2b4 up in Southbound
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.418 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[eccbcd6e-b65d-4dd3-b423-b41f0ba24f61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.448 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[713a2121-fb85-4944-8563-56c70b7d885c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.451 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa568d1-7fff-48ae-97f9-0ff3062faa3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 NetworkManager[55139]: <info>  [1769042192.4718] device (tap9d3a0d92-00): carrier: link connected
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.477 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[bd35a903-69e6-4958-9e04-62868e654cf0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.500 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7867570d-2dde-437e-9cdd-53a7a589d23a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9d3a0d92-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:b0:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669521, 'reachable_time': 37528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243739, 'error': None, 'target': 'ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.518 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fc69f792-99df-4250-a238-028ff25b01ef]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:b0af'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669521, 'tstamp': 669521}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243740, 'error': None, 'target': 'ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.536 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[456834f2-5831-45ea-a59b-68148966cdfe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9d3a0d92-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:b0:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669521, 'reachable_time': 37528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243741, 'error': None, 'target': 'ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.576 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[48be5aa5-7883-4e8e-bd7e-0cca723bc675]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.640 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c09f0693-aacb-456b-a80b-fdd79a6a42c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.641 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d3a0d92-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.642 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.642 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9d3a0d92-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.644 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 kernel: tap9d3a0d92-00: entered promiscuous mode
Jan 22 00:36:32 compute-0 NetworkManager[55139]: <info>  [1769042192.6445] manager: (tap9d3a0d92-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.645 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.649 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9d3a0d92-00, col_values=(('external_ids', {'iface-id': '285533c3-11fb-4871-bfe4-af8cc3d787e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.650 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 ovn_controller[95047]: 2026-01-22T00:36:32Z|00718|binding|INFO|Releasing lport 285533c3-11fb-4871-bfe4-af8cc3d787e8 from this chassis (sb_readonly=0)
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.650 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.652 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9d3a0d92-0a01-43e0-bbe5-a677082b8f1b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9d3a0d92-0a01-43e0-bbe5-a677082b8f1b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.653 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bd93c9d9-0493-4286-9e8a-767c74818727]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.654 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/9d3a0d92-0a01-43e0-bbe5-a677082b8f1b.pid.haproxy
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 9d3a0d92-0a01-43e0-bbe5-a677082b8f1b
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:36:32 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:32.654 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'env', 'PROCESS_TAG=haproxy-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9d3a0d92-0a01-43e0-bbe5-a677082b8f1b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.661 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.748 182939 DEBUG nova.compute.manager [req-68833255-aa93-4cfc-80df-ea95344258cc req-5439ae7e-b5e5-423f-85ef-ac160d331361 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-plugged-e2c10e81-3919-45ac-acd4-8de925e499e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.749 182939 DEBUG oslo_concurrency.lockutils [req-68833255-aa93-4cfc-80df-ea95344258cc req-5439ae7e-b5e5-423f-85ef-ac160d331361 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.750 182939 DEBUG oslo_concurrency.lockutils [req-68833255-aa93-4cfc-80df-ea95344258cc req-5439ae7e-b5e5-423f-85ef-ac160d331361 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.750 182939 DEBUG oslo_concurrency.lockutils [req-68833255-aa93-4cfc-80df-ea95344258cc req-5439ae7e-b5e5-423f-85ef-ac160d331361 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:32 compute-0 nova_compute[182935]: 2026-01-22 00:36:32.750 182939 DEBUG nova.compute.manager [req-68833255-aa93-4cfc-80df-ea95344258cc req-5439ae7e-b5e5-423f-85ef-ac160d331361 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Processing event network-vif-plugged-e2c10e81-3919-45ac-acd4-8de925e499e6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:36:33 compute-0 podman[243773]: 2026-01-22 00:36:33.015747445 +0000 UTC m=+0.056742105 container create 68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 22 00:36:33 compute-0 systemd[1]: Started libpod-conmon-68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f.scope.
Jan 22 00:36:33 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:36:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d755378462cc508a2aa7d289de9c308482fdd0ab19f092a02f0ea5ef7e95ef40/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:36:33 compute-0 podman[243773]: 2026-01-22 00:36:32.982394534 +0000 UTC m=+0.023389214 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:36:33 compute-0 podman[243773]: 2026-01-22 00:36:33.092218885 +0000 UTC m=+0.133213545 container init 68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:36:33 compute-0 podman[243773]: 2026-01-22 00:36:33.097276217 +0000 UTC m=+0.138270877 container start 68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:36:33 compute-0 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[243788]: [NOTICE]   (243792) : New worker (243794) forked
Jan 22 00:36:33 compute-0 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[243788]: [NOTICE]   (243792) : Loading success.
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.157 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 10cd7f40-3848-4590-b23a-4e832bf2f2b4 in datapath 041654ff-0c5d-4cd2-89f6-0863dbbf44a8 unbound from our chassis
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.159 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 041654ff-0c5d-4cd2-89f6-0863dbbf44a8
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.170 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[401f78b1-0acf-4726-bf2e-857fa1afd3ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.171 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap041654ff-01 in ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.173 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap041654ff-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.173 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2511ba31-2c82-4cf2-b930-1e91e35bbcdb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.174 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5cfcc7d0-5201-4279-a9e0-27c29a4e98aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.183 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[af529301-5e1f-41db-b6e3-5d08b270468a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.204 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4afc96-ec0c-4026-a558-bfc6fc7aae05]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.225 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[94512518-2b71-4f96-8bd3-57dee4bcb261]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.229 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[669b842d-482f-47d1-aedc-e1f701940f9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 NetworkManager[55139]: <info>  [1769042193.2305] manager: (tap041654ff-00): new Veth device (/org/freedesktop/NetworkManager/Devices/355)
Jan 22 00:36:33 compute-0 systemd-udevd[243734]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.260 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[bb37ff4d-4438-45be-af4c-ff9375d22b0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.265 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[0a31956f-eeb7-4c4d-a095-e5d896d9b84e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 NetworkManager[55139]: <info>  [1769042193.2870] device (tap041654ff-00): carrier: link connected
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.291 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[93a87585-12c9-4145-86dd-4ff489f6270a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.306 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cf21b6ea-79c2-4ba7-908e-0e273be6bbc2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap041654ff-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:32:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669602, 'reachable_time': 39810, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243813, 'error': None, 'target': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.317 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[20af0611-2743-4f3d-aab6-d2651cdc0f85]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe17:32e3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 669602, 'tstamp': 669602}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243814, 'error': None, 'target': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.335 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c7207c94-2376-4bc3-9d44-ffcf3acdfe01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap041654ff-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:32:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669602, 'reachable_time': 39810, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243815, 'error': None, 'target': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.367 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[52b9cd84-af4a-42ca-aa6e-548c828f8523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.397 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[975c5098-c9ad-42b5-9ae2-b32c69b66858]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.399 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap041654ff-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.399 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.400 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap041654ff-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.401 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:33 compute-0 NetworkManager[55139]: <info>  [1769042193.4024] manager: (tap041654ff-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/356)
Jan 22 00:36:33 compute-0 kernel: tap041654ff-00: entered promiscuous mode
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.404 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.405 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap041654ff-00, col_values=(('external_ids', {'iface-id': '44d0292d-d743-4a92-8996-3ae3a26c0afc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.406 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.407 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:33 compute-0 ovn_controller[95047]: 2026-01-22T00:36:33Z|00719|binding|INFO|Releasing lport 44d0292d-d743-4a92-8996-3ae3a26c0afc from this chassis (sb_readonly=0)
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.408 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/041654ff-0c5d-4cd2-89f6-0863dbbf44a8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/041654ff-0c5d-4cd2-89f6-0863dbbf44a8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.409 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a5be6384-d511-4f5f-9a9a-b1e866d826c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.410 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-041654ff-0c5d-4cd2-89f6-0863dbbf44a8
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/041654ff-0c5d-4cd2-89f6-0863dbbf44a8.pid.haproxy
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 041654ff-0c5d-4cd2-89f6-0863dbbf44a8
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:36:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:36:33.411 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'env', 'PROCESS_TAG=haproxy-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/041654ff-0c5d-4cd2-89f6-0863dbbf44a8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.421 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.514 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042193.5141056, 3bc9ee69-8032-4370-a7b1-e1905436fac1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.515 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] VM Started (Lifecycle Event)
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.550 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.554 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042193.5169616, 3bc9ee69-8032-4370-a7b1-e1905436fac1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.554 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] VM Paused (Lifecycle Event)
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.590 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.594 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.628 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:36:33 compute-0 podman[243853]: 2026-01-22 00:36:33.820564304 +0000 UTC m=+0.067389052 container create 1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.822 182939 DEBUG nova.network.neutron [req-04795ac6-a4ce-4fd4-b72b-1befff32c37f req-abcab4ea-a189-43f9-8746-18e223e0f112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Updated VIF entry in instance network info cache for port 10cd7f40-3848-4590-b23a-4e832bf2f2b4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.824 182939 DEBUG nova.network.neutron [req-04795ac6-a4ce-4fd4-b72b-1befff32c37f req-abcab4ea-a189-43f9-8746-18e223e0f112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Updating instance_info_cache with network_info: [{"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:36:33 compute-0 nova_compute[182935]: 2026-01-22 00:36:33.843 182939 DEBUG oslo_concurrency.lockutils [req-04795ac6-a4ce-4fd4-b72b-1befff32c37f req-abcab4ea-a189-43f9-8746-18e223e0f112 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:36:33 compute-0 systemd[1]: Started libpod-conmon-1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547.scope.
Jan 22 00:36:33 compute-0 podman[243853]: 2026-01-22 00:36:33.793404001 +0000 UTC m=+0.040228799 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:36:33 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:36:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/626b1a37c90a7de198acf0183e42017a11527b41f7c9f879f1fa917431a79ec3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:36:33 compute-0 podman[243853]: 2026-01-22 00:36:33.905450986 +0000 UTC m=+0.152275794 container init 1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:36:33 compute-0 podman[243853]: 2026-01-22 00:36:33.910546769 +0000 UTC m=+0.157371527 container start 1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:36:33 compute-0 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[243869]: [NOTICE]   (243873) : New worker (243875) forked
Jan 22 00:36:33 compute-0 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[243869]: [NOTICE]   (243873) : Loading success.
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.865 182939 DEBUG nova.compute.manager [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-plugged-e2c10e81-3919-45ac-acd4-8de925e499e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.866 182939 DEBUG oslo_concurrency.lockutils [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.866 182939 DEBUG oslo_concurrency.lockutils [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.866 182939 DEBUG oslo_concurrency.lockutils [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.866 182939 DEBUG nova.compute.manager [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] No event matching network-vif-plugged-e2c10e81-3919-45ac-acd4-8de925e499e6 in dict_keys([('network-vif-plugged', '10cd7f40-3848-4590-b23a-4e832bf2f2b4')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.867 182939 WARNING nova.compute.manager [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received unexpected event network-vif-plugged-e2c10e81-3919-45ac-acd4-8de925e499e6 for instance with vm_state building and task_state spawning.
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.867 182939 DEBUG nova.compute.manager [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-plugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.867 182939 DEBUG oslo_concurrency.lockutils [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.867 182939 DEBUG oslo_concurrency.lockutils [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.868 182939 DEBUG oslo_concurrency.lockutils [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.868 182939 DEBUG nova.compute.manager [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Processing event network-vif-plugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.868 182939 DEBUG nova.compute.manager [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-plugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.868 182939 DEBUG oslo_concurrency.lockutils [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.869 182939 DEBUG oslo_concurrency.lockutils [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.869 182939 DEBUG oslo_concurrency.lockutils [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.869 182939 DEBUG nova.compute.manager [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] No waiting events found dispatching network-vif-plugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.869 182939 WARNING nova.compute.manager [req-ef688d8b-3f15-41ff-9904-adaebebb3980 req-b736a6a4-ac6f-4fa9-9a7b-7036086220fa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received unexpected event network-vif-plugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 for instance with vm_state building and task_state spawning.
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.870 182939 DEBUG nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.874 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042194.8743804, 3bc9ee69-8032-4370-a7b1-e1905436fac1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.874 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] VM Resumed (Lifecycle Event)
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.876 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.880 182939 INFO nova.virt.libvirt.driver [-] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Instance spawned successfully.
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.880 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.909 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.909 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.909 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.910 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.910 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.910 182939 DEBUG nova.virt.libvirt.driver [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.918 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.920 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:36:34 compute-0 nova_compute[182935]: 2026-01-22 00:36:34.953 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:36:35 compute-0 nova_compute[182935]: 2026-01-22 00:36:35.017 182939 INFO nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Took 15.91 seconds to spawn the instance on the hypervisor.
Jan 22 00:36:35 compute-0 nova_compute[182935]: 2026-01-22 00:36:35.018 182939 DEBUG nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:36:35 compute-0 nova_compute[182935]: 2026-01-22 00:36:35.122 182939 INFO nova.compute.manager [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Took 16.58 seconds to build instance.
Jan 22 00:36:35 compute-0 nova_compute[182935]: 2026-01-22 00:36:35.144 182939 DEBUG oslo_concurrency.lockutils [None req-6855a5ea-93f2-45a0-81d3-60666a690846 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:35 compute-0 nova_compute[182935]: 2026-01-22 00:36:35.267 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:36 compute-0 nova_compute[182935]: 2026-01-22 00:36:36.190 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:39 compute-0 nova_compute[182935]: 2026-01-22 00:36:39.567 182939 DEBUG nova.compute.manager [req-69ef80dc-02cf-44ba-8cc3-a052cc7cd03e req-eacc05f2-a4df-431b-8aa4-bf179e8b0521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-changed-e2c10e81-3919-45ac-acd4-8de925e499e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:36:39 compute-0 nova_compute[182935]: 2026-01-22 00:36:39.568 182939 DEBUG nova.compute.manager [req-69ef80dc-02cf-44ba-8cc3-a052cc7cd03e req-eacc05f2-a4df-431b-8aa4-bf179e8b0521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Refreshing instance network info cache due to event network-changed-e2c10e81-3919-45ac-acd4-8de925e499e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:36:39 compute-0 nova_compute[182935]: 2026-01-22 00:36:39.568 182939 DEBUG oslo_concurrency.lockutils [req-69ef80dc-02cf-44ba-8cc3-a052cc7cd03e req-eacc05f2-a4df-431b-8aa4-bf179e8b0521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:36:39 compute-0 nova_compute[182935]: 2026-01-22 00:36:39.568 182939 DEBUG oslo_concurrency.lockutils [req-69ef80dc-02cf-44ba-8cc3-a052cc7cd03e req-eacc05f2-a4df-431b-8aa4-bf179e8b0521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:36:39 compute-0 nova_compute[182935]: 2026-01-22 00:36:39.568 182939 DEBUG nova.network.neutron [req-69ef80dc-02cf-44ba-8cc3-a052cc7cd03e req-eacc05f2-a4df-431b-8aa4-bf179e8b0521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Refreshing network info cache for port e2c10e81-3919-45ac-acd4-8de925e499e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:36:40 compute-0 nova_compute[182935]: 2026-01-22 00:36:40.268 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:40 compute-0 nova_compute[182935]: 2026-01-22 00:36:40.806 182939 DEBUG nova.network.neutron [req-69ef80dc-02cf-44ba-8cc3-a052cc7cd03e req-eacc05f2-a4df-431b-8aa4-bf179e8b0521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Updated VIF entry in instance network info cache for port e2c10e81-3919-45ac-acd4-8de925e499e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:36:40 compute-0 nova_compute[182935]: 2026-01-22 00:36:40.807 182939 DEBUG nova.network.neutron [req-69ef80dc-02cf-44ba-8cc3-a052cc7cd03e req-eacc05f2-a4df-431b-8aa4-bf179e8b0521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Updating instance_info_cache with network_info: [{"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:36:40 compute-0 nova_compute[182935]: 2026-01-22 00:36:40.829 182939 DEBUG oslo_concurrency.lockutils [req-69ef80dc-02cf-44ba-8cc3-a052cc7cd03e req-eacc05f2-a4df-431b-8aa4-bf179e8b0521 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:36:41 compute-0 nova_compute[182935]: 2026-01-22 00:36:41.231 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:45 compute-0 nova_compute[182935]: 2026-01-22 00:36:45.271 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:45 compute-0 podman[243887]: 2026-01-22 00:36:45.686770757 +0000 UTC m=+0.056866438 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:36:45 compute-0 podman[243889]: 2026-01-22 00:36:45.687945335 +0000 UTC m=+0.052698768 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:36:45 compute-0 podman[243888]: 2026-01-22 00:36:45.724683169 +0000 UTC m=+0.094948554 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 00:36:45 compute-0 sshd-session[243885]: Invalid user nginx from 188.166.69.60 port 60836
Jan 22 00:36:45 compute-0 sshd-session[243885]: Connection closed by invalid user nginx 188.166.69.60 port 60836 [preauth]
Jan 22 00:36:46 compute-0 nova_compute[182935]: 2026-01-22 00:36:46.271 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:48 compute-0 ovn_controller[95047]: 2026-01-22T00:36:48Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:8e:82 10.100.0.8
Jan 22 00:36:48 compute-0 ovn_controller[95047]: 2026-01-22T00:36:48Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:8e:82 10.100.0.8
Jan 22 00:36:50 compute-0 nova_compute[182935]: 2026-01-22 00:36:50.272 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:51 compute-0 nova_compute[182935]: 2026-01-22 00:36:51.274 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:54 compute-0 podman[243978]: 2026-01-22 00:36:54.680774084 +0000 UTC m=+0.056258124 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:36:55 compute-0 nova_compute[182935]: 2026-01-22 00:36:55.275 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:56 compute-0 nova_compute[182935]: 2026-01-22 00:36:56.276 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:36:57 compute-0 nova_compute[182935]: 2026-01-22 00:36:57.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:36:57 compute-0 nova_compute[182935]: 2026-01-22 00:36:57.822 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:57 compute-0 nova_compute[182935]: 2026-01-22 00:36:57.823 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:57 compute-0 nova_compute[182935]: 2026-01-22 00:36:57.823 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:57 compute-0 nova_compute[182935]: 2026-01-22 00:36:57.824 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:36:57 compute-0 nova_compute[182935]: 2026-01-22 00:36:57.907 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:36:57 compute-0 nova_compute[182935]: 2026-01-22 00:36:57.970 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:36:57 compute-0 nova_compute[182935]: 2026-01-22 00:36:57.971 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:36:58 compute-0 nova_compute[182935]: 2026-01-22 00:36:58.031 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:36:58 compute-0 nova_compute[182935]: 2026-01-22 00:36:58.224 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:36:58 compute-0 nova_compute[182935]: 2026-01-22 00:36:58.226 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5481MB free_disk=73.09416198730469GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:36:58 compute-0 nova_compute[182935]: 2026-01-22 00:36:58.227 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:36:58 compute-0 nova_compute[182935]: 2026-01-22 00:36:58.227 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:36:58 compute-0 nova_compute[182935]: 2026-01-22 00:36:58.937 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 3bc9ee69-8032-4370-a7b1-e1905436fac1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:36:58 compute-0 nova_compute[182935]: 2026-01-22 00:36:58.938 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:36:58 compute-0 nova_compute[182935]: 2026-01-22 00:36:58.938 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:36:58 compute-0 nova_compute[182935]: 2026-01-22 00:36:58.984 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:36:59 compute-0 nova_compute[182935]: 2026-01-22 00:36:59.007 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:36:59 compute-0 nova_compute[182935]: 2026-01-22 00:36:59.038 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:36:59 compute-0 nova_compute[182935]: 2026-01-22 00:36:59.038 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:36:59 compute-0 podman[244004]: 2026-01-22 00:36:59.691846556 +0000 UTC m=+0.061331876 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Jan 22 00:36:59 compute-0 podman[244005]: 2026-01-22 00:36:59.698705751 +0000 UTC m=+0.064113273 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 00:37:00 compute-0 nova_compute[182935]: 2026-01-22 00:37:00.276 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.293 182939 DEBUG nova.compute.manager [req-3963879d-ff7f-45bf-bdc5-4f0453394ca9 req-a94a9b7d-36c9-4925-bde3-bfe325a55995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-changed-e2c10e81-3919-45ac-acd4-8de925e499e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.294 182939 DEBUG nova.compute.manager [req-3963879d-ff7f-45bf-bdc5-4f0453394ca9 req-a94a9b7d-36c9-4925-bde3-bfe325a55995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Refreshing instance network info cache due to event network-changed-e2c10e81-3919-45ac-acd4-8de925e499e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.294 182939 DEBUG oslo_concurrency.lockutils [req-3963879d-ff7f-45bf-bdc5-4f0453394ca9 req-a94a9b7d-36c9-4925-bde3-bfe325a55995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.294 182939 DEBUG oslo_concurrency.lockutils [req-3963879d-ff7f-45bf-bdc5-4f0453394ca9 req-a94a9b7d-36c9-4925-bde3-bfe325a55995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.295 182939 DEBUG nova.network.neutron [req-3963879d-ff7f-45bf-bdc5-4f0453394ca9 req-a94a9b7d-36c9-4925-bde3-bfe325a55995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Refreshing network info cache for port e2c10e81-3919-45ac-acd4-8de925e499e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.318 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.391 182939 DEBUG oslo_concurrency.lockutils [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.392 182939 DEBUG oslo_concurrency.lockutils [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.392 182939 DEBUG oslo_concurrency.lockutils [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.392 182939 DEBUG oslo_concurrency.lockutils [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.392 182939 DEBUG oslo_concurrency.lockutils [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.405 182939 INFO nova.compute.manager [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Terminating instance
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.416 182939 DEBUG nova.compute.manager [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:37:01 compute-0 kernel: tape2c10e81-39 (unregistering): left promiscuous mode
Jan 22 00:37:01 compute-0 NetworkManager[55139]: <info>  [1769042221.4520] device (tape2c10e81-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:37:01 compute-0 ovn_controller[95047]: 2026-01-22T00:37:01Z|00720|binding|INFO|Releasing lport e2c10e81-3919-45ac-acd4-8de925e499e6 from this chassis (sb_readonly=0)
Jan 22 00:37:01 compute-0 ovn_controller[95047]: 2026-01-22T00:37:01Z|00721|binding|INFO|Setting lport e2c10e81-3919-45ac-acd4-8de925e499e6 down in Southbound
Jan 22 00:37:01 compute-0 ovn_controller[95047]: 2026-01-22T00:37:01Z|00722|binding|INFO|Removing iface tape2c10e81-39 ovn-installed in OVS
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.463 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.465 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.473 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:8e:82 10.100.0.8'], port_security=['fa:16:3e:62:8e:82 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3bc9ee69-8032-4370-a7b1-e1905436fac1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a02fb4eb-eda5-4559-8b41-ffe0af33e841', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9c9c72d4-43bc-43b5-af16-0875792fba89, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=e2c10e81-3919-45ac-acd4-8de925e499e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.475 104408 INFO neutron.agent.ovn.metadata.agent [-] Port e2c10e81-3919-45ac-acd4-8de925e499e6 in datapath 9d3a0d92-0a01-43e0-bbe5-a677082b8f1b unbound from our chassis
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.476 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.476 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.477 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[53f0ccd2-15cf-44f9-b391-f65f7df55aa5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.479 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b namespace which is not needed anymore
Jan 22 00:37:01 compute-0 kernel: tap10cd7f40-38 (unregistering): left promiscuous mode
Jan 22 00:37:01 compute-0 NetworkManager[55139]: <info>  [1769042221.4942] device (tap10cd7f40-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.495 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 ovn_controller[95047]: 2026-01-22T00:37:01Z|00723|binding|INFO|Releasing lport 10cd7f40-3848-4590-b23a-4e832bf2f2b4 from this chassis (sb_readonly=0)
Jan 22 00:37:01 compute-0 ovn_controller[95047]: 2026-01-22T00:37:01Z|00724|binding|INFO|Setting lport 10cd7f40-3848-4590-b23a-4e832bf2f2b4 down in Southbound
Jan 22 00:37:01 compute-0 ovn_controller[95047]: 2026-01-22T00:37:01Z|00725|binding|INFO|Removing iface tap10cd7f40-38 ovn-installed in OVS
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.506 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.508 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.514 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:f5:0e 2001:db8:0:1:f816:3eff:fe50:f50e 2001:db8::f816:3eff:fe50:f50e'], port_security=['fa:16:3e:50:f5:0e 2001:db8:0:1:f816:3eff:fe50:f50e 2001:db8::f816:3eff:fe50:f50e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe50:f50e/64 2001:db8::f816:3eff:fe50:f50e/64', 'neutron:device_id': '3bc9ee69-8032-4370-a7b1-e1905436fac1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a02fb4eb-eda5-4559-8b41-ffe0af33e841', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d6cac94-5c44-44de-a872-7bf42948d910, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=10cd7f40-3848-4590-b23a-4e832bf2f2b4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.522 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000ae.scope: Deactivated successfully.
Jan 22 00:37:01 compute-0 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000ae.scope: Consumed 14.947s CPU time.
Jan 22 00:37:01 compute-0 systemd-machined[154182]: Machine qemu-90-instance-000000ae terminated.
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[243788]: [NOTICE]   (243792) : haproxy version is 2.8.14-c23fe91
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[243788]: [NOTICE]   (243792) : path to executable is /usr/sbin/haproxy
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[243788]: [WARNING]  (243792) : Exiting Master process...
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[243788]: [WARNING]  (243792) : Exiting Master process...
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[243788]: [ALERT]    (243792) : Current worker (243794) exited with code 143 (Terminated)
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b[243788]: [WARNING]  (243792) : All workers exited. Exiting... (0)
Jan 22 00:37:01 compute-0 systemd[1]: libpod-68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f.scope: Deactivated successfully.
Jan 22 00:37:01 compute-0 podman[244074]: 2026-01-22 00:37:01.626967072 +0000 UTC m=+0.049123842 container died 68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:37:01 compute-0 NetworkManager[55139]: <info>  [1769042221.6700] manager: (tap10cd7f40-38): new Tun device (/org/freedesktop/NetworkManager/Devices/357)
Jan 22 00:37:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f-userdata-shm.mount: Deactivated successfully.
Jan 22 00:37:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-d755378462cc508a2aa7d289de9c308482fdd0ab19f092a02f0ea5ef7e95ef40-merged.mount: Deactivated successfully.
Jan 22 00:37:01 compute-0 podman[244074]: 2026-01-22 00:37:01.683676807 +0000 UTC m=+0.105833577 container cleanup 68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:37:01 compute-0 systemd[1]: libpod-conmon-68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f.scope: Deactivated successfully.
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.716 182939 INFO nova.virt.libvirt.driver [-] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Instance destroyed successfully.
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.717 182939 DEBUG nova.objects.instance [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid 3bc9ee69-8032-4370-a7b1-e1905436fac1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.733 182939 DEBUG nova.virt.libvirt.vif [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:36:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1496462229',display_name='tempest-TestGettingAddress-server-1496462229',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1496462229',id=174,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:36:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-axqaz2jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:36:35Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3bc9ee69-8032-4370-a7b1-e1905436fac1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.733 182939 DEBUG nova.network.os_vif_util [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.734 182939 DEBUG nova.network.os_vif_util [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:8e:82,bridge_name='br-int',has_traffic_filtering=True,id=e2c10e81-3919-45ac-acd4-8de925e499e6,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2c10e81-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.735 182939 DEBUG os_vif [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:8e:82,bridge_name='br-int',has_traffic_filtering=True,id=e2c10e81-3919-45ac-acd4-8de925e499e6,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2c10e81-39') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.738 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.739 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape2c10e81-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.740 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.742 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.749 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 podman[244125]: 2026-01-22 00:37:01.750004022 +0000 UTC m=+0.043625260 container remove 68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.751 182939 INFO os_vif [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:8e:82,bridge_name='br-int',has_traffic_filtering=True,id=e2c10e81-3919-45ac-acd4-8de925e499e6,network=Network(9d3a0d92-0a01-43e0-bbe5-a677082b8f1b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2c10e81-39')
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.752 182939 DEBUG nova.virt.libvirt.vif [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:36:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1496462229',display_name='tempest-TestGettingAddress-server-1496462229',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1496462229',id=174,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK7nR7wFnCoiySw65REL0XK2oqhDKLhnFVsBGaeEJezobQmbvet9F136TafqeB+t847DytpkOvQ0+Cnej4wWLBEdCAU4r3MTN2LY3bi428WR2O1oEXJJ3VIylh32sSXeGw==',key_name='tempest-TestGettingAddress-2092456170',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:36:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-axqaz2jn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:36:35Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3bc9ee69-8032-4370-a7b1-e1905436fac1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.752 182939 DEBUG nova.network.os_vif_util [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.753 182939 DEBUG nova.network.os_vif_util [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:f5:0e,bridge_name='br-int',has_traffic_filtering=True,id=10cd7f40-3848-4590-b23a-4e832bf2f2b4,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10cd7f40-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.754 182939 DEBUG os_vif [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:f5:0e,bridge_name='br-int',has_traffic_filtering=True,id=10cd7f40-3848-4590-b23a-4e832bf2f2b4,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10cd7f40-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.755 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.755 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap10cd7f40-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.756 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.755 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5e7d3e8e-d194-4cd5-9779-a2fe6a49d51c]: (4, ('Thu Jan 22 12:37:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b (68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f)\n68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f\nThu Jan 22 12:37:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b (68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f)\n68fa33a5e3f68b58c40932434409f118a01f195c1b9cbc1e75bfbdc87f2bad2f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.758 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.759 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[02b34907-a81f-444b-b004-89adde05e873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.759 182939 INFO os_vif [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:f5:0e,bridge_name='br-int',has_traffic_filtering=True,id=10cd7f40-3848-4590-b23a-4e832bf2f2b4,network=Network(041654ff-0c5d-4cd2-89f6-0863dbbf44a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap10cd7f40-38')
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.760 182939 INFO nova.virt.libvirt.driver [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Deleting instance files /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1_del
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.760 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d3a0d92-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.761 182939 INFO nova.virt.libvirt.driver [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Deletion of /var/lib/nova/instances/3bc9ee69-8032-4370-a7b1-e1905436fac1_del complete
Jan 22 00:37:01 compute-0 kernel: tap9d3a0d92-00: left promiscuous mode
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.766 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.773 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.774 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.777 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ce62c890-6fd1-492d-8302-53c770e1f594]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.799 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a45cab68-1340-4980-a493-2efe9c8037d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.801 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[684a8aef-0e69-4a22-9bd0-2204d91a48e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.822 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[da66d435-65ca-4144-8b45-655d9f2b60f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669514, 'reachable_time': 16397, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244148, 'error': None, 'target': 'ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.825 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9d3a0d92-0a01-43e0-bbe5-a677082b8f1b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.825 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[20af8b9f-e31d-414f-ad74-8e9ac8f3f3e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:01 compute-0 systemd[1]: run-netns-ovnmeta\x2d9d3a0d92\x2d0a01\x2d43e0\x2dbbe5\x2da677082b8f1b.mount: Deactivated successfully.
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.827 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 10cd7f40-3848-4590-b23a-4e832bf2f2b4 in datapath 041654ff-0c5d-4cd2-89f6-0863dbbf44a8 unbound from our chassis
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.828 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 041654ff-0c5d-4cd2-89f6-0863dbbf44a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.830 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fc972230-7b78-4fa6-a3bc-ecb2a50dacd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:01 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:01.830 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8 namespace which is not needed anymore
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.834 182939 INFO nova.compute.manager [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.835 182939 DEBUG oslo.service.loopingcall [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.835 182939 DEBUG nova.compute.manager [-] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:37:01 compute-0 nova_compute[182935]: 2026-01-22 00:37:01.835 182939 DEBUG nova.network.neutron [-] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[243869]: [NOTICE]   (243873) : haproxy version is 2.8.14-c23fe91
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[243869]: [NOTICE]   (243873) : path to executable is /usr/sbin/haproxy
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[243869]: [WARNING]  (243873) : Exiting Master process...
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[243869]: [WARNING]  (243873) : Exiting Master process...
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[243869]: [ALERT]    (243873) : Current worker (243875) exited with code 143 (Terminated)
Jan 22 00:37:01 compute-0 neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8[243869]: [WARNING]  (243873) : All workers exited. Exiting... (0)
Jan 22 00:37:01 compute-0 systemd[1]: libpod-1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547.scope: Deactivated successfully.
Jan 22 00:37:01 compute-0 podman[244166]: 2026-01-22 00:37:01.957128204 +0000 UTC m=+0.042181725 container died 1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:37:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547-userdata-shm.mount: Deactivated successfully.
Jan 22 00:37:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-626b1a37c90a7de198acf0183e42017a11527b41f7c9f879f1fa917431a79ec3-merged.mount: Deactivated successfully.
Jan 22 00:37:01 compute-0 podman[244166]: 2026-01-22 00:37:01.989491012 +0000 UTC m=+0.074544533 container cleanup 1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:37:01 compute-0 systemd[1]: libpod-conmon-1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547.scope: Deactivated successfully.
Jan 22 00:37:02 compute-0 podman[244197]: 2026-01-22 00:37:02.051698568 +0000 UTC m=+0.041068438 container remove 1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:37:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:02.057 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d4373c5b-e282-4d72-8de3-3fa434971143]: (4, ('Thu Jan 22 12:37:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8 (1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547)\n1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547\nThu Jan 22 12:37:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8 (1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547)\n1dbc0968b382db0ab0282a0fe3c6ac1831cd82b1f067ebeb1a57d0eef733e547\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:02.059 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4a7f9da2-c2d8-4948-bd6e-ac7a73ae99f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:02.060 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap041654ff-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:37:02 compute-0 nova_compute[182935]: 2026-01-22 00:37:02.062 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:02 compute-0 kernel: tap041654ff-00: left promiscuous mode
Jan 22 00:37:02 compute-0 nova_compute[182935]: 2026-01-22 00:37:02.073 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:02.078 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a50af515-a025-4c1c-a76f-ae61acc43dc4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:02.099 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[633b108f-fbad-47d5-8099-968cb7ffc375]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:02.100 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd602c8-083a-4351-81fb-ef29884343bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:02.116 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8d3775e6-0815-40fe-8ae8-3b5915c58126]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 669596, 'reachable_time': 39672, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244212, 'error': None, 'target': 'ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:02.118 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-041654ff-0c5d-4cd2-89f6-0863dbbf44a8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:37:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:02.118 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[248e2a33-5357-44ed-ba5c-b989e7d10f7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d041654ff\x2d0c5d\x2d4cd2\x2d89f6\x2d0863dbbf44a8.mount: Deactivated successfully.
Jan 22 00:37:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:03.235 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:03.235 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:03.235 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.288 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:03.288 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:37:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:03.289 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.460 182939 DEBUG nova.compute.manager [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-unplugged-e2c10e81-3919-45ac-acd4-8de925e499e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.461 182939 DEBUG oslo_concurrency.lockutils [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.461 182939 DEBUG oslo_concurrency.lockutils [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.462 182939 DEBUG oslo_concurrency.lockutils [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.462 182939 DEBUG nova.compute.manager [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] No waiting events found dispatching network-vif-unplugged-e2c10e81-3919-45ac-acd4-8de925e499e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.462 182939 DEBUG nova.compute.manager [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-unplugged-e2c10e81-3919-45ac-acd4-8de925e499e6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.463 182939 DEBUG nova.compute.manager [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-plugged-e2c10e81-3919-45ac-acd4-8de925e499e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.463 182939 DEBUG oslo_concurrency.lockutils [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.463 182939 DEBUG oslo_concurrency.lockutils [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.464 182939 DEBUG oslo_concurrency.lockutils [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.464 182939 DEBUG nova.compute.manager [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] No waiting events found dispatching network-vif-plugged-e2c10e81-3919-45ac-acd4-8de925e499e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.464 182939 WARNING nova.compute.manager [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received unexpected event network-vif-plugged-e2c10e81-3919-45ac-acd4-8de925e499e6 for instance with vm_state active and task_state deleting.
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.465 182939 DEBUG nova.compute.manager [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-unplugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.465 182939 DEBUG oslo_concurrency.lockutils [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.465 182939 DEBUG oslo_concurrency.lockutils [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.466 182939 DEBUG oslo_concurrency.lockutils [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.466 182939 DEBUG nova.compute.manager [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] No waiting events found dispatching network-vif-unplugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.467 182939 DEBUG nova.compute.manager [req-671170b6-43e5-44ac-92e2-55b4082b77bb req-ad529d7e-c7f0-46c3-86d8-d51e6069703a 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-unplugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.511 182939 DEBUG nova.compute.manager [req-741e1681-f6d5-4c4b-8448-bb13b51872b0 req-f7a3b3bc-3880-4821-a5e3-ce1b25548371 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-deleted-e2c10e81-3919-45ac-acd4-8de925e499e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.512 182939 INFO nova.compute.manager [req-741e1681-f6d5-4c4b-8448-bb13b51872b0 req-f7a3b3bc-3880-4821-a5e3-ce1b25548371 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Neutron deleted interface e2c10e81-3919-45ac-acd4-8de925e499e6; detaching it from the instance and deleting it from the info cache
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.512 182939 DEBUG nova.network.neutron [req-741e1681-f6d5-4c4b-8448-bb13b51872b0 req-f7a3b3bc-3880-4821-a5e3-ce1b25548371 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Updating instance_info_cache with network_info: [{"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.536 182939 DEBUG nova.compute.manager [req-741e1681-f6d5-4c4b-8448-bb13b51872b0 req-f7a3b3bc-3880-4821-a5e3-ce1b25548371 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Detach interface failed, port_id=e2c10e81-3919-45ac-acd4-8de925e499e6, reason: Instance 3bc9ee69-8032-4370-a7b1-e1905436fac1 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.930 182939 DEBUG nova.network.neutron [req-3963879d-ff7f-45bf-bdc5-4f0453394ca9 req-a94a9b7d-36c9-4925-bde3-bfe325a55995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Updated VIF entry in instance network info cache for port e2c10e81-3919-45ac-acd4-8de925e499e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.931 182939 DEBUG nova.network.neutron [req-3963879d-ff7f-45bf-bdc5-4f0453394ca9 req-a94a9b7d-36c9-4925-bde3-bfe325a55995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Updating instance_info_cache with network_info: [{"id": "e2c10e81-3919-45ac-acd4-8de925e499e6", "address": "fa:16:3e:62:8e:82", "network": {"id": "9d3a0d92-0a01-43e0-bbe5-a677082b8f1b", "bridge": "br-int", "label": "tempest-network-smoke--1303473530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2c10e81-39", "ovs_interfaceid": "e2c10e81-3919-45ac-acd4-8de925e499e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "address": "fa:16:3e:50:f5:0e", "network": {"id": "041654ff-0c5d-4cd2-89f6-0863dbbf44a8", "bridge": "br-int", "label": "tempest-network-smoke--1472393287", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe50:f50e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap10cd7f40-38", "ovs_interfaceid": "10cd7f40-3848-4590-b23a-4e832bf2f2b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:37:03 compute-0 nova_compute[182935]: 2026-01-22 00:37:03.957 182939 DEBUG oslo_concurrency.lockutils [req-3963879d-ff7f-45bf-bdc5-4f0453394ca9 req-a94a9b7d-36c9-4925-bde3-bfe325a55995 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3bc9ee69-8032-4370-a7b1-e1905436fac1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.039 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.039 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.384 182939 DEBUG nova.network.neutron [-] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.403 182939 INFO nova.compute.manager [-] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Took 2.57 seconds to deallocate network for instance.
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.469 182939 DEBUG oslo_concurrency.lockutils [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.470 182939 DEBUG oslo_concurrency.lockutils [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.536 182939 DEBUG nova.compute.provider_tree [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.561 182939 DEBUG nova.scheduler.client.report [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.583 182939 DEBUG oslo_concurrency.lockutils [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.630 182939 INFO nova.scheduler.client.report [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance 3bc9ee69-8032-4370-a7b1-e1905436fac1
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.728 182939 DEBUG oslo_concurrency.lockutils [None req-1575a3f3-f449-4b6c-a2df-01d81155f310 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:37:04 compute-0 nova_compute[182935]: 2026-01-22 00:37:04.812 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:37:05 compute-0 nova_compute[182935]: 2026-01-22 00:37:05.279 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:05 compute-0 nova_compute[182935]: 2026-01-22 00:37:05.561 182939 DEBUG nova.compute.manager [req-f522a7bb-48f8-49c7-9171-908251f94479 req-2eaa59f5-e711-4a8c-afeb-5d24aefdeb63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-plugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:05 compute-0 nova_compute[182935]: 2026-01-22 00:37:05.561 182939 DEBUG oslo_concurrency.lockutils [req-f522a7bb-48f8-49c7-9171-908251f94479 req-2eaa59f5-e711-4a8c-afeb-5d24aefdeb63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:05 compute-0 nova_compute[182935]: 2026-01-22 00:37:05.562 182939 DEBUG oslo_concurrency.lockutils [req-f522a7bb-48f8-49c7-9171-908251f94479 req-2eaa59f5-e711-4a8c-afeb-5d24aefdeb63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:05 compute-0 nova_compute[182935]: 2026-01-22 00:37:05.562 182939 DEBUG oslo_concurrency.lockutils [req-f522a7bb-48f8-49c7-9171-908251f94479 req-2eaa59f5-e711-4a8c-afeb-5d24aefdeb63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3bc9ee69-8032-4370-a7b1-e1905436fac1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:05 compute-0 nova_compute[182935]: 2026-01-22 00:37:05.563 182939 DEBUG nova.compute.manager [req-f522a7bb-48f8-49c7-9171-908251f94479 req-2eaa59f5-e711-4a8c-afeb-5d24aefdeb63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] No waiting events found dispatching network-vif-plugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:37:05 compute-0 nova_compute[182935]: 2026-01-22 00:37:05.563 182939 WARNING nova.compute.manager [req-f522a7bb-48f8-49c7-9171-908251f94479 req-2eaa59f5-e711-4a8c-afeb-5d24aefdeb63 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received unexpected event network-vif-plugged-10cd7f40-3848-4590-b23a-4e832bf2f2b4 for instance with vm_state deleted and task_state None.
Jan 22 00:37:05 compute-0 nova_compute[182935]: 2026-01-22 00:37:05.599 182939 DEBUG nova.compute.manager [req-128f5989-422e-4b84-9ffd-6c8161fd5120 req-36ecc411-be8e-4594-abf9-403ef2bfd00d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Received event network-vif-deleted-10cd7f40-3848-4590-b23a-4e832bf2f2b4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:37:06 compute-0 nova_compute[182935]: 2026-01-22 00:37:06.758 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:06 compute-0 nova_compute[182935]: 2026-01-22 00:37:06.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:09 compute-0 nova_compute[182935]: 2026-01-22 00:37:09.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:10 compute-0 nova_compute[182935]: 2026-01-22 00:37:10.327 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:11 compute-0 nova_compute[182935]: 2026-01-22 00:37:11.761 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:11 compute-0 nova_compute[182935]: 2026-01-22 00:37:11.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:13 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:13.291 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:37:14 compute-0 nova_compute[182935]: 2026-01-22 00:37:14.440 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:14 compute-0 nova_compute[182935]: 2026-01-22 00:37:14.524 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:14 compute-0 nova_compute[182935]: 2026-01-22 00:37:14.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:15 compute-0 nova_compute[182935]: 2026-01-22 00:37:15.329 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:15 compute-0 nova_compute[182935]: 2026-01-22 00:37:15.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:15 compute-0 nova_compute[182935]: 2026-01-22 00:37:15.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:16 compute-0 podman[244216]: 2026-01-22 00:37:16.688575646 +0000 UTC m=+0.051358910 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:37:16 compute-0 nova_compute[182935]: 2026-01-22 00:37:16.715 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042221.713168, 3bc9ee69-8032-4370-a7b1-e1905436fac1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:37:16 compute-0 nova_compute[182935]: 2026-01-22 00:37:16.715 182939 INFO nova.compute.manager [-] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] VM Stopped (Lifecycle Event)
Jan 22 00:37:16 compute-0 podman[244215]: 2026-01-22 00:37:16.727440416 +0000 UTC m=+0.091132582 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:37:16 compute-0 podman[244214]: 2026-01-22 00:37:16.727355814 +0000 UTC m=+0.090130578 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:37:16 compute-0 nova_compute[182935]: 2026-01-22 00:37:16.735 182939 DEBUG nova.compute.manager [None req-b7ff9f39-9096-458b-9200-3a0f3ee86960 - - - - - -] [instance: 3bc9ee69-8032-4370-a7b1-e1905436fac1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:37:16 compute-0 nova_compute[182935]: 2026-01-22 00:37:16.762 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:17 compute-0 nova_compute[182935]: 2026-01-22 00:37:17.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:20 compute-0 nova_compute[182935]: 2026-01-22 00:37:20.331 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:21 compute-0 nova_compute[182935]: 2026-01-22 00:37:21.765 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:25 compute-0 nova_compute[182935]: 2026-01-22 00:37:25.333 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:25 compute-0 podman[244281]: 2026-01-22 00:37:25.684995003 +0000 UTC m=+0.058676036 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:37:26 compute-0 nova_compute[182935]: 2026-01-22 00:37:26.815 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:30 compute-0 nova_compute[182935]: 2026-01-22 00:37:30.395 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:30 compute-0 sshd-session[244300]: Invalid user nginx from 188.166.69.60 port 50760
Jan 22 00:37:30 compute-0 podman[244302]: 2026-01-22 00:37:30.475691279 +0000 UTC m=+0.059529326 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Jan 22 00:37:30 compute-0 podman[244303]: 2026-01-22 00:37:30.475788531 +0000 UTC m=+0.056755389 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:37:30 compute-0 sshd-session[244300]: Connection closed by invalid user nginx 188.166.69.60 port 50760 [preauth]
Jan 22 00:37:31 compute-0 nova_compute[182935]: 2026-01-22 00:37:31.818 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:35 compute-0 nova_compute[182935]: 2026-01-22 00:37:35.430 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:36 compute-0 nova_compute[182935]: 2026-01-22 00:37:36.821 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:39 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:39.727 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:ff:76 2001:db8:0:1:f816:3eff:fe5f:ff76 2001:db8::f816:3eff:fe5f:ff76'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe5f:ff76/64 2001:db8::f816:3eff:fe5f:ff76/64', 'neutron:device_id': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eaa35b5e-130a-4933-a219-b6429231aa8c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=63ad2747-135a-46c8-90ca-ec1def31a1c2) old=Port_Binding(mac=['fa:16:3e:5f:ff:76 2001:db8::f816:3eff:fe5f:ff76'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe5f:ff76/64', 'neutron:device_id': 'ovnmeta-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01fa8e13-9f62-4b06-88db-79f2e6ca65b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:37:39 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:39.728 104408 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 63ad2747-135a-46c8-90ca-ec1def31a1c2 in datapath 01fa8e13-9f62-4b06-88db-79f2e6ca65b8 updated
Jan 22 00:37:39 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:39.730 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01fa8e13-9f62-4b06-88db-79f2e6ca65b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:37:39 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:37:39.731 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[be35c9af-4d22-47b6-a548-1335092eba13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:37:40 compute-0 nova_compute[182935]: 2026-01-22 00:37:40.470 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:41 compute-0 systemd[1]: Starting dnf makecache...
Jan 22 00:37:41 compute-0 dnf[244343]: Metadata cache refreshed recently.
Jan 22 00:37:41 compute-0 nova_compute[182935]: 2026-01-22 00:37:41.825 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:41 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 22 00:37:41 compute-0 systemd[1]: Finished dnf makecache.
Jan 22 00:37:45 compute-0 nova_compute[182935]: 2026-01-22 00:37:45.493 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:46 compute-0 nova_compute[182935]: 2026-01-22 00:37:46.829 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:47 compute-0 podman[244347]: 2026-01-22 00:37:47.683789815 +0000 UTC m=+0.053226695 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:37:47 compute-0 podman[244345]: 2026-01-22 00:37:47.68771647 +0000 UTC m=+0.061775391 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:37:47 compute-0 podman[244346]: 2026-01-22 00:37:47.703933818 +0000 UTC m=+0.077419815 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 22 00:37:50 compute-0 nova_compute[182935]: 2026-01-22 00:37:50.537 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:51 compute-0 nova_compute[182935]: 2026-01-22 00:37:51.833 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:55 compute-0 nova_compute[182935]: 2026-01-22 00:37:55.539 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:56 compute-0 podman[244417]: 2026-01-22 00:37:56.676259519 +0000 UTC m=+0.052231831 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 00:37:56 compute-0 nova_compute[182935]: 2026-01-22 00:37:56.877 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:37:58 compute-0 nova_compute[182935]: 2026-01-22 00:37:58.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:37:58 compute-0 nova_compute[182935]: 2026-01-22 00:37:58.830 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:58 compute-0 nova_compute[182935]: 2026-01-22 00:37:58.831 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:58 compute-0 nova_compute[182935]: 2026-01-22 00:37:58.831 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:37:58 compute-0 nova_compute[182935]: 2026-01-22 00:37:58.831 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:37:58 compute-0 nova_compute[182935]: 2026-01-22 00:37:58.974 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:37:58 compute-0 nova_compute[182935]: 2026-01-22 00:37:58.975 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5689MB free_disk=73.12295532226562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:37:58 compute-0 nova_compute[182935]: 2026-01-22 00:37:58.975 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:37:58 compute-0 nova_compute[182935]: 2026-01-22 00:37:58.976 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:37:59 compute-0 nova_compute[182935]: 2026-01-22 00:37:59.041 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:37:59 compute-0 nova_compute[182935]: 2026-01-22 00:37:59.041 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:37:59 compute-0 nova_compute[182935]: 2026-01-22 00:37:59.059 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:37:59 compute-0 nova_compute[182935]: 2026-01-22 00:37:59.075 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:37:59 compute-0 nova_compute[182935]: 2026-01-22 00:37:59.098 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:37:59 compute-0 nova_compute[182935]: 2026-01-22 00:37:59.098 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:00 compute-0 nova_compute[182935]: 2026-01-22 00:38:00.540 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:00 compute-0 podman[244438]: 2026-01-22 00:38:00.67813199 +0000 UTC m=+0.055843419 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, architecture=x86_64)
Jan 22 00:38:00 compute-0 podman[244439]: 2026-01-22 00:38:00.678512579 +0000 UTC m=+0.053022341 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:38:01 compute-0 nova_compute[182935]: 2026-01-22 00:38:01.880 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:02 compute-0 ovn_controller[95047]: 2026-01-22T00:38:02Z|00726|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 22 00:38:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:38:03.236 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:38:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:38:03.236 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:38:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:38:03.237 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:38:05 compute-0 nova_compute[182935]: 2026-01-22 00:38:05.099 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:05 compute-0 nova_compute[182935]: 2026-01-22 00:38:05.114 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:05 compute-0 nova_compute[182935]: 2026-01-22 00:38:05.115 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:38:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:38:05.301 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:38:05 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:38:05.302 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:38:05 compute-0 nova_compute[182935]: 2026-01-22 00:38:05.302 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:05 compute-0 nova_compute[182935]: 2026-01-22 00:38:05.542 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:05 compute-0 nova_compute[182935]: 2026-01-22 00:38:05.809 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:05 compute-0 nova_compute[182935]: 2026-01-22 00:38:05.809 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:38:05 compute-0 nova_compute[182935]: 2026-01-22 00:38:05.809 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:38:05 compute-0 nova_compute[182935]: 2026-01-22 00:38:05.827 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:38:06 compute-0 nova_compute[182935]: 2026-01-22 00:38:06.883 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:38:07.304 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:38:08 compute-0 sshd-session[244479]: Received disconnect from 91.224.92.78 port 43542:11:  [preauth]
Jan 22 00:38:08 compute-0 sshd-session[244479]: Disconnected from authenticating user root 91.224.92.78 port 43542 [preauth]
Jan 22 00:38:08 compute-0 nova_compute[182935]: 2026-01-22 00:38:08.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:10 compute-0 nova_compute[182935]: 2026-01-22 00:38:10.544 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:10 compute-0 nova_compute[182935]: 2026-01-22 00:38:10.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:11 compute-0 nova_compute[182935]: 2026-01-22 00:38:11.885 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:12 compute-0 sshd-session[244481]: Invalid user nginx from 188.166.69.60 port 36030
Jan 22 00:38:12 compute-0 sshd-session[244481]: Connection closed by invalid user nginx 188.166.69.60 port 36030 [preauth]
Jan 22 00:38:14 compute-0 nova_compute[182935]: 2026-01-22 00:38:14.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:15 compute-0 nova_compute[182935]: 2026-01-22 00:38:15.584 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:15 compute-0 nova_compute[182935]: 2026-01-22 00:38:15.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:15 compute-0 nova_compute[182935]: 2026-01-22 00:38:15.808 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:16 compute-0 nova_compute[182935]: 2026-01-22 00:38:16.807 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:16 compute-0 nova_compute[182935]: 2026-01-22 00:38:16.887 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:17 compute-0 nova_compute[182935]: 2026-01-22 00:38:17.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:18 compute-0 podman[244483]: 2026-01-22 00:38:18.672987303 +0000 UTC m=+0.050561772 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:38:18 compute-0 podman[244485]: 2026-01-22 00:38:18.700703596 +0000 UTC m=+0.063175303 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:38:18 compute-0 podman[244484]: 2026-01-22 00:38:18.72966181 +0000 UTC m=+0.106191524 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:38:20 compute-0 nova_compute[182935]: 2026-01-22 00:38:20.586 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:20 compute-0 nova_compute[182935]: 2026-01-22 00:38:20.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:21 compute-0 nova_compute[182935]: 2026-01-22 00:38:21.890 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:38:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:38:25 compute-0 nova_compute[182935]: 2026-01-22 00:38:25.619 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:26 compute-0 nova_compute[182935]: 2026-01-22 00:38:26.934 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:27 compute-0 podman[244551]: 2026-01-22 00:38:27.716971732 +0000 UTC m=+0.082236310 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:38:30 compute-0 nova_compute[182935]: 2026-01-22 00:38:30.622 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:31 compute-0 podman[244572]: 2026-01-22 00:38:31.680685548 +0000 UTC m=+0.056997296 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 22 00:38:31 compute-0 podman[244573]: 2026-01-22 00:38:31.684254763 +0000 UTC m=+0.055312195 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:38:31 compute-0 nova_compute[182935]: 2026-01-22 00:38:31.806 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:31 compute-0 nova_compute[182935]: 2026-01-22 00:38:31.807 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:38:31 compute-0 nova_compute[182935]: 2026-01-22 00:38:31.937 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:35 compute-0 nova_compute[182935]: 2026-01-22 00:38:35.147 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:35 compute-0 nova_compute[182935]: 2026-01-22 00:38:35.646 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:36 compute-0 nova_compute[182935]: 2026-01-22 00:38:36.940 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:40 compute-0 nova_compute[182935]: 2026-01-22 00:38:40.648 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:41 compute-0 nova_compute[182935]: 2026-01-22 00:38:41.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:38:41 compute-0 nova_compute[182935]: 2026-01-22 00:38:41.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:38:41 compute-0 nova_compute[182935]: 2026-01-22 00:38:41.810 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:38:41 compute-0 nova_compute[182935]: 2026-01-22 00:38:41.945 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:45 compute-0 nova_compute[182935]: 2026-01-22 00:38:45.681 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:46 compute-0 nova_compute[182935]: 2026-01-22 00:38:46.948 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:49 compute-0 podman[244613]: 2026-01-22 00:38:49.685189173 +0000 UTC m=+0.048544985 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:38:49 compute-0 podman[244611]: 2026-01-22 00:38:49.694588997 +0000 UTC m=+0.061927484 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:38:49 compute-0 podman[244612]: 2026-01-22 00:38:49.722939647 +0000 UTC m=+0.088264805 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:38:50 compute-0 nova_compute[182935]: 2026-01-22 00:38:50.684 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:51 compute-0 nova_compute[182935]: 2026-01-22 00:38:51.950 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:54 compute-0 sshd-session[244681]: Invalid user nginx from 188.166.69.60 port 58062
Jan 22 00:38:54 compute-0 sshd-session[244681]: Connection closed by invalid user nginx 188.166.69.60 port 58062 [preauth]
Jan 22 00:38:55 compute-0 nova_compute[182935]: 2026-01-22 00:38:55.719 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:56 compute-0 nova_compute[182935]: 2026-01-22 00:38:56.992 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:38:58 compute-0 podman[244683]: 2026-01-22 00:38:58.680565315 +0000 UTC m=+0.048085622 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 00:39:00 compute-0 nova_compute[182935]: 2026-01-22 00:39:00.722 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:00 compute-0 nova_compute[182935]: 2026-01-22 00:39:00.811 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:00 compute-0 nova_compute[182935]: 2026-01-22 00:39:00.834 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:00 compute-0 nova_compute[182935]: 2026-01-22 00:39:00.834 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:00 compute-0 nova_compute[182935]: 2026-01-22 00:39:00.835 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:00 compute-0 nova_compute[182935]: 2026-01-22 00:39:00.835 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:39:00 compute-0 nova_compute[182935]: 2026-01-22 00:39:00.958 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:39:00 compute-0 nova_compute[182935]: 2026-01-22 00:39:00.959 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5712MB free_disk=73.12314987182617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:39:00 compute-0 nova_compute[182935]: 2026-01-22 00:39:00.960 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:00 compute-0 nova_compute[182935]: 2026-01-22 00:39:00.960 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:01 compute-0 nova_compute[182935]: 2026-01-22 00:39:01.036 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:39:01 compute-0 nova_compute[182935]: 2026-01-22 00:39:01.037 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:39:01 compute-0 nova_compute[182935]: 2026-01-22 00:39:01.064 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:39:01 compute-0 nova_compute[182935]: 2026-01-22 00:39:01.086 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:39:01 compute-0 nova_compute[182935]: 2026-01-22 00:39:01.088 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:39:01 compute-0 nova_compute[182935]: 2026-01-22 00:39:01.088 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:01 compute-0 nova_compute[182935]: 2026-01-22 00:39:01.995 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:02 compute-0 podman[244702]: 2026-01-22 00:39:02.675787217 +0000 UTC m=+0.053545693 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:39:02 compute-0 podman[244703]: 2026-01-22 00:39:02.68595245 +0000 UTC m=+0.058577684 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:39:03 compute-0 podman[198815]: time="2026-01-22T00:39:03Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 22 00:39:03 compute-0 podman[198815]: @ - - [22/Jan/2026:00:39:03 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 21520 "" "Go-http-client/1.1"
Jan 22 00:39:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:03.237 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:03.238 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:03.238 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:05 compute-0 nova_compute[182935]: 2026-01-22 00:39:05.725 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:06 compute-0 nova_compute[182935]: 2026-01-22 00:39:06.071 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:06 compute-0 nova_compute[182935]: 2026-01-22 00:39:06.071 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:39:06 compute-0 nova_compute[182935]: 2026-01-22 00:39:06.072 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:39:06 compute-0 nova_compute[182935]: 2026-01-22 00:39:06.256 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:39:06 compute-0 nova_compute[182935]: 2026-01-22 00:39:06.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:06 compute-0 nova_compute[182935]: 2026-01-22 00:39:06.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:39:06 compute-0 nova_compute[182935]: 2026-01-22 00:39:06.998 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:07.409 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:39:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:07.410 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:39:07 compute-0 nova_compute[182935]: 2026-01-22 00:39:07.410 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:08 compute-0 nova_compute[182935]: 2026-01-22 00:39:08.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:10 compute-0 nova_compute[182935]: 2026-01-22 00:39:10.726 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:12 compute-0 nova_compute[182935]: 2026-01-22 00:39:12.000 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:12 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:12.412 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:12 compute-0 nova_compute[182935]: 2026-01-22 00:39:12.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:15 compute-0 nova_compute[182935]: 2026-01-22 00:39:15.774 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:16 compute-0 nova_compute[182935]: 2026-01-22 00:39:16.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:17 compute-0 nova_compute[182935]: 2026-01-22 00:39:17.057 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:17 compute-0 nova_compute[182935]: 2026-01-22 00:39:17.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:17 compute-0 nova_compute[182935]: 2026-01-22 00:39:17.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:18 compute-0 nova_compute[182935]: 2026-01-22 00:39:18.787 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:39:20 compute-0 podman[244742]: 2026-01-22 00:39:20.679685054 +0000 UTC m=+0.053606074 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:39:20 compute-0 podman[244744]: 2026-01-22 00:39:20.686113808 +0000 UTC m=+0.051257238 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:39:20 compute-0 podman[244743]: 2026-01-22 00:39:20.71373113 +0000 UTC m=+0.082053756 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 22 00:39:20 compute-0 nova_compute[182935]: 2026-01-22 00:39:20.813 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:22 compute-0 nova_compute[182935]: 2026-01-22 00:39:22.060 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:25 compute-0 nova_compute[182935]: 2026-01-22 00:39:25.814 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:27 compute-0 nova_compute[182935]: 2026-01-22 00:39:27.108 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:29 compute-0 podman[244815]: 2026-01-22 00:39:29.689936184 +0000 UTC m=+0.063031930 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:39:30 compute-0 nova_compute[182935]: 2026-01-22 00:39:30.816 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:32 compute-0 nova_compute[182935]: 2026-01-22 00:39:32.112 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:33 compute-0 podman[244835]: 2026-01-22 00:39:33.671736744 +0000 UTC m=+0.050404667 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:39:33 compute-0 podman[244836]: 2026-01-22 00:39:33.680525745 +0000 UTC m=+0.055658294 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:39:35 compute-0 nova_compute[182935]: 2026-01-22 00:39:35.856 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:37 compute-0 nova_compute[182935]: 2026-01-22 00:39:37.163 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:38 compute-0 sshd-session[244875]: Invalid user nginx from 188.166.69.60 port 54466
Jan 22 00:39:38 compute-0 sshd-session[244875]: Connection closed by invalid user nginx 188.166.69.60 port 54466 [preauth]
Jan 22 00:39:40 compute-0 nova_compute[182935]: 2026-01-22 00:39:40.899 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:42 compute-0 nova_compute[182935]: 2026-01-22 00:39:42.199 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:45 compute-0 nova_compute[182935]: 2026-01-22 00:39:45.901 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:47 compute-0 nova_compute[182935]: 2026-01-22 00:39:47.200 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.091 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.092 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.116 182939 DEBUG nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.288 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.289 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.296 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.296 182939 INFO nova.compute.claims [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.420 182939 DEBUG nova.compute.provider_tree [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.464 182939 DEBUG nova.scheduler.client.report [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.496 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.496 182939 DEBUG nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.566 182939 DEBUG nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.567 182939 DEBUG nova.network.neutron [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.584 182939 INFO nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.606 182939 DEBUG nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.713 182939 DEBUG nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.714 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.715 182939 INFO nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Creating image(s)
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.715 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.716 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.716 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.732 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.796 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.797 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.798 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.810 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.866 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.867 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.899 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.900 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.901 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.957 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.958 182939 DEBUG nova.virt.disk.api [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:39:48 compute-0 nova_compute[182935]: 2026-01-22 00:39:48.959 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:39:49 compute-0 nova_compute[182935]: 2026-01-22 00:39:49.014 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:39:49 compute-0 nova_compute[182935]: 2026-01-22 00:39:49.015 182939 DEBUG nova.virt.disk.api [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:39:49 compute-0 nova_compute[182935]: 2026-01-22 00:39:49.015 182939 DEBUG nova.objects.instance [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid e5e2845b-3703-4c14-8ea6-9c2553e54198 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:39:49 compute-0 nova_compute[182935]: 2026-01-22 00:39:49.159 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:39:49 compute-0 nova_compute[182935]: 2026-01-22 00:39:49.160 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Ensure instance console log exists: /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:39:49 compute-0 nova_compute[182935]: 2026-01-22 00:39:49.160 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:49 compute-0 nova_compute[182935]: 2026-01-22 00:39:49.160 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:49 compute-0 nova_compute[182935]: 2026-01-22 00:39:49.161 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:49 compute-0 nova_compute[182935]: 2026-01-22 00:39:49.235 182939 DEBUG nova.policy [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:39:50 compute-0 nova_compute[182935]: 2026-01-22 00:39:50.491 182939 DEBUG nova.network.neutron [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Successfully created port: 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:39:50 compute-0 nova_compute[182935]: 2026-01-22 00:39:50.904 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:51 compute-0 nova_compute[182935]: 2026-01-22 00:39:51.585 182939 DEBUG nova.network.neutron [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Successfully created port: 68410b8d-352f-40ee-9abf-04ba4c6996ec _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:39:51 compute-0 podman[244892]: 2026-01-22 00:39:51.690480409 +0000 UTC m=+0.058714897 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:39:51 compute-0 podman[244894]: 2026-01-22 00:39:51.690614232 +0000 UTC m=+0.054470056 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:39:51 compute-0 podman[244893]: 2026-01-22 00:39:51.723998401 +0000 UTC m=+0.092257310 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:39:52 compute-0 nova_compute[182935]: 2026-01-22 00:39:52.201 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:53 compute-0 nova_compute[182935]: 2026-01-22 00:39:53.525 182939 DEBUG nova.network.neutron [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Successfully updated port: 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:39:53 compute-0 nova_compute[182935]: 2026-01-22 00:39:53.663 182939 DEBUG nova.compute.manager [req-0400514b-4b35-48ce-bd47-a07d8279b81a req-55da28f1-c370-4e8d-91db-ad13ae9dcbfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-changed-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:39:53 compute-0 nova_compute[182935]: 2026-01-22 00:39:53.663 182939 DEBUG nova.compute.manager [req-0400514b-4b35-48ce-bd47-a07d8279b81a req-55da28f1-c370-4e8d-91db-ad13ae9dcbfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Refreshing instance network info cache due to event network-changed-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:39:53 compute-0 nova_compute[182935]: 2026-01-22 00:39:53.664 182939 DEBUG oslo_concurrency.lockutils [req-0400514b-4b35-48ce-bd47-a07d8279b81a req-55da28f1-c370-4e8d-91db-ad13ae9dcbfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:39:53 compute-0 nova_compute[182935]: 2026-01-22 00:39:53.664 182939 DEBUG oslo_concurrency.lockutils [req-0400514b-4b35-48ce-bd47-a07d8279b81a req-55da28f1-c370-4e8d-91db-ad13ae9dcbfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:39:53 compute-0 nova_compute[182935]: 2026-01-22 00:39:53.664 182939 DEBUG nova.network.neutron [req-0400514b-4b35-48ce-bd47-a07d8279b81a req-55da28f1-c370-4e8d-91db-ad13ae9dcbfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Refreshing network info cache for port 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:39:53 compute-0 nova_compute[182935]: 2026-01-22 00:39:53.945 182939 DEBUG nova.network.neutron [req-0400514b-4b35-48ce-bd47-a07d8279b81a req-55da28f1-c370-4e8d-91db-ad13ae9dcbfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:39:54 compute-0 nova_compute[182935]: 2026-01-22 00:39:54.370 182939 DEBUG nova.network.neutron [req-0400514b-4b35-48ce-bd47-a07d8279b81a req-55da28f1-c370-4e8d-91db-ad13ae9dcbfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:39:54 compute-0 nova_compute[182935]: 2026-01-22 00:39:54.386 182939 DEBUG oslo_concurrency.lockutils [req-0400514b-4b35-48ce-bd47-a07d8279b81a req-55da28f1-c370-4e8d-91db-ad13ae9dcbfa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:39:54 compute-0 nova_compute[182935]: 2026-01-22 00:39:54.530 182939 DEBUG nova.network.neutron [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Successfully updated port: 68410b8d-352f-40ee-9abf-04ba4c6996ec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:39:54 compute-0 nova_compute[182935]: 2026-01-22 00:39:54.551 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:39:54 compute-0 nova_compute[182935]: 2026-01-22 00:39:54.551 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:39:54 compute-0 nova_compute[182935]: 2026-01-22 00:39:54.552 182939 DEBUG nova.network.neutron [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:39:54 compute-0 nova_compute[182935]: 2026-01-22 00:39:54.693 182939 DEBUG nova.network.neutron [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:39:55 compute-0 nova_compute[182935]: 2026-01-22 00:39:55.780 182939 DEBUG nova.compute.manager [req-55a34152-2f58-4772-b2fb-9d1b408ec2e7 req-e2445691-63ff-4891-9861-a465e5710260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-changed-68410b8d-352f-40ee-9abf-04ba4c6996ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:39:55 compute-0 nova_compute[182935]: 2026-01-22 00:39:55.780 182939 DEBUG nova.compute.manager [req-55a34152-2f58-4772-b2fb-9d1b408ec2e7 req-e2445691-63ff-4891-9861-a465e5710260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Refreshing instance network info cache due to event network-changed-68410b8d-352f-40ee-9abf-04ba4c6996ec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:39:55 compute-0 nova_compute[182935]: 2026-01-22 00:39:55.781 182939 DEBUG oslo_concurrency.lockutils [req-55a34152-2f58-4772-b2fb-9d1b408ec2e7 req-e2445691-63ff-4891-9861-a465e5710260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:39:55 compute-0 nova_compute[182935]: 2026-01-22 00:39:55.905 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.798 182939 DEBUG nova.network.neutron [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updating instance_info_cache with network_info: [{"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.822 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.823 182939 DEBUG nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Instance network_info: |[{"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.824 182939 DEBUG oslo_concurrency.lockutils [req-55a34152-2f58-4772-b2fb-9d1b408ec2e7 req-e2445691-63ff-4891-9861-a465e5710260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.825 182939 DEBUG nova.network.neutron [req-55a34152-2f58-4772-b2fb-9d1b408ec2e7 req-e2445691-63ff-4891-9861-a465e5710260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Refreshing network info cache for port 68410b8d-352f-40ee-9abf-04ba4c6996ec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.832 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Start _get_guest_xml network_info=[{"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.839 182939 WARNING nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.844 182939 DEBUG nova.virt.libvirt.host [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.845 182939 DEBUG nova.virt.libvirt.host [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.857 182939 DEBUG nova.virt.libvirt.host [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.858 182939 DEBUG nova.virt.libvirt.host [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.860 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.861 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.862 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.863 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.863 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.864 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.864 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.865 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.866 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.867 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.867 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.868 182939 DEBUG nova.virt.hardware [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.878 182939 DEBUG nova.virt.libvirt.vif [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:39:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1528864554',display_name='tempest-TestGettingAddress-server-1528864554',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1528864554',id=177,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-31l429uo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:39:48Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=e5e2845b-3703-4c14-8ea6-9c2553e54198,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.879 182939 DEBUG nova.network.os_vif_util [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.881 182939 DEBUG nova.network.os_vif_util [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:4e:dd,bridge_name='br-int',has_traffic_filtering=True,id=05861bb8-a81c-4f72-af9d-ec27c5f9f6e2,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05861bb8-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.882 182939 DEBUG nova.virt.libvirt.vif [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:39:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1528864554',display_name='tempest-TestGettingAddress-server-1528864554',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1528864554',id=177,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-31l429uo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:39:48Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=e5e2845b-3703-4c14-8ea6-9c2553e54198,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.883 182939 DEBUG nova.network.os_vif_util [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.884 182939 DEBUG nova.network.os_vif_util [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:61:b4,bridge_name='br-int',has_traffic_filtering=True,id=68410b8d-352f-40ee-9abf-04ba4c6996ec,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68410b8d-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.886 182939 DEBUG nova.objects.instance [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid e5e2845b-3703-4c14-8ea6-9c2553e54198 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.918 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:39:56 compute-0 nova_compute[182935]:   <uuid>e5e2845b-3703-4c14-8ea6-9c2553e54198</uuid>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   <name>instance-000000b1</name>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <nova:name>tempest-TestGettingAddress-server-1528864554</nova:name>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:39:56</nova:creationTime>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:39:56 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:39:56 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:39:56 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:39:56 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:39:56 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:39:56 compute-0 nova_compute[182935]:         <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 22 00:39:56 compute-0 nova_compute[182935]:         <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:39:56 compute-0 nova_compute[182935]:         <nova:port uuid="05861bb8-a81c-4f72-af9d-ec27c5f9f6e2">
Jan 22 00:39:56 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:39:56 compute-0 nova_compute[182935]:         <nova:port uuid="68410b8d-352f-40ee-9abf-04ba4c6996ec">
Jan 22 00:39:56 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fecf:61b4" ipVersion="6"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <system>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <entry name="serial">e5e2845b-3703-4c14-8ea6-9c2553e54198</entry>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <entry name="uuid">e5e2845b-3703-4c14-8ea6-9c2553e54198</entry>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     </system>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   <os>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   </os>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   <features>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   </features>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.config"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:bd:4e:dd"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <target dev="tap05861bb8-a8"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:cf:61:b4"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <target dev="tap68410b8d-35"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/console.log" append="off"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <video>
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     </video>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:39:56 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:39:56 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:39:56 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:39:56 compute-0 nova_compute[182935]: </domain>
Jan 22 00:39:56 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.920 182939 DEBUG nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Preparing to wait for external event network-vif-plugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.920 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.920 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.920 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.921 182939 DEBUG nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Preparing to wait for external event network-vif-plugged-68410b8d-352f-40ee-9abf-04ba4c6996ec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.921 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.922 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.922 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.923 182939 DEBUG nova.virt.libvirt.vif [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:39:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1528864554',display_name='tempest-TestGettingAddress-server-1528864554',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1528864554',id=177,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-31l429uo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:39:48Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=e5e2845b-3703-4c14-8ea6-9c2553e54198,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.923 182939 DEBUG nova.network.os_vif_util [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.924 182939 DEBUG nova.network.os_vif_util [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:4e:dd,bridge_name='br-int',has_traffic_filtering=True,id=05861bb8-a81c-4f72-af9d-ec27c5f9f6e2,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05861bb8-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.924 182939 DEBUG os_vif [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:4e:dd,bridge_name='br-int',has_traffic_filtering=True,id=05861bb8-a81c-4f72-af9d-ec27c5f9f6e2,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05861bb8-a8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.925 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.925 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.926 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.929 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.930 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap05861bb8-a8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.930 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap05861bb8-a8, col_values=(('external_ids', {'iface-id': '05861bb8-a81c-4f72-af9d-ec27c5f9f6e2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:4e:dd', 'vm-uuid': 'e5e2845b-3703-4c14-8ea6-9c2553e54198'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.969 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:56 compute-0 NetworkManager[55139]: <info>  [1769042396.9706] manager: (tap05861bb8-a8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/358)
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.973 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.979 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.980 182939 INFO os_vif [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:4e:dd,bridge_name='br-int',has_traffic_filtering=True,id=05861bb8-a81c-4f72-af9d-ec27c5f9f6e2,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05861bb8-a8')
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.981 182939 DEBUG nova.virt.libvirt.vif [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:39:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1528864554',display_name='tempest-TestGettingAddress-server-1528864554',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1528864554',id=177,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-31l429uo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:39:48Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=e5e2845b-3703-4c14-8ea6-9c2553e54198,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.981 182939 DEBUG nova.network.os_vif_util [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.982 182939 DEBUG nova.network.os_vif_util [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:61:b4,bridge_name='br-int',has_traffic_filtering=True,id=68410b8d-352f-40ee-9abf-04ba4c6996ec,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68410b8d-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.983 182939 DEBUG os_vif [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:61:b4,bridge_name='br-int',has_traffic_filtering=True,id=68410b8d-352f-40ee-9abf-04ba4c6996ec,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68410b8d-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.983 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.983 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.983 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.985 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.985 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap68410b8d-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.986 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap68410b8d-35, col_values=(('external_ids', {'iface-id': '68410b8d-352f-40ee-9abf-04ba4c6996ec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:61:b4', 'vm-uuid': 'e5e2845b-3703-4c14-8ea6-9c2553e54198'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.987 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:56 compute-0 NetworkManager[55139]: <info>  [1769042396.9879] manager: (tap68410b8d-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/359)
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.989 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.996 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:56 compute-0 nova_compute[182935]: 2026-01-22 00:39:56.996 182939 INFO os_vif [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:61:b4,bridge_name='br-int',has_traffic_filtering=True,id=68410b8d-352f-40ee-9abf-04ba4c6996ec,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68410b8d-35')
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.097 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.097 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.098 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:bd:4e:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.098 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:cf:61:b4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.098 182939 INFO nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Using config drive
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.402 182939 INFO nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Creating config drive at /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.config
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.411 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprguqozgv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.558 182939 DEBUG oslo_concurrency.processutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprguqozgv" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:39:57 compute-0 kernel: tap05861bb8-a8: entered promiscuous mode
Jan 22 00:39:57 compute-0 NetworkManager[55139]: <info>  [1769042397.6313] manager: (tap05861bb8-a8): new Tun device (/org/freedesktop/NetworkManager/Devices/360)
Jan 22 00:39:57 compute-0 ovn_controller[95047]: 2026-01-22T00:39:57Z|00727|binding|INFO|Claiming lport 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 for this chassis.
Jan 22 00:39:57 compute-0 ovn_controller[95047]: 2026-01-22T00:39:57Z|00728|binding|INFO|05861bb8-a81c-4f72-af9d-ec27c5f9f6e2: Claiming fa:16:3e:bd:4e:dd 10.100.0.5
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.635 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:57 compute-0 kernel: tap68410b8d-35: entered promiscuous mode
Jan 22 00:39:57 compute-0 NetworkManager[55139]: <info>  [1769042397.6483] manager: (tap68410b8d-35): new Tun device (/org/freedesktop/NetworkManager/Devices/361)
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.660 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:4e:dd 10.100.0.5'], port_security=['fa:16:3e:bd:4e:dd 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d60c1e89-37d5-4a05-b566-04735ac9e501', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0b1cab7e-3d92-45e3-88fe-6266af987fc5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e88081bc-c33e-4f29-8ba8-cdfb76dc2a31, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=05861bb8-a81c-4f72-af9d-ec27c5f9f6e2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.661 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 in datapath d60c1e89-37d5-4a05-b566-04735ac9e501 bound to our chassis
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.663 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d60c1e89-37d5-4a05-b566-04735ac9e501
Jan 22 00:39:57 compute-0 systemd-udevd[244988]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:39:57 compute-0 systemd-udevd[244989]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.677 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7a969b2e-80f2-4110-9f48-3ccbe6fa38df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.678 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd60c1e89-31 in ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.681 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd60c1e89-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.681 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1182e547-526a-4359-987a-fd47332c8cc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 NetworkManager[55139]: <info>  [1769042397.6848] device (tap05861bb8-a8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:39:57 compute-0 NetworkManager[55139]: <info>  [1769042397.6853] device (tap05861bb8-a8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:39:57 compute-0 NetworkManager[55139]: <info>  [1769042397.6883] device (tap68410b8d-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.684 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab18a42-42e3-4e71-ba17-fa4d1ccb3f4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 NetworkManager[55139]: <info>  [1769042397.6887] device (tap68410b8d-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.696 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[522c4e01-d004-41d4-be3c-79a4f90c2a14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 systemd-machined[154182]: New machine qemu-91-instance-000000b1.
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.728 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3d063df7-8605-4580-a630-94708d700b68]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 ovn_controller[95047]: 2026-01-22T00:39:57Z|00729|binding|INFO|Claiming lport 68410b8d-352f-40ee-9abf-04ba4c6996ec for this chassis.
Jan 22 00:39:57 compute-0 ovn_controller[95047]: 2026-01-22T00:39:57Z|00730|binding|INFO|68410b8d-352f-40ee-9abf-04ba4c6996ec: Claiming fa:16:3e:cf:61:b4 2001:db8::f816:3eff:fecf:61b4
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.747 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.749 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:57 compute-0 systemd[1]: Started Virtual Machine qemu-91-instance-000000b1.
Jan 22 00:39:57 compute-0 ovn_controller[95047]: 2026-01-22T00:39:57Z|00731|binding|INFO|Setting lport 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 ovn-installed in OVS
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.756 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.763 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[c476091a-5923-4ebc-9f49-90bfa4652f06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 ovn_controller[95047]: 2026-01-22T00:39:57Z|00732|binding|INFO|Setting lport 68410b8d-352f-40ee-9abf-04ba4c6996ec ovn-installed in OVS
Jan 22 00:39:57 compute-0 NetworkManager[55139]: <info>  [1769042397.7696] manager: (tapd60c1e89-30): new Veth device (/org/freedesktop/NetworkManager/Devices/362)
Jan 22 00:39:57 compute-0 systemd-udevd[244994]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.771 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[30786756-e8ec-454f-8983-8ee1dbd47c96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.772 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.803 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d762f5be-5d41-42b8-8a71-1c99a8376732]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.806 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[bfda84a6-870d-4aa9-a1f9-c85a5b5e60dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 NetworkManager[55139]: <info>  [1769042397.8280] device (tapd60c1e89-30): carrier: link connected
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.834 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c895a9-7275-4b7c-a579-045adacd6ff4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.850 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f3335f18-0daf-4d32-b6fa-93194d1dab1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd60c1e89-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:c3:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 690056, 'reachable_time': 31437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245025, 'error': None, 'target': 'ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.864 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[16f3316a-620a-4974-ad14-67637bd8c0f5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef1:c38b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 690056, 'tstamp': 690056}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245026, 'error': None, 'target': 'ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.879 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[54e001ce-9b8e-4aa4-8fad-2a5eca0363a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd60c1e89-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f1:c3:8b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 690056, 'reachable_time': 31437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245027, 'error': None, 'target': 'ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.907 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cb54b543-7ed4-4fd7-af99-b658abe4c19b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 ovn_controller[95047]: 2026-01-22T00:39:57Z|00733|binding|INFO|Setting lport 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 up in Southbound
Jan 22 00:39:57 compute-0 ovn_controller[95047]: 2026-01-22T00:39:57Z|00734|binding|INFO|Setting lport 68410b8d-352f-40ee-9abf-04ba4c6996ec up in Southbound
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.940 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:61:b4 2001:db8::f816:3eff:fecf:61b4'], port_security=['fa:16:3e:cf:61:b4 2001:db8::f816:3eff:fecf:61b4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fecf:61b4/64', 'neutron:device_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0b1cab7e-3d92-45e3-88fe-6266af987fc5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29e7c9c6-d993-479c-815c-b28e8e044cd0, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=68410b8d-352f-40ee-9abf-04ba4c6996ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.965 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab84193-6e72-427d-958a-bff0115d1088]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.966 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd60c1e89-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.967 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.967 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd60c1e89-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.969 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:57 compute-0 NetworkManager[55139]: <info>  [1769042397.9700] manager: (tapd60c1e89-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/363)
Jan 22 00:39:57 compute-0 kernel: tapd60c1e89-30: entered promiscuous mode
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.972 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.973 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd60c1e89-30, col_values=(('external_ids', {'iface-id': 'e129da0f-abdd-47af-b02c-0b124db30d95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.974 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:57 compute-0 ovn_controller[95047]: 2026-01-22T00:39:57Z|00735|binding|INFO|Releasing lport e129da0f-abdd-47af-b02c-0b124db30d95 from this chassis (sb_readonly=0)
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.975 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.976 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d60c1e89-37d5-4a05-b566-04735ac9e501.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d60c1e89-37d5-4a05-b566-04735ac9e501.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.980 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2a284d6b-f57f-42c3-9ba2-f181d578022d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.981 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-d60c1e89-37d5-4a05-b566-04735ac9e501
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/d60c1e89-37d5-4a05-b566-04735ac9e501.pid.haproxy
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID d60c1e89-37d5-4a05-b566-04735ac9e501
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:39:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:57.982 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501', 'env', 'PROCESS_TAG=haproxy-d60c1e89-37d5-4a05-b566-04735ac9e501', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d60c1e89-37d5-4a05-b566-04735ac9e501.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:39:57 compute-0 nova_compute[182935]: 2026-01-22 00:39:57.986 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.076 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042398.0760112, e5e2845b-3703-4c14-8ea6-9c2553e54198 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.077 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] VM Started (Lifecycle Event)
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.094 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.099 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042398.0765662, e5e2845b-3703-4c14-8ea6-9c2553e54198 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.099 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] VM Paused (Lifecycle Event)
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.120 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.123 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.141 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:39:58 compute-0 podman[245065]: 2026-01-22 00:39:58.349468605 +0000 UTC m=+0.049133297 container create f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.380 182939 DEBUG nova.compute.manager [req-1ee8f083-be7d-44fb-a7df-e22daf83b078 req-e7bc4d55-e612-4ee5-8af8-fe57436be3a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-plugged-68410b8d-352f-40ee-9abf-04ba4c6996ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.381 182939 DEBUG oslo_concurrency.lockutils [req-1ee8f083-be7d-44fb-a7df-e22daf83b078 req-e7bc4d55-e612-4ee5-8af8-fe57436be3a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.381 182939 DEBUG oslo_concurrency.lockutils [req-1ee8f083-be7d-44fb-a7df-e22daf83b078 req-e7bc4d55-e612-4ee5-8af8-fe57436be3a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.381 182939 DEBUG oslo_concurrency.lockutils [req-1ee8f083-be7d-44fb-a7df-e22daf83b078 req-e7bc4d55-e612-4ee5-8af8-fe57436be3a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.381 182939 DEBUG nova.compute.manager [req-1ee8f083-be7d-44fb-a7df-e22daf83b078 req-e7bc4d55-e612-4ee5-8af8-fe57436be3a6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Processing event network-vif-plugged-68410b8d-352f-40ee-9abf-04ba4c6996ec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:39:58 compute-0 systemd[1]: Started libpod-conmon-f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598.scope.
Jan 22 00:39:58 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:39:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90228c67d09372b2673bed0d1bdd33088731cd79b06341f9fc24f8d8db011434/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:39:58 compute-0 podman[245065]: 2026-01-22 00:39:58.324600409 +0000 UTC m=+0.024265091 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:39:58 compute-0 podman[245065]: 2026-01-22 00:39:58.426516669 +0000 UTC m=+0.126181361 container init f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 22 00:39:58 compute-0 podman[245065]: 2026-01-22 00:39:58.431735625 +0000 UTC m=+0.131400297 container start f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:39:58 compute-0 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[245081]: [NOTICE]   (245085) : New worker (245087) forked
Jan 22 00:39:58 compute-0 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[245081]: [NOTICE]   (245085) : Loading success.
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.487 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 68410b8d-352f-40ee-9abf-04ba4c6996ec in datapath 65bd5007-25fc-43be-bec0-20ff1d1f0a79 unbound from our chassis
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.489 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 65bd5007-25fc-43be-bec0-20ff1d1f0a79
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.500 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ef4f6daa-e81a-4450-8c30-25466f0485cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.501 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap65bd5007-21 in ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.503 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap65bd5007-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.503 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3f3b9dd6-d877-4c4c-93e3-1a109c5d44e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.504 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ba50f0e8-c416-4af0-9d07-aed456e0336c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.514 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[a573d1f3-62ee-4c15-bda7-358c7814526d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.522 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.523 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.539 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d9dedd6a-fb7c-4b1f-9dd9-d5a43a0cca25]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.540 182939 DEBUG nova.compute.manager [req-fc1ce5c9-4138-4f21-a742-537a0d9774db req-7f9b929a-3279-4d8e-84e5-716181c2aaa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-plugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.541 182939 DEBUG oslo_concurrency.lockutils [req-fc1ce5c9-4138-4f21-a742-537a0d9774db req-7f9b929a-3279-4d8e-84e5-716181c2aaa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.541 182939 DEBUG oslo_concurrency.lockutils [req-fc1ce5c9-4138-4f21-a742-537a0d9774db req-7f9b929a-3279-4d8e-84e5-716181c2aaa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.541 182939 DEBUG oslo_concurrency.lockutils [req-fc1ce5c9-4138-4f21-a742-537a0d9774db req-7f9b929a-3279-4d8e-84e5-716181c2aaa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.541 182939 DEBUG nova.compute.manager [req-fc1ce5c9-4138-4f21-a742-537a0d9774db req-7f9b929a-3279-4d8e-84e5-716181c2aaa5 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Processing event network-vif-plugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.542 182939 DEBUG nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.546 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042398.5457647, e5e2845b-3703-4c14-8ea6-9c2553e54198 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.547 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] VM Resumed (Lifecycle Event)
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.548 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.552 182939 INFO nova.virt.libvirt.driver [-] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Instance spawned successfully.
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.552 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.568 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.573 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.576 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.576 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.577 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.577 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.577 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd539a4-d897-4b07-8825-ca0e97f0ecaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.577 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.578 182939 DEBUG nova.virt.libvirt.driver [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:39:58 compute-0 systemd-udevd[245017]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:39:58 compute-0 NetworkManager[55139]: <info>  [1769042398.5855] manager: (tap65bd5007-20): new Veth device (/org/freedesktop/NetworkManager/Devices/364)
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.584 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[61e70ba2-cc59-45d0-b8ec-59bc4d587aed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.605 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.622 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[98c53908-152c-483a-8418-23b314eb6592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.626 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[75ede5e6-6575-40cc-94cb-4e1a9e3af708]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.633 182939 DEBUG nova.network.neutron [req-55a34152-2f58-4772-b2fb-9d1b408ec2e7 req-e2445691-63ff-4891-9861-a465e5710260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updated VIF entry in instance network info cache for port 68410b8d-352f-40ee-9abf-04ba4c6996ec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.633 182939 DEBUG nova.network.neutron [req-55a34152-2f58-4772-b2fb-9d1b408ec2e7 req-e2445691-63ff-4891-9861-a465e5710260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updating instance_info_cache with network_info: [{"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:39:58 compute-0 NetworkManager[55139]: <info>  [1769042398.6510] device (tap65bd5007-20): carrier: link connected
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.656 182939 INFO nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Took 9.94 seconds to spawn the instance on the hypervisor.
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.657 182939 DEBUG nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.658 182939 DEBUG oslo_concurrency.lockutils [req-55a34152-2f58-4772-b2fb-9d1b408ec2e7 req-e2445691-63ff-4891-9861-a465e5710260 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.660 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[2a7adbe6-fc90-4cd8-807e-203760926e71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.678 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c677c38a-3122-4f22-82ae-09719f0496b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65bd5007-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:88:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 234], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 690138, 'reachable_time': 39846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245106, 'error': None, 'target': 'ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.693 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[66677ebf-de20-4861-a794-1c7a6b195fa7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe05:88ac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 690138, 'tstamp': 690138}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245107, 'error': None, 'target': 'ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.710 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[68950c85-37ce-4490-91e3-4270f7dc734b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65bd5007-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:88:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 234], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 690138, 'reachable_time': 39846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245108, 'error': None, 'target': 'ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.742 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3c587e29-ffcd-42ac-b13b-0c8d096614d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.743 182939 INFO nova.compute.manager [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Took 10.56 seconds to build instance.
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.763 182939 DEBUG oslo_concurrency.lockutils [None req-d721e2bc-3d45-4679-86cc-f179fccd010f a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.780 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a49ee5e3-3945-414d-9896-2a52b3820b9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.781 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65bd5007-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.782 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.782 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65bd5007-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.784 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:58 compute-0 kernel: tap65bd5007-20: entered promiscuous mode
Jan 22 00:39:58 compute-0 NetworkManager[55139]: <info>  [1769042398.7850] manager: (tap65bd5007-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/365)
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.787 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap65bd5007-20, col_values=(('external_ids', {'iface-id': 'e79ede92-32a1-4758-b6f9-877484a6ca25'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.788 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:58 compute-0 ovn_controller[95047]: 2026-01-22T00:39:58Z|00736|binding|INFO|Releasing lport e79ede92-32a1-4758-b6f9-877484a6ca25 from this chassis (sb_readonly=0)
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.790 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.790 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/65bd5007-25fc-43be-bec0-20ff1d1f0a79.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/65bd5007-25fc-43be-bec0-20ff1d1f0a79.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.791 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0903d81c-a43f-4c4a-8d6e-1f0fe81141ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.792 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-65bd5007-25fc-43be-bec0-20ff1d1f0a79
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/65bd5007-25fc-43be-bec0-20ff1d1f0a79.pid.haproxy
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 65bd5007-25fc-43be-bec0-20ff1d1f0a79
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:39:58 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:58.793 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'env', 'PROCESS_TAG=haproxy-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/65bd5007-25fc-43be-bec0-20ff1d1f0a79.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:39:58 compute-0 nova_compute[182935]: 2026-01-22 00:39:58.801 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:39:59 compute-0 podman[245138]: 2026-01-22 00:39:59.146103737 +0000 UTC m=+0.057583629 container create 10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:39:59 compute-0 systemd[1]: Started libpod-conmon-10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303.scope.
Jan 22 00:39:59 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:39:59 compute-0 podman[245138]: 2026-01-22 00:39:59.118777963 +0000 UTC m=+0.030257875 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:39:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0d2abe7fbffa0fad5e15ccc39320b731ed459fba080bc262009bc8f9e1489fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:39:59 compute-0 podman[245138]: 2026-01-22 00:39:59.232234279 +0000 UTC m=+0.143714191 container init 10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 22 00:39:59 compute-0 podman[245138]: 2026-01-22 00:39:59.237110896 +0000 UTC m=+0.148590788 container start 10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 22 00:39:59 compute-0 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[245153]: [NOTICE]   (245159) : New worker (245161) forked
Jan 22 00:39:59 compute-0 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[245153]: [NOTICE]   (245159) : Loading success.
Jan 22 00:39:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:39:59.291 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:40:00 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:40:00.294 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.470 182939 DEBUG nova.compute.manager [req-c59c5afe-e432-48e9-aeea-c20936e29206 req-3ad8ee6a-c6bd-450a-a3c9-6cd17832993f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-plugged-68410b8d-352f-40ee-9abf-04ba4c6996ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.470 182939 DEBUG oslo_concurrency.lockutils [req-c59c5afe-e432-48e9-aeea-c20936e29206 req-3ad8ee6a-c6bd-450a-a3c9-6cd17832993f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.470 182939 DEBUG oslo_concurrency.lockutils [req-c59c5afe-e432-48e9-aeea-c20936e29206 req-3ad8ee6a-c6bd-450a-a3c9-6cd17832993f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.471 182939 DEBUG oslo_concurrency.lockutils [req-c59c5afe-e432-48e9-aeea-c20936e29206 req-3ad8ee6a-c6bd-450a-a3c9-6cd17832993f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.471 182939 DEBUG nova.compute.manager [req-c59c5afe-e432-48e9-aeea-c20936e29206 req-3ad8ee6a-c6bd-450a-a3c9-6cd17832993f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] No waiting events found dispatching network-vif-plugged-68410b8d-352f-40ee-9abf-04ba4c6996ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.471 182939 WARNING nova.compute.manager [req-c59c5afe-e432-48e9-aeea-c20936e29206 req-3ad8ee6a-c6bd-450a-a3c9-6cd17832993f 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received unexpected event network-vif-plugged-68410b8d-352f-40ee-9abf-04ba4c6996ec for instance with vm_state active and task_state None.
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.628 182939 DEBUG nova.compute.manager [req-2f6c253b-f0a5-49c8-91fa-7f4a237bbdeb req-61ec1288-93f3-411d-aaed-5738869e0658 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-plugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.628 182939 DEBUG oslo_concurrency.lockutils [req-2f6c253b-f0a5-49c8-91fa-7f4a237bbdeb req-61ec1288-93f3-411d-aaed-5738869e0658 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.628 182939 DEBUG oslo_concurrency.lockutils [req-2f6c253b-f0a5-49c8-91fa-7f4a237bbdeb req-61ec1288-93f3-411d-aaed-5738869e0658 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.628 182939 DEBUG oslo_concurrency.lockutils [req-2f6c253b-f0a5-49c8-91fa-7f4a237bbdeb req-61ec1288-93f3-411d-aaed-5738869e0658 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.628 182939 DEBUG nova.compute.manager [req-2f6c253b-f0a5-49c8-91fa-7f4a237bbdeb req-61ec1288-93f3-411d-aaed-5738869e0658 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] No waiting events found dispatching network-vif-plugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.629 182939 WARNING nova.compute.manager [req-2f6c253b-f0a5-49c8-91fa-7f4a237bbdeb req-61ec1288-93f3-411d-aaed-5738869e0658 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received unexpected event network-vif-plugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 for instance with vm_state active and task_state None.
Jan 22 00:40:00 compute-0 podman[245171]: 2026-01-22 00:40:00.678496305 +0000 UTC m=+0.056235648 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.814 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.814 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.814 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.815 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.894 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.954 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.955 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:40:00 compute-0 nova_compute[182935]: 2026-01-22 00:40:00.974 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.017 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.176 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.177 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5521MB free_disk=73.12188720703125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.178 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.178 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.265 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance e5e2845b-3703-4c14-8ea6-9c2553e54198 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.266 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.266 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.303 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.318 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.337 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.338 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:01 compute-0 nova_compute[182935]: 2026-01-22 00:40:01.988 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:40:03.238 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:40:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:40:03.239 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:40:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:40:03.240 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:40:04 compute-0 podman[245199]: 2026-01-22 00:40:04.69100506 +0000 UTC m=+0.064080105 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 00:40:04 compute-0 podman[245200]: 2026-01-22 00:40:04.712789952 +0000 UTC m=+0.075683794 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:40:05 compute-0 nova_compute[182935]: 2026-01-22 00:40:05.976 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:07 compute-0 nova_compute[182935]: 2026-01-22 00:40:07.033 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:07 compute-0 nova_compute[182935]: 2026-01-22 00:40:07.338 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:07 compute-0 nova_compute[182935]: 2026-01-22 00:40:07.339 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:40:07 compute-0 nova_compute[182935]: 2026-01-22 00:40:07.339 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:40:07 compute-0 nova_compute[182935]: 2026-01-22 00:40:07.494 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:40:07 compute-0 nova_compute[182935]: 2026-01-22 00:40:07.494 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:40:07 compute-0 nova_compute[182935]: 2026-01-22 00:40:07.494 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:40:07 compute-0 nova_compute[182935]: 2026-01-22 00:40:07.495 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e5e2845b-3703-4c14-8ea6-9c2553e54198 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:40:07 compute-0 nova_compute[182935]: 2026-01-22 00:40:07.595 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:07 compute-0 NetworkManager[55139]: <info>  [1769042407.5959] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/366)
Jan 22 00:40:07 compute-0 NetworkManager[55139]: <info>  [1769042407.5967] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Jan 22 00:40:07 compute-0 ovn_controller[95047]: 2026-01-22T00:40:07Z|00737|binding|INFO|Releasing lport e129da0f-abdd-47af-b02c-0b124db30d95 from this chassis (sb_readonly=0)
Jan 22 00:40:07 compute-0 ovn_controller[95047]: 2026-01-22T00:40:07Z|00738|binding|INFO|Releasing lport e79ede92-32a1-4758-b6f9-877484a6ca25 from this chassis (sb_readonly=0)
Jan 22 00:40:07 compute-0 nova_compute[182935]: 2026-01-22 00:40:07.623 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:07 compute-0 ovn_controller[95047]: 2026-01-22T00:40:07Z|00739|binding|INFO|Releasing lport e129da0f-abdd-47af-b02c-0b124db30d95 from this chassis (sb_readonly=0)
Jan 22 00:40:07 compute-0 ovn_controller[95047]: 2026-01-22T00:40:07Z|00740|binding|INFO|Releasing lport e79ede92-32a1-4758-b6f9-877484a6ca25 from this chassis (sb_readonly=0)
Jan 22 00:40:07 compute-0 nova_compute[182935]: 2026-01-22 00:40:07.629 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:08 compute-0 nova_compute[182935]: 2026-01-22 00:40:08.035 182939 DEBUG nova.compute.manager [req-a8a9484f-a444-457e-b652-1e491fcf081f req-cd7030c2-b9a0-4502-8cf4-eba1f405d358 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-changed-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:40:08 compute-0 nova_compute[182935]: 2026-01-22 00:40:08.036 182939 DEBUG nova.compute.manager [req-a8a9484f-a444-457e-b652-1e491fcf081f req-cd7030c2-b9a0-4502-8cf4-eba1f405d358 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Refreshing instance network info cache due to event network-changed-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:40:08 compute-0 nova_compute[182935]: 2026-01-22 00:40:08.037 182939 DEBUG oslo_concurrency.lockutils [req-a8a9484f-a444-457e-b652-1e491fcf081f req-cd7030c2-b9a0-4502-8cf4-eba1f405d358 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:40:09 compute-0 nova_compute[182935]: 2026-01-22 00:40:09.715 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updating instance_info_cache with network_info: [{"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:40:09 compute-0 nova_compute[182935]: 2026-01-22 00:40:09.735 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:40:09 compute-0 nova_compute[182935]: 2026-01-22 00:40:09.736 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:40:09 compute-0 nova_compute[182935]: 2026-01-22 00:40:09.736 182939 DEBUG oslo_concurrency.lockutils [req-a8a9484f-a444-457e-b652-1e491fcf081f req-cd7030c2-b9a0-4502-8cf4-eba1f405d358 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:40:09 compute-0 nova_compute[182935]: 2026-01-22 00:40:09.737 182939 DEBUG nova.network.neutron [req-a8a9484f-a444-457e-b652-1e491fcf081f req-cd7030c2-b9a0-4502-8cf4-eba1f405d358 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Refreshing network info cache for port 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:40:09 compute-0 nova_compute[182935]: 2026-01-22 00:40:09.738 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:09 compute-0 nova_compute[182935]: 2026-01-22 00:40:09.738 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:40:09 compute-0 nova_compute[182935]: 2026-01-22 00:40:09.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:10 compute-0 nova_compute[182935]: 2026-01-22 00:40:10.978 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:11 compute-0 nova_compute[182935]: 2026-01-22 00:40:11.217 182939 DEBUG nova.network.neutron [req-a8a9484f-a444-457e-b652-1e491fcf081f req-cd7030c2-b9a0-4502-8cf4-eba1f405d358 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updated VIF entry in instance network info cache for port 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:40:11 compute-0 nova_compute[182935]: 2026-01-22 00:40:11.218 182939 DEBUG nova.network.neutron [req-a8a9484f-a444-457e-b652-1e491fcf081f req-cd7030c2-b9a0-4502-8cf4-eba1f405d358 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updating instance_info_cache with network_info: [{"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:40:11 compute-0 nova_compute[182935]: 2026-01-22 00:40:11.249 182939 DEBUG oslo_concurrency.lockutils [req-a8a9484f-a444-457e-b652-1e491fcf081f req-cd7030c2-b9a0-4502-8cf4-eba1f405d358 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:40:11 compute-0 ovn_controller[95047]: 2026-01-22T00:40:11Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bd:4e:dd 10.100.0.5
Jan 22 00:40:11 compute-0 ovn_controller[95047]: 2026-01-22T00:40:11Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bd:4e:dd 10.100.0.5
Jan 22 00:40:12 compute-0 nova_compute[182935]: 2026-01-22 00:40:12.036 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:14 compute-0 nova_compute[182935]: 2026-01-22 00:40:14.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:15 compute-0 nova_compute[182935]: 2026-01-22 00:40:15.980 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:16 compute-0 nova_compute[182935]: 2026-01-22 00:40:16.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:17 compute-0 nova_compute[182935]: 2026-01-22 00:40:17.039 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:18 compute-0 nova_compute[182935]: 2026-01-22 00:40:18.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:18 compute-0 nova_compute[182935]: 2026-01-22 00:40:18.790 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:19 compute-0 nova_compute[182935]: 2026-01-22 00:40:19.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:19 compute-0 nova_compute[182935]: 2026-01-22 00:40:19.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:40:21 compute-0 nova_compute[182935]: 2026-01-22 00:40:21.013 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:21 compute-0 sshd-session[245251]: Invalid user nginx from 188.166.69.60 port 43094
Jan 22 00:40:21 compute-0 sshd-session[245251]: Connection closed by invalid user nginx 188.166.69.60 port 43094 [preauth]
Jan 22 00:40:22 compute-0 nova_compute[182935]: 2026-01-22 00:40:22.085 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:22 compute-0 podman[245253]: 2026-01-22 00:40:22.687107062 +0000 UTC m=+0.060509969 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:40:22 compute-0 podman[245254]: 2026-01-22 00:40:22.724872246 +0000 UTC m=+0.086694626 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:40:22 compute-0 podman[245255]: 2026-01-22 00:40:22.72501633 +0000 UTC m=+0.086131033 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.325 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'name': 'tempest-TestGettingAddress-server-1528864554', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b1', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '837db8748d074b3c9179b47d30e7a1d4', 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'hostId': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.326 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.329 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e5e2845b-3703-4c14-8ea6-9c2553e54198 / tap05861bb8-a8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.330 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e5e2845b-3703-4c14-8ea6-9c2553e54198 / tap68410b8d-35 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.331 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.331 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4dca064b-7db9-4c92-ae23-d6a8311af20c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap05861bb8-a8', 'timestamp': '2026-01-22T00:40:23.326910', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap05861bb8-a8', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:4e:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap05861bb8-a8'}, 'message_id': 'efdb8a5e-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': '4663fc43194d442adba0d2990bc3a71a3f06ba47ffe889370f006516a30a290e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap68410b8d-35', 'timestamp': '2026-01-22T00:40:23.326910', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap68410b8d-35', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:61:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68410b8d-35'}, 'message_id': 'efdba250-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': '552af107dc9096a1edf01ab7650ddde5fabbcb30b51259ae1e030e3c3f6b592b'}]}, 'timestamp': '2026-01-22 00:40:23.332396', '_unique_id': 'e02bad5cc18a4ad391733054d343e623'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.335 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.337 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.337 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.338 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0711338b-956c-42a4-be7a-b5080bb72c7e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap05861bb8-a8', 'timestamp': '2026-01-22T00:40:23.337686', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap05861bb8-a8', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:4e:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap05861bb8-a8'}, 'message_id': 'efdc86ca-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': 'd3ec6dd3fc86fd4c9acf8ebab9d1e93c92fc0d8a20dfa4963194eebe59c79d8a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap68410b8d-35', 'timestamp': '2026-01-22T00:40:23.337686', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap68410b8d-35', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:61:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68410b8d-35'}, 'message_id': 'efdc94f8-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': '27a26665adcb3cb7f6925e1c1101005d36c71a673e041b32564922c3493ff1f2'}]}, 'timestamp': '2026-01-22 00:40:23.338433', '_unique_id': 'e6856bfe26674a0990f4a06e3ff91aaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.339 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.340 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.351 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.351 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d66a236-320a-4dd9-9e25-c9bd065d0278', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-vda', 'timestamp': '2026-01-22T00:40:23.340341', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'efde9726-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.134640927, 'message_signature': '92b5cf63bb575f1ff3d0d67cf07c0358957233a27b8f84c462658dbbc514a4b4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 
'e5e2845b-3703-4c14-8ea6-9c2553e54198-sda', 'timestamp': '2026-01-22T00:40:23.340341', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'efdea464-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.134640927, 'message_signature': 'c14df6fcb8d02fcc557c069fcacf19bb75d2d4463cb862ff1b7926e35eeb69a2'}]}, 'timestamp': '2026-01-22 00:40:23.351968', '_unique_id': '7f3321bb307f4b1f8eff3bc31b2ba13e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.353 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.354 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1528864554>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1528864554>]
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.354 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.354 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.354 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09e9be17-0d3c-4837-b0f4-430deaece786', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap05861bb8-a8', 'timestamp': '2026-01-22T00:40:23.354272', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap05861bb8-a8', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:4e:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap05861bb8-a8'}, 'message_id': 'efdf097c-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': 'ea4f8f416c5cc361742697c8122e98844405927cd97520d1db47bcdf3a7009a0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap68410b8d-35', 'timestamp': '2026-01-22T00:40:23.354272', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap68410b8d-35', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:61:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68410b8d-35'}, 'message_id': 'efdf12aa-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': 'a8ae6a14e492ce1029430d1c3a47b67e46aacb8d020de3d764f7da320d6e3a01'}]}, 'timestamp': '2026-01-22 00:40:23.354748', '_unique_id': '5cbcf0dd943d4fd6a4c3fef2fc0ad826'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.355 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.356 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1528864554>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1528864554>]
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.356 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.356 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.356 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a64dcbd-31d9-401b-b593-a2f76e1d0811', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-vda', 'timestamp': '2026-01-22T00:40:23.356258', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'efdf5738-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.134640927, 'message_signature': '697e17ccff700199ecc6da2dca3d6442b856ed17d627a17bd53df3b592919b3c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-sda', 'timestamp': '2026-01-22T00:40:23.356258', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'efdf621e-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.134640927, 'message_signature': '9168c957e3c1f1489ba42972ea33a663f728c60b87e7e4238548b8b413ff97d3'}]}, 'timestamp': '2026-01-22 00:40:23.356831', '_unique_id': '8e0ca171b3d54c728be76882a91f1369'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.357 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.380 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.read.requests volume: 1098 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.380 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb7a6b4b-adf3-4c59-988a-8dd3ea470a02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1098, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-vda', 'timestamp': '2026-01-22T00:40:23.357998', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'efe30068-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': '00adfad77ddb775dd0f1e4e18e498c6269d1fad159594300a5c05ee69223881b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-sda', 'timestamp': '2026-01-22T00:40:23.357998', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'efe313d2-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': '077894e9196c94a3a07d666e160da104b6977dbfd4a350f16cdef57ec2882b18'}]}, 'timestamp': '2026-01-22 00:40:23.380996', '_unique_id': '9f08f8f78c044ea7b9a3d425f8c3d263'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.381 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.382 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.382 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.382 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e2e554e-a1b4-4d1c-83dc-2864d07c61a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-vda', 'timestamp': '2026-01-22T00:40:23.382673', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'efe35f86-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.134640927, 'message_signature': '4d97ef085ba393efb62c590fd47e9ab44cd0e07ddb1000e62dadf8299f49e3b9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-sda', 'timestamp': '2026-01-22T00:40:23.382673', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'efe368c8-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.134640927, 'message_signature': '1f16bd9bad580a17d4d207e7f66b1c3c8e6f841032848b2207244b96d8451dbe'}]}, 'timestamp': '2026-01-22 00:40:23.383156', '_unique_id': '9760b8a9f08b49daa2d8e50a70e530cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.383 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.384 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.384 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.384 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33a7b0f4-1e9e-4320-8d68-c567c7755944', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap05861bb8-a8', 'timestamp': '2026-01-22T00:40:23.384263', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap05861bb8-a8', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:4e:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap05861bb8-a8'}, 'message_id': 'efe39cb2-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': '57f489a6f9883a1df8fb46bdce0438a057e39f3ba59af9dda7737a7b94c32c18'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap68410b8d-35', 'timestamp': '2026-01-22T00:40:23.384263', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap68410b8d-35', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:61:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68410b8d-35'}, 'message_id': 'efe3a572-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': '03b8900d2c182f75e7127e9b1ba91647abedb1557819618848b706a0c9e7c057'}]}, 'timestamp': '2026-01-22 00:40:23.384715', '_unique_id': '15d122fa1e964d2daedc6459ce8ed1be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.385 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.386 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.386 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.386 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1528864554>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1528864554>]
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.386 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.386 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.386 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7354c79-c053-43a1-b3d5-21c24413ed49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-vda', 'timestamp': '2026-01-22T00:40:23.386387', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'efe3ef50-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': '8174cc0ca8d1ca339fdff6bde48c546752a804b1195b9a68980599604571a6e8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-sda', 'timestamp': '2026-01-22T00:40:23.386387', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'efe3f748-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': 'b02e1a3bca3e25a5cd7f0e4080d4494446420d2e240fe0e08f70ff698cebe757'}]}, 'timestamp': '2026-01-22 00:40:23.386844', '_unique_id': '73986bfa7e8c4379b747c347441d4d05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.387 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.388 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.388 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.388 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1528864554>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1528864554>]
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.388 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.403 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/memory.usage volume: 44.02734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa57ca6e-eb39-4456-bcfb-d65a5ff7f645', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 44.02734375, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'timestamp': '2026-01-22T00:40:23.388361', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'efe69bce-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.197765828, 'message_signature': '006a762c5789949b5bb9e080108a1289be4483b788fe3d98bc8362a58ccab6b8'}]}, 'timestamp': '2026-01-22 00:40:23.404208', '_unique_id': '906be5125cc4422f960393e1e1ac660a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.405 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.read.latency volume: 388909879 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.406 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.read.latency volume: 17272871 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '415c1817-f140-402a-8564-ea498e35f01c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 388909879, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-vda', 'timestamp': '2026-01-22T00:40:23.405813', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'efe6e728-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': '8d26b826dc1749a3cc8b27335f0b403007064d72b820e5f5eea0dbe3dfe4bad6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17272871, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-sda', 'timestamp': '2026-01-22T00:40:23.405813', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'efe6f132-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': '6ddb4cd331c1dd07ef3344ec0839bddd84e342e06cb77765bcd8bc38a1272e56'}]}, 'timestamp': '2026-01-22 00:40:23.406337', '_unique_id': '10d2427c0f034b4695719ce989c3a2fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.407 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.incoming.bytes volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.incoming.bytes volume: 740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f5f880e-71e3-4169-9ff8-317fc6e8f057', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1648, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap05861bb8-a8', 'timestamp': '2026-01-22T00:40:23.407838', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap05861bb8-a8', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:4e:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap05861bb8-a8'}, 'message_id': 'efe736ec-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': 'ad6243431d4b5a9bdb4fb0181dc0c860ad572e4115f8076e118344b71f6489e8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 740, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap68410b8d-35', 'timestamp': '2026-01-22T00:40:23.407838', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap68410b8d-35', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:61:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68410b8d-35'}, 'message_id': 'efe741e6-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': '0e54678c2e7be32a2f5ae6b48704e69e1b0acf4ab5089c283b0bce7668862e39'}]}, 'timestamp': '2026-01-22 00:40:23.408423', '_unique_id': 'e2a22e99fed04e9b85e9deee1147c271'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.408 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.409 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.409 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.409 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.outgoing.bytes volume: 1892 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eec19a16-b89c-4bf5-b4c4-dbf2e9dd31d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap05861bb8-a8', 'timestamp': '2026-01-22T00:40:23.409541', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap05861bb8-a8', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:4e:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap05861bb8-a8'}, 'message_id': 'efe778e6-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': '32823c35d351e76cc77c2ab130816d16c3f752db675e502284a82030706e2dd2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1892, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap68410b8d-35', 'timestamp': '2026-01-22T00:40:23.409541', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap68410b8d-35', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:61:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68410b8d-35'}, 'message_id': 'efe782be-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': '7751716bb8b6903fdd857fc11fa90cd44d0f7a907c419ac1eb85dc17282e52c1'}]}, 'timestamp': '2026-01-22 00:40:23.410041', '_unique_id': '0f81cfc8cb8a48d49267efac8145bfc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.411 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.412 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.413 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0568ae8-1630-478b-b97d-3ad5b30fd765', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap05861bb8-a8', 'timestamp': '2026-01-22T00:40:23.412871', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap05861bb8-a8', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:4e:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap05861bb8-a8'}, 'message_id': 'efe7fc6c-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': 'fbcaec8dfce9db4e97f8430c0fc7fa6bb0fd5429dd5fc712621c135e6c31ade6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap68410b8d-35', 'timestamp': '2026-01-22T00:40:23.412871', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap68410b8d-35', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:61:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68410b8d-35'}, 'message_id': 'efe80716-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': 'b55f0ee210a2b75b96da59b860f7219c6c9ff031de951515def0c6c9a9b6f585'}]}, 'timestamp': '2026-01-22 00:40:23.413441', '_unique_id': 'ee3bcea8e67c48ca97e684ac36eb262d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.414 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8734aed5-1537-452e-b82a-1eb9e764b0a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap05861bb8-a8', 'timestamp': '2026-01-22T00:40:23.414960', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap05861bb8-a8', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:4e:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap05861bb8-a8'}, 'message_id': 'efe84ba4-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': 'c014f239086f4bb96632b6cb13e9c9c0625ea87ccbdad040bee5db86783c3357'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap68410b8d-35', 'timestamp': '2026-01-22T00:40:23.414960', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap68410b8d-35', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:61:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68410b8d-35'}, 'message_id': 'efe8541e-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': 'c069bcfdd8b6d2342daf57be01e2bc005c23599a0f313897580e9b76b5f91681'}]}, 'timestamp': '2026-01-22 00:40:23.415403', '_unique_id': '388556bd40b4434faf9c9153df210fbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.415 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.416 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.416 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.read.bytes volume: 30530048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.416 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7a81971-8a4c-4b1d-9c82-0f79e82fa251', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30530048, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-vda', 'timestamp': '2026-01-22T00:40:23.416509', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'efe88812-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': '37e24e23ea931d5192f110a8aae7dd448008254331adb20738e336256b4e4f30'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-sda', 'timestamp': '2026-01-22T00:40:23.416509', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'efe890fa-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': '6034a868b7f2bc81946dfe4cc3e0bd04d1f16ec91af31bf3910037b77cb2995b'}]}, 'timestamp': '2026-01-22 00:40:23.416953', '_unique_id': '82d19f0a66df4208b02f7ab9030f75a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.417 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.418 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.418 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.write.latency volume: 3187521828 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.418 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a8cf0e0-6b45-464b-aa5a-f1c3c4039b5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3187521828, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-vda', 'timestamp': '2026-01-22T00:40:23.418132', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'efe8c764-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': '0f8a393cf3962601e08b484be36a471a2ff6bf851da38618cddaed8b98d05b0c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-sda', 'timestamp': '2026-01-22T00:40:23.418132', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'efe8cf8e-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': '7d4d75ad7c4146ad9f353b4fc56b0ff61784b564672c06da9b48c1a88af972d6'}]}, 'timestamp': '2026-01-22 00:40:23.418557', '_unique_id': 'ea7a0140ab8d4fb0a8ac31a2b4d9185a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.419 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91759028-c3bd-46b7-9bc6-0c5544143551', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap05861bb8-a8', 'timestamp': '2026-01-22T00:40:23.419740', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap05861bb8-a8', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:4e:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap05861bb8-a8'}, 'message_id': 'efe90724-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': '93f14ba7d2653d663b7c253846dae4b9e6d536ae7fc8f83d87aef7567944e3cc'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap68410b8d-35', 'timestamp': '2026-01-22T00:40:23.419740', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap68410b8d-35', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:61:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68410b8d-35'}, 'message_id': 'efe90f9e-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': '5bd77afed95a2b676223da5c48bf6f8f87ba7df49edb1f0414e57d666d9924a0'}]}, 'timestamp': '2026-01-22 00:40:23.420216', '_unique_id': 'af874ffb172d436faf47f99a27fe85df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.420 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.421 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.421 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/cpu volume: 12280000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3884f7c1-e0f5-4093-90e0-e91e45b11ebd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12280000000, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'timestamp': '2026-01-22T00:40:23.421306', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'efe943d8-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.197765828, 'message_signature': 'f1170c40a4d974491b4c44bbe21b5d6f398418cc0f44fff14d2b9358d3184f92'}]}, 'timestamp': '2026-01-22 00:40:23.421541', '_unique_id': '4c8d8b7901a240bca3fab7838f1f9de5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.422 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/network.outgoing.packets volume: 18 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e6db1c7-8ce8-4772-81cd-e8f91c0ad9a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap05861bb8-a8', 'timestamp': '2026-01-22T00:40:23.422656', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap05861bb8-a8', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:4e:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap05861bb8-a8'}, 'message_id': 'efe978a8-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': 'eafe5d77c150fdaa1d59ebd07bfcaf443e51dfcb0a38d2d4ef39e596cf930e31'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 18, 'user_id': 
'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'instance-000000b1-e5e2845b-3703-4c14-8ea6-9c2553e54198-tap68410b8d-35', 'timestamp': '2026-01-22T00:40:23.422656', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'tap68410b8d-35', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:cf:61:b4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap68410b8d-35'}, 'message_id': 'efe9826c-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.121192394, 'message_signature': 'e25a1b91f9bdc1e1600f041f3a5f2c2dd4274c70b957767cb71af6a219847422'}]}, 'timestamp': '2026-01-22 00:40:23.423140', '_unique_id': '6c3d0dbe2d2a4fb9bb016026791913fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.423 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.424 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.424 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.write.bytes volume: 72933376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.424 12 DEBUG ceilometer.compute.pollsters [-] e5e2845b-3703-4c14-8ea6-9c2553e54198/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0b34880-2145-48e8-825e-096c35a6abf8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72933376, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-vda', 'timestamp': '2026-01-22T00:40:23.424242', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'efe9b5f2-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': '1bb14330d46a2461c917804aaf9855aa04f66cf82f1bd01a0284523a617c31b9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_name': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_name': None, 
'resource_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198-sda', 'timestamp': '2026-01-22T00:40:23.424242', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1528864554', 'name': 'instance-000000b1', 'instance_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'instance_type': 'm1.nano', 'host': '152be67e99fc07bcd6d5cb0d44aee32d530c566679d65dce05ab76fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'c3389c03-89c4-4ff5-9e03-1a99d41713d4', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}, 'image_ref': '9cd98f02-a505-4543-a7ad-04e9a377b456', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'efe9be12-f72a-11f0-9743-fa163e6b0dfb', 'monotonic_time': 6926.152273158, 'message_signature': '246d8f331d34c876b68df712fcceece5dc240706f05ad2f0b55c7c90c7c6d25a'}]}, 'timestamp': '2026-01-22 00:40:23.424660', '_unique_id': 'b7e30d2df0d0497d9456fafc2098ad21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     yield
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 22 00:40:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:40:23.425 12 ERROR oslo_messaging.notify.messaging 
Jan 22 00:40:26 compute-0 nova_compute[182935]: 2026-01-22 00:40:26.015 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:27 compute-0 nova_compute[182935]: 2026-01-22 00:40:27.131 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:31 compute-0 nova_compute[182935]: 2026-01-22 00:40:31.075 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:31 compute-0 podman[245323]: 2026-01-22 00:40:31.68602927 +0000 UTC m=+0.057248351 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:40:32 compute-0 nova_compute[182935]: 2026-01-22 00:40:32.164 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:35 compute-0 podman[245343]: 2026-01-22 00:40:35.672398659 +0000 UTC m=+0.050570761 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Jan 22 00:40:35 compute-0 podman[245344]: 2026-01-22 00:40:35.685223986 +0000 UTC m=+0.058528311 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 22 00:40:36 compute-0 nova_compute[182935]: 2026-01-22 00:40:36.077 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:37 compute-0 nova_compute[182935]: 2026-01-22 00:40:37.166 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:41 compute-0 nova_compute[182935]: 2026-01-22 00:40:41.079 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:42 compute-0 nova_compute[182935]: 2026-01-22 00:40:42.208 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:46 compute-0 nova_compute[182935]: 2026-01-22 00:40:46.080 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:47 compute-0 nova_compute[182935]: 2026-01-22 00:40:47.212 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:51 compute-0 nova_compute[182935]: 2026-01-22 00:40:51.100 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:52 compute-0 nova_compute[182935]: 2026-01-22 00:40:52.214 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:53 compute-0 podman[245385]: 2026-01-22 00:40:53.706649376 +0000 UTC m=+0.059273551 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:40:53 compute-0 podman[245387]: 2026-01-22 00:40:53.71559578 +0000 UTC m=+0.061651037 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:40:53 compute-0 podman[245386]: 2026-01-22 00:40:53.740619199 +0000 UTC m=+0.090902467 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 22 00:40:56 compute-0 nova_compute[182935]: 2026-01-22 00:40:56.102 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:40:57 compute-0 nova_compute[182935]: 2026-01-22 00:40:57.218 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:01 compute-0 nova_compute[182935]: 2026-01-22 00:41:01.104 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:01 compute-0 nova_compute[182935]: 2026-01-22 00:41:01.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:01 compute-0 nova_compute[182935]: 2026-01-22 00:41:01.827 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:01 compute-0 nova_compute[182935]: 2026-01-22 00:41:01.827 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:01 compute-0 nova_compute[182935]: 2026-01-22 00:41:01.827 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:01 compute-0 nova_compute[182935]: 2026-01-22 00:41:01.828 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:41:01 compute-0 nova_compute[182935]: 2026-01-22 00:41:01.944 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.014 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.015 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.096 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.246 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.289 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.290 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5510MB free_disk=73.09407424926758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.291 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.291 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.579 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance e5e2845b-3703-4c14-8ea6-9c2553e54198 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.579 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.580 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:41:02 compute-0 podman[245465]: 2026-01-22 00:41:02.666817144 +0000 UTC m=+0.044096097 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.723 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.738 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.739 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:41:02 compute-0 nova_compute[182935]: 2026-01-22 00:41:02.739 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.448s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:03.240 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:03.240 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:03.241 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:03.909 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:41:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:03.910 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:41:03 compute-0 nova_compute[182935]: 2026-01-22 00:41:03.910 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:04 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:04.911 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:05 compute-0 sshd-session[245484]: Invalid user nginx from 188.166.69.60 port 36410
Jan 22 00:41:05 compute-0 sshd-session[245484]: Connection closed by invalid user nginx 188.166.69.60 port 36410 [preauth]
Jan 22 00:41:06 compute-0 nova_compute[182935]: 2026-01-22 00:41:06.107 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:06 compute-0 podman[245487]: 2026-01-22 00:41:06.686256486 +0000 UTC m=+0.052615471 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:41:06 compute-0 podman[245486]: 2026-01-22 00:41:06.692671109 +0000 UTC m=+0.065711454 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64)
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.248 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.847 182939 DEBUG oslo_concurrency.lockutils [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.847 182939 DEBUG oslo_concurrency.lockutils [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.848 182939 DEBUG oslo_concurrency.lockutils [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.848 182939 DEBUG oslo_concurrency.lockutils [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.848 182939 DEBUG oslo_concurrency.lockutils [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.861 182939 INFO nova.compute.manager [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Terminating instance
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.870 182939 DEBUG nova.compute.manager [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:41:07 compute-0 kernel: tap05861bb8-a8 (unregistering): left promiscuous mode
Jan 22 00:41:07 compute-0 NetworkManager[55139]: <info>  [1769042467.8974] device (tap05861bb8-a8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:41:07 compute-0 ovn_controller[95047]: 2026-01-22T00:41:07Z|00741|binding|INFO|Releasing lport 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 from this chassis (sb_readonly=0)
Jan 22 00:41:07 compute-0 ovn_controller[95047]: 2026-01-22T00:41:07Z|00742|binding|INFO|Setting lport 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 down in Southbound
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.906 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:07 compute-0 ovn_controller[95047]: 2026-01-22T00:41:07Z|00743|binding|INFO|Removing iface tap05861bb8-a8 ovn-installed in OVS
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.908 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:07 compute-0 kernel: tap68410b8d-35 (unregistering): left promiscuous mode
Jan 22 00:41:07 compute-0 NetworkManager[55139]: <info>  [1769042467.9238] device (tap68410b8d-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:41:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:07.924 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:4e:dd 10.100.0.5'], port_security=['fa:16:3e:bd:4e:dd 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d60c1e89-37d5-4a05-b566-04735ac9e501', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0b1cab7e-3d92-45e3-88fe-6266af987fc5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e88081bc-c33e-4f29-8ba8-cdfb76dc2a31, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=05861bb8-a81c-4f72-af9d-ec27c5f9f6e2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.925 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:07.925 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 in datapath d60c1e89-37d5-4a05-b566-04735ac9e501 unbound from our chassis
Jan 22 00:41:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:07.927 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d60c1e89-37d5-4a05-b566-04735ac9e501, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:41:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:07.928 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2a5a098d-a690-4457-883f-f45f01495276]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:07.929 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501 namespace which is not needed anymore
Jan 22 00:41:07 compute-0 ovn_controller[95047]: 2026-01-22T00:41:07Z|00744|binding|INFO|Releasing lport 68410b8d-352f-40ee-9abf-04ba4c6996ec from this chassis (sb_readonly=0)
Jan 22 00:41:07 compute-0 ovn_controller[95047]: 2026-01-22T00:41:07Z|00745|binding|INFO|Setting lport 68410b8d-352f-40ee-9abf-04ba4c6996ec down in Southbound
Jan 22 00:41:07 compute-0 ovn_controller[95047]: 2026-01-22T00:41:07Z|00746|binding|INFO|Removing iface tap68410b8d-35 ovn-installed in OVS
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.932 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.936 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:07 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:07.944 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:61:b4 2001:db8::f816:3eff:fecf:61b4'], port_security=['fa:16:3e:cf:61:b4 2001:db8::f816:3eff:fecf:61b4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fecf:61b4/64', 'neutron:device_id': 'e5e2845b-3703-4c14-8ea6-9c2553e54198', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0b1cab7e-3d92-45e3-88fe-6266af987fc5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29e7c9c6-d993-479c-815c-b28e8e044cd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=68410b8d-352f-40ee-9abf-04ba4c6996ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:41:07 compute-0 nova_compute[182935]: 2026-01-22 00:41:07.950 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:07 compute-0 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000b1.scope: Deactivated successfully.
Jan 22 00:41:07 compute-0 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000b1.scope: Consumed 15.818s CPU time.
Jan 22 00:41:07 compute-0 systemd-machined[154182]: Machine qemu-91-instance-000000b1 terminated.
Jan 22 00:41:08 compute-0 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[245081]: [NOTICE]   (245085) : haproxy version is 2.8.14-c23fe91
Jan 22 00:41:08 compute-0 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[245081]: [NOTICE]   (245085) : path to executable is /usr/sbin/haproxy
Jan 22 00:41:08 compute-0 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[245081]: [WARNING]  (245085) : Exiting Master process...
Jan 22 00:41:08 compute-0 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[245081]: [WARNING]  (245085) : Exiting Master process...
Jan 22 00:41:08 compute-0 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[245081]: [ALERT]    (245085) : Current worker (245087) exited with code 143 (Terminated)
Jan 22 00:41:08 compute-0 neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501[245081]: [WARNING]  (245085) : All workers exited. Exiting... (0)
Jan 22 00:41:08 compute-0 systemd[1]: libpod-f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598.scope: Deactivated successfully.
Jan 22 00:41:08 compute-0 podman[245555]: 2026-01-22 00:41:08.053776926 +0000 UTC m=+0.041312291 container died f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:41:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598-userdata-shm.mount: Deactivated successfully.
Jan 22 00:41:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-90228c67d09372b2673bed0d1bdd33088731cd79b06341f9fc24f8d8db011434-merged.mount: Deactivated successfully.
Jan 22 00:41:08 compute-0 podman[245555]: 2026-01-22 00:41:08.094545892 +0000 UTC m=+0.082081257 container cleanup f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 22 00:41:08 compute-0 systemd[1]: libpod-conmon-f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598.scope: Deactivated successfully.
Jan 22 00:41:08 compute-0 NetworkManager[55139]: <info>  [1769042468.1043] manager: (tap68410b8d-35): new Tun device (/org/freedesktop/NetworkManager/Devices/368)
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.153 182939 INFO nova.virt.libvirt.driver [-] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Instance destroyed successfully.
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.154 182939 DEBUG nova.objects.instance [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid e5e2845b-3703-4c14-8ea6-9c2553e54198 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:41:08 compute-0 podman[245595]: 2026-01-22 00:41:08.158315848 +0000 UTC m=+0.042622181 container remove f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.163 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd74a78-0d96-476f-8999-41d4ebc419d5]: (4, ('Thu Jan 22 12:41:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501 (f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598)\nf5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598\nThu Jan 22 12:41:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501 (f5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598)\nf5cce61da2e4b14ca763c632698f219ce67f2dc9b32277daa75fbf4f84a3a598\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.164 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2b1a4c1d-38a5-4b24-8674-44ee6ea63bf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.165 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd60c1e89-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.167 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:08 compute-0 kernel: tapd60c1e89-30: left promiscuous mode
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.170 182939 DEBUG nova.virt.libvirt.vif [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:39:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1528864554',display_name='tempest-TestGettingAddress-server-1528864554',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1528864554',id=177,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:39:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-31l429uo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:39:58Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=e5e2845b-3703-4c14-8ea6-9c2553e54198,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.171 182939 DEBUG nova.network.os_vif_util [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.172 182939 DEBUG nova.network.os_vif_util [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bd:4e:dd,bridge_name='br-int',has_traffic_filtering=True,id=05861bb8-a81c-4f72-af9d-ec27c5f9f6e2,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05861bb8-a8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.172 182939 DEBUG os_vif [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:4e:dd,bridge_name='br-int',has_traffic_filtering=True,id=05861bb8-a81c-4f72-af9d-ec27c5f9f6e2,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05861bb8-a8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.173 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.174 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05861bb8-a8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.175 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.177 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.230 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.233 182939 INFO os_vif [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:4e:dd,bridge_name='br-int',has_traffic_filtering=True,id=05861bb8-a81c-4f72-af9d-ec27c5f9f6e2,network=Network(d60c1e89-37d5-4a05-b566-04735ac9e501),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap05861bb8-a8')
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.233 182939 DEBUG nova.virt.libvirt.vif [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:39:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1528864554',display_name='tempest-TestGettingAddress-server-1528864554',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1528864554',id=177,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHfT5O6ZRpgmE55cZJ3QGKqkIsXzVQor6hmo4DYN0Y1I8FjHDDT3akgLWKRb+GjMp1RPaUHp87VJJNRMEPu8nfNaeXczwCaPpHi3Lj4qUZVeX1rL89fg6akvTfTjYIINQg==',key_name='tempest-TestGettingAddress-1576090324',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:39:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-31l429uo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:39:58Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=e5e2845b-3703-4c14-8ea6-9c2553e54198,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.234 182939 DEBUG nova.network.os_vif_util [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.234 182939 DEBUG nova.network.os_vif_util [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:61:b4,bridge_name='br-int',has_traffic_filtering=True,id=68410b8d-352f-40ee-9abf-04ba4c6996ec,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68410b8d-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.234 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9700d4-c0d2-4b0e-8345-ed24c5f35c33]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.235 182939 DEBUG os_vif [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:61:b4,bridge_name='br-int',has_traffic_filtering=True,id=68410b8d-352f-40ee-9abf-04ba4c6996ec,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68410b8d-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.236 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.236 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap68410b8d-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.241 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.242 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.242 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[602d5637-fe46-4993-86e9-066d172996d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.243 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b595b2ea-04bb-4d18-8546-639faf2bcaca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.244 182939 INFO os_vif [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:61:b4,bridge_name='br-int',has_traffic_filtering=True,id=68410b8d-352f-40ee-9abf-04ba4c6996ec,network=Network(65bd5007-25fc-43be-bec0-20ff1d1f0a79),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap68410b8d-35')
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.245 182939 INFO nova.virt.libvirt.driver [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Deleting instance files /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198_del
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.245 182939 INFO nova.virt.libvirt.driver [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Deletion of /var/lib/nova/instances/e5e2845b-3703-4c14-8ea6-9c2553e54198_del complete
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.259 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f0bfa8b6-0237-4a8e-96c4-0153dcb93206]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 690049, 'reachable_time': 42723, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245631, 'error': None, 'target': 'ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.262 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d60c1e89-37d5-4a05-b566-04735ac9e501 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.262 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[75d42bbb-f9c1-4eba-9e52-4fcee1c8615a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 systemd[1]: run-netns-ovnmeta\x2dd60c1e89\x2d37d5\x2d4a05\x2db566\x2d04735ac9e501.mount: Deactivated successfully.
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.263 104408 INFO neutron.agent.ovn.metadata.agent [-] Port 68410b8d-352f-40ee-9abf-04ba4c6996ec in datapath 65bd5007-25fc-43be-bec0-20ff1d1f0a79 unbound from our chassis
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.263 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 65bd5007-25fc-43be-bec0-20ff1d1f0a79, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.264 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0a03ea91-af23-4cff-85a8-d8ffb06fb77a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.264 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79 namespace which is not needed anymore
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.342 182939 INFO nova.compute.manager [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.342 182939 DEBUG oslo.service.loopingcall [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.343 182939 DEBUG nova.compute.manager [-] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.343 182939 DEBUG nova.network.neutron [-] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:41:08 compute-0 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[245153]: [NOTICE]   (245159) : haproxy version is 2.8.14-c23fe91
Jan 22 00:41:08 compute-0 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[245153]: [NOTICE]   (245159) : path to executable is /usr/sbin/haproxy
Jan 22 00:41:08 compute-0 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[245153]: [WARNING]  (245159) : Exiting Master process...
Jan 22 00:41:08 compute-0 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[245153]: [ALERT]    (245159) : Current worker (245161) exited with code 143 (Terminated)
Jan 22 00:41:08 compute-0 neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79[245153]: [WARNING]  (245159) : All workers exited. Exiting... (0)
Jan 22 00:41:08 compute-0 systemd[1]: libpod-10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303.scope: Deactivated successfully.
Jan 22 00:41:08 compute-0 podman[245649]: 2026-01-22 00:41:08.385520897 +0000 UTC m=+0.045826707 container died 10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.401 182939 DEBUG nova.compute.manager [req-61b3731d-bec1-4f09-b17b-7ed4e68712bb req-1ddc1943-8ea7-4b81-ae75-d272e77f318e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-changed-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.402 182939 DEBUG nova.compute.manager [req-61b3731d-bec1-4f09-b17b-7ed4e68712bb req-1ddc1943-8ea7-4b81-ae75-d272e77f318e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Refreshing instance network info cache due to event network-changed-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.402 182939 DEBUG oslo_concurrency.lockutils [req-61b3731d-bec1-4f09-b17b-7ed4e68712bb req-1ddc1943-8ea7-4b81-ae75-d272e77f318e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.402 182939 DEBUG oslo_concurrency.lockutils [req-61b3731d-bec1-4f09-b17b-7ed4e68712bb req-1ddc1943-8ea7-4b81-ae75-d272e77f318e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.402 182939 DEBUG nova.network.neutron [req-61b3731d-bec1-4f09-b17b-7ed4e68712bb req-1ddc1943-8ea7-4b81-ae75-d272e77f318e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Refreshing network info cache for port 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:41:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303-userdata-shm.mount: Deactivated successfully.
Jan 22 00:41:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-c0d2abe7fbffa0fad5e15ccc39320b731ed459fba080bc262009bc8f9e1489fc-merged.mount: Deactivated successfully.
Jan 22 00:41:08 compute-0 podman[245649]: 2026-01-22 00:41:08.414625704 +0000 UTC m=+0.074931514 container cleanup 10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.420 182939 DEBUG nova.compute.manager [req-6dc3e607-de67-4112-845e-9c70662f335f req-2d28e710-c3f8-4a7d-a468-c4f3ae2edf6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-unplugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.421 182939 DEBUG oslo_concurrency.lockutils [req-6dc3e607-de67-4112-845e-9c70662f335f req-2d28e710-c3f8-4a7d-a468-c4f3ae2edf6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.421 182939 DEBUG oslo_concurrency.lockutils [req-6dc3e607-de67-4112-845e-9c70662f335f req-2d28e710-c3f8-4a7d-a468-c4f3ae2edf6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.421 182939 DEBUG oslo_concurrency.lockutils [req-6dc3e607-de67-4112-845e-9c70662f335f req-2d28e710-c3f8-4a7d-a468-c4f3ae2edf6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.421 182939 DEBUG nova.compute.manager [req-6dc3e607-de67-4112-845e-9c70662f335f req-2d28e710-c3f8-4a7d-a468-c4f3ae2edf6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] No waiting events found dispatching network-vif-unplugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.422 182939 DEBUG nova.compute.manager [req-6dc3e607-de67-4112-845e-9c70662f335f req-2d28e710-c3f8-4a7d-a468-c4f3ae2edf6c 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-unplugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:41:08 compute-0 systemd[1]: libpod-conmon-10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303.scope: Deactivated successfully.
Jan 22 00:41:08 compute-0 podman[245680]: 2026-01-22 00:41:08.468760191 +0000 UTC m=+0.034682832 container remove 10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.473 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[737b0d8d-72b1-4e70-ae1e-ac7d6c03a363]: (4, ('Thu Jan 22 12:41:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79 (10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303)\n10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303\nThu Jan 22 12:41:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79 (10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303)\n10b54eadd10feffdab17e3d1b2ed17283195537411aabeaf3f473f7316ad3303\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.475 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[23ab63f2-a8d9-414f-8455-56dce5cbeef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.475 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65bd5007-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.477 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:08 compute-0 kernel: tap65bd5007-20: left promiscuous mode
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.480 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7b064d-264b-426f-b782-7151fee56fe9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.489 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.503 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[92b7199c-ff7b-48ab-8b0a-18581dd46d83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.504 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[23f640c4-c9c3-405e-97a1-864bdc68e15f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.519 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa8a01d-d1ce-4240-9dd7-cfad5b7b0d0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 690130, 'reachable_time': 32321, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245695, 'error': None, 'target': 'ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.520 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-65bd5007-25fc-43be-bec0-20ff1d1f0a79 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:41:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:08.520 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[38248a19-fdc3-43cc-a4f4-fb4bfc90f165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.739 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.740 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.819 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 22 00:41:08 compute-0 nova_compute[182935]: 2026-01-22 00:41:08.819 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:41:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d65bd5007\x2d25fc\x2d43be\x2dbec0\x2d20ff1d1f0a79.mount: Deactivated successfully.
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.716 182939 DEBUG nova.compute.manager [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-unplugged-68410b8d-352f-40ee-9abf-04ba4c6996ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.717 182939 DEBUG oslo_concurrency.lockutils [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.717 182939 DEBUG oslo_concurrency.lockutils [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.717 182939 DEBUG oslo_concurrency.lockutils [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.717 182939 DEBUG nova.compute.manager [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] No waiting events found dispatching network-vif-unplugged-68410b8d-352f-40ee-9abf-04ba4c6996ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.718 182939 DEBUG nova.compute.manager [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-unplugged-68410b8d-352f-40ee-9abf-04ba4c6996ec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.718 182939 DEBUG nova.compute.manager [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-plugged-68410b8d-352f-40ee-9abf-04ba4c6996ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.718 182939 DEBUG oslo_concurrency.lockutils [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.718 182939 DEBUG oslo_concurrency.lockutils [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.719 182939 DEBUG oslo_concurrency.lockutils [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.719 182939 DEBUG nova.compute.manager [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] No waiting events found dispatching network-vif-plugged-68410b8d-352f-40ee-9abf-04ba4c6996ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.719 182939 WARNING nova.compute.manager [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received unexpected event network-vif-plugged-68410b8d-352f-40ee-9abf-04ba4c6996ec for instance with vm_state active and task_state deleting.
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.719 182939 DEBUG nova.compute.manager [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-deleted-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.720 182939 INFO nova.compute.manager [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Neutron deleted interface 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2; detaching it from the instance and deleting it from the info cache
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.720 182939 DEBUG nova.network.neutron [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updating instance_info_cache with network_info: [{"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.747 182939 DEBUG nova.compute.manager [req-f9795e8a-a3d9-40ee-aa85-60168ade3262 req-9798afa8-e902-45e2-bd13-4e0e623314d6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Detach interface failed, port_id=05861bb8-a81c-4f72-af9d-ec27c5f9f6e2, reason: Instance e5e2845b-3703-4c14-8ea6-9c2553e54198 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:41:09 compute-0 nova_compute[182935]: 2026-01-22 00:41:09.918 182939 DEBUG nova.network.neutron [-] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.474 182939 DEBUG nova.network.neutron [req-61b3731d-bec1-4f09-b17b-7ed4e68712bb req-1ddc1943-8ea7-4b81-ae75-d272e77f318e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updated VIF entry in instance network info cache for port 05861bb8-a81c-4f72-af9d-ec27c5f9f6e2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.474 182939 DEBUG nova.network.neutron [req-61b3731d-bec1-4f09-b17b-7ed4e68712bb req-1ddc1943-8ea7-4b81-ae75-d272e77f318e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Updating instance_info_cache with network_info: [{"id": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "address": "fa:16:3e:bd:4e:dd", "network": {"id": "d60c1e89-37d5-4a05-b566-04735ac9e501", "bridge": "br-int", "label": "tempest-network-smoke--2091942253", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap05861bb8-a8", "ovs_interfaceid": "05861bb8-a81c-4f72-af9d-ec27c5f9f6e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "address": "fa:16:3e:cf:61:b4", "network": {"id": "65bd5007-25fc-43be-bec0-20ff1d1f0a79", "bridge": "br-int", "label": "tempest-network-smoke--1426694891", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecf:61b4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap68410b8d-35", "ovs_interfaceid": "68410b8d-352f-40ee-9abf-04ba4c6996ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.504 182939 INFO nova.compute.manager [-] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Took 2.16 seconds to deallocate network for instance.
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.508 182939 DEBUG oslo_concurrency.lockutils [req-61b3731d-bec1-4f09-b17b-7ed4e68712bb req-1ddc1943-8ea7-4b81-ae75-d272e77f318e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-e5e2845b-3703-4c14-8ea6-9c2553e54198" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.514 182939 DEBUG nova.compute.manager [req-e6d3b839-d46a-4272-a6b6-42cbd1ba8325 req-4d71dd0f-3789-43e3-a5f5-0370f2810e88 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-plugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.515 182939 DEBUG oslo_concurrency.lockutils [req-e6d3b839-d46a-4272-a6b6-42cbd1ba8325 req-4d71dd0f-3789-43e3-a5f5-0370f2810e88 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.515 182939 DEBUG oslo_concurrency.lockutils [req-e6d3b839-d46a-4272-a6b6-42cbd1ba8325 req-4d71dd0f-3789-43e3-a5f5-0370f2810e88 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.515 182939 DEBUG oslo_concurrency.lockutils [req-e6d3b839-d46a-4272-a6b6-42cbd1ba8325 req-4d71dd0f-3789-43e3-a5f5-0370f2810e88 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.515 182939 DEBUG nova.compute.manager [req-e6d3b839-d46a-4272-a6b6-42cbd1ba8325 req-4d71dd0f-3789-43e3-a5f5-0370f2810e88 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] No waiting events found dispatching network-vif-plugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.516 182939 WARNING nova.compute.manager [req-e6d3b839-d46a-4272-a6b6-42cbd1ba8325 req-4d71dd0f-3789-43e3-a5f5-0370f2810e88 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received unexpected event network-vif-plugged-05861bb8-a81c-4f72-af9d-ec27c5f9f6e2 for instance with vm_state active and task_state deleting.
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.568 182939 DEBUG oslo_concurrency.lockutils [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.569 182939 DEBUG oslo_concurrency.lockutils [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.595 182939 DEBUG nova.scheduler.client.report [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.612 182939 DEBUG nova.scheduler.client.report [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.612 182939 DEBUG nova.compute.provider_tree [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.625 182939 DEBUG nova.scheduler.client.report [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.665 182939 DEBUG nova.scheduler.client.report [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.706 182939 DEBUG nova.compute.provider_tree [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.722 182939 DEBUG nova.scheduler.client.report [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.745 182939 DEBUG oslo_concurrency.lockutils [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.774 182939 INFO nova.scheduler.client.report [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance e5e2845b-3703-4c14-8ea6-9c2553e54198
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:10 compute-0 nova_compute[182935]: 2026-01-22 00:41:10.848 182939 DEBUG oslo_concurrency.lockutils [None req-87a352ed-b7a8-4648-a3ca-7b91b6338a65 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "e5e2845b-3703-4c14-8ea6-9c2553e54198" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:41:11 compute-0 nova_compute[182935]: 2026-01-22 00:41:11.109 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:11 compute-0 nova_compute[182935]: 2026-01-22 00:41:11.834 182939 DEBUG nova.compute.manager [req-49cb989c-6ba1-45a1-871c-fe0890d6e6f0 req-21d80011-379e-4229-a949-db90e9376a43 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Received event network-vif-deleted-68410b8d-352f-40ee-9abf-04ba4c6996ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:41:11 compute-0 nova_compute[182935]: 2026-01-22 00:41:11.834 182939 INFO nova.compute.manager [req-49cb989c-6ba1-45a1-871c-fe0890d6e6f0 req-21d80011-379e-4229-a949-db90e9376a43 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Neutron deleted interface 68410b8d-352f-40ee-9abf-04ba4c6996ec; detaching it from the instance and deleting it from the info cache
Jan 22 00:41:11 compute-0 nova_compute[182935]: 2026-01-22 00:41:11.834 182939 DEBUG nova.network.neutron [req-49cb989c-6ba1-45a1-871c-fe0890d6e6f0 req-21d80011-379e-4229-a949-db90e9376a43 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 22 00:41:11 compute-0 nova_compute[182935]: 2026-01-22 00:41:11.837 182939 DEBUG nova.compute.manager [req-49cb989c-6ba1-45a1-871c-fe0890d6e6f0 req-21d80011-379e-4229-a949-db90e9376a43 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Detach interface failed, port_id=68410b8d-352f-40ee-9abf-04ba4c6996ec, reason: Instance e5e2845b-3703-4c14-8ea6-9c2553e54198 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 22 00:41:13 compute-0 nova_compute[182935]: 2026-01-22 00:41:13.240 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:16 compute-0 nova_compute[182935]: 2026-01-22 00:41:16.111 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:16 compute-0 nova_compute[182935]: 2026-01-22 00:41:16.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:17 compute-0 nova_compute[182935]: 2026-01-22 00:41:17.742 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:17 compute-0 nova_compute[182935]: 2026-01-22 00:41:17.827 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:18 compute-0 nova_compute[182935]: 2026-01-22 00:41:18.291 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:18 compute-0 nova_compute[182935]: 2026-01-22 00:41:18.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:18 compute-0 nova_compute[182935]: 2026-01-22 00:41:18.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:20 compute-0 nova_compute[182935]: 2026-01-22 00:41:20.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:21 compute-0 nova_compute[182935]: 2026-01-22 00:41:21.113 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:21 compute-0 nova_compute[182935]: 2026-01-22 00:41:21.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:41:23 compute-0 nova_compute[182935]: 2026-01-22 00:41:23.153 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042468.1518195, e5e2845b-3703-4c14-8ea6-9c2553e54198 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:41:23 compute-0 nova_compute[182935]: 2026-01-22 00:41:23.154 182939 INFO nova.compute.manager [-] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] VM Stopped (Lifecycle Event)
Jan 22 00:41:23 compute-0 nova_compute[182935]: 2026-01-22 00:41:23.174 182939 DEBUG nova.compute.manager [None req-9d761f9a-2a3b-4c35-9530-51013b0cd975 - - - - - -] [instance: e5e2845b-3703-4c14-8ea6-9c2553e54198] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:41:23 compute-0 nova_compute[182935]: 2026-01-22 00:41:23.346 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:24 compute-0 podman[245699]: 2026-01-22 00:41:24.682234115 +0000 UTC m=+0.051493474 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:41:24 compute-0 podman[245697]: 2026-01-22 00:41:24.68244133 +0000 UTC m=+0.058832889 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:41:24 compute-0 podman[245698]: 2026-01-22 00:41:24.765257423 +0000 UTC m=+0.138529828 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:41:26 compute-0 nova_compute[182935]: 2026-01-22 00:41:26.115 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:28 compute-0 nova_compute[182935]: 2026-01-22 00:41:28.388 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:31 compute-0 nova_compute[182935]: 2026-01-22 00:41:31.117 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:33 compute-0 nova_compute[182935]: 2026-01-22 00:41:33.391 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:33 compute-0 podman[245766]: 2026-01-22 00:41:33.675649221 +0000 UTC m=+0.050404238 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:41:35 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:35.851 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2 2001:db8::f816:3eff:fe5f:63f3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe5f:63f3/64', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b420faa5-5ae8-471e-9b88-5f792c3ff519, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a8aebb4-643e-4d79-9b9e-71408c2b29d3) old=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:41:35 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:35.852 104408 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2a8aebb4-643e-4d79-9b9e-71408c2b29d3 in datapath 0fbc923c-90ec-4c3d-92df-bc42843601b3 updated
Jan 22 00:41:35 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:35.853 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0fbc923c-90ec-4c3d-92df-bc42843601b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:41:35 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:35.854 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2fe91a-8f9e-4224-ab89-65d1a16c68f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:36 compute-0 nova_compute[182935]: 2026-01-22 00:41:36.120 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:37 compute-0 podman[245785]: 2026-01-22 00:41:37.678726691 +0000 UTC m=+0.057496959 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., version=9.6, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 00:41:37 compute-0 podman[245786]: 2026-01-22 00:41:37.710760718 +0000 UTC m=+0.085324285 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:41:38 compute-0 nova_compute[182935]: 2026-01-22 00:41:38.393 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:41 compute-0 nova_compute[182935]: 2026-01-22 00:41:41.151 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:41.552 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2 2001:db8:0:1:f816:3eff:fe5f:63f3 2001:db8::f816:3eff:fe5f:63f3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe5f:63f3/64 2001:db8::f816:3eff:fe5f:63f3/64', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b420faa5-5ae8-471e-9b88-5f792c3ff519, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a8aebb4-643e-4d79-9b9e-71408c2b29d3) old=Port_Binding(mac=['fa:16:3e:5f:63:f3 10.100.0.2 2001:db8::f816:3eff:fe5f:63f3'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe5f:63f3/64', 'neutron:device_id': 'ovnmeta-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0fbc923c-90ec-4c3d-92df-bc42843601b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:41:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:41.553 104408 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2a8aebb4-643e-4d79-9b9e-71408c2b29d3 in datapath 0fbc923c-90ec-4c3d-92df-bc42843601b3 updated
Jan 22 00:41:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:41.555 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0fbc923c-90ec-4c3d-92df-bc42843601b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:41:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:41:41.556 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[700e8cff-d190-44b6-8c97-036353d06694]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:41:43 compute-0 nova_compute[182935]: 2026-01-22 00:41:43.395 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:46 compute-0 nova_compute[182935]: 2026-01-22 00:41:46.191 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:48 compute-0 nova_compute[182935]: 2026-01-22 00:41:48.402 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:49 compute-0 sshd-session[245825]: Invalid user nginx from 188.166.69.60 port 43574
Jan 22 00:41:49 compute-0 sshd-session[245825]: Connection closed by invalid user nginx 188.166.69.60 port 43574 [preauth]
Jan 22 00:41:51 compute-0 nova_compute[182935]: 2026-01-22 00:41:51.193 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:53 compute-0 nova_compute[182935]: 2026-01-22 00:41:53.427 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:55 compute-0 podman[245827]: 2026-01-22 00:41:55.678668898 +0000 UTC m=+0.050815498 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:41:55 compute-0 podman[245829]: 2026-01-22 00:41:55.691639538 +0000 UTC m=+0.055587983 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:41:55 compute-0 podman[245828]: 2026-01-22 00:41:55.760562838 +0000 UTC m=+0.131052459 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 22 00:41:56 compute-0 nova_compute[182935]: 2026-01-22 00:41:56.195 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:58 compute-0 nova_compute[182935]: 2026-01-22 00:41:58.488 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:41:58 compute-0 ovn_controller[95047]: 2026-01-22T00:41:58Z|00747|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Jan 22 00:42:01 compute-0 nova_compute[182935]: 2026-01-22 00:42:01.230 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:42:03.241 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:42:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:42:03.242 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:42:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:42:03.242 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:42:03 compute-0 nova_compute[182935]: 2026-01-22 00:42:03.490 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:03 compute-0 nova_compute[182935]: 2026-01-22 00:42:03.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:03 compute-0 nova_compute[182935]: 2026-01-22 00:42:03.816 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:42:03 compute-0 nova_compute[182935]: 2026-01-22 00:42:03.817 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:42:03 compute-0 nova_compute[182935]: 2026-01-22 00:42:03.817 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:42:03 compute-0 nova_compute[182935]: 2026-01-22 00:42:03.818 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:42:03 compute-0 podman[245898]: 2026-01-22 00:42:03.918851821 +0000 UTC m=+0.052847937 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:42:03 compute-0 nova_compute[182935]: 2026-01-22 00:42:03.979 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:42:03 compute-0 nova_compute[182935]: 2026-01-22 00:42:03.980 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5712MB free_disk=73.12282943725586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:42:03 compute-0 nova_compute[182935]: 2026-01-22 00:42:03.980 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:42:03 compute-0 nova_compute[182935]: 2026-01-22 00:42:03.980 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:42:04 compute-0 nova_compute[182935]: 2026-01-22 00:42:04.039 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:42:04 compute-0 nova_compute[182935]: 2026-01-22 00:42:04.039 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:42:04 compute-0 nova_compute[182935]: 2026-01-22 00:42:04.058 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:42:04 compute-0 nova_compute[182935]: 2026-01-22 00:42:04.071 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:42:04 compute-0 nova_compute[182935]: 2026-01-22 00:42:04.088 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:42:04 compute-0 nova_compute[182935]: 2026-01-22 00:42:04.089 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:42:06 compute-0 nova_compute[182935]: 2026-01-22 00:42:06.232 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:08 compute-0 nova_compute[182935]: 2026-01-22 00:42:08.493 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:08 compute-0 podman[245919]: 2026-01-22 00:42:08.694898166 +0000 UTC m=+0.068007489 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:42:08 compute-0 podman[245918]: 2026-01-22 00:42:08.696194677 +0000 UTC m=+0.062584878 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, architecture=x86_64)
Jan 22 00:42:09 compute-0 nova_compute[182935]: 2026-01-22 00:42:09.090 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:09 compute-0 nova_compute[182935]: 2026-01-22 00:42:09.090 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:42:10 compute-0 nova_compute[182935]: 2026-01-22 00:42:10.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:10 compute-0 nova_compute[182935]: 2026-01-22 00:42:10.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:42:10 compute-0 nova_compute[182935]: 2026-01-22 00:42:10.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:42:10 compute-0 nova_compute[182935]: 2026-01-22 00:42:10.813 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:42:11 compute-0 nova_compute[182935]: 2026-01-22 00:42:11.294 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:11 compute-0 nova_compute[182935]: 2026-01-22 00:42:11.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:13 compute-0 nova_compute[182935]: 2026-01-22 00:42:13.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:16 compute-0 nova_compute[182935]: 2026-01-22 00:42:16.297 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:17 compute-0 nova_compute[182935]: 2026-01-22 00:42:17.797 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:18 compute-0 nova_compute[182935]: 2026-01-22 00:42:18.539 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:18 compute-0 nova_compute[182935]: 2026-01-22 00:42:18.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:42:19.912 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:42:19 compute-0 nova_compute[182935]: 2026-01-22 00:42:19.912 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:42:19.913 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:42:20 compute-0 nova_compute[182935]: 2026-01-22 00:42:20.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:21 compute-0 nova_compute[182935]: 2026-01-22 00:42:21.354 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:21 compute-0 nova_compute[182935]: 2026-01-22 00:42:21.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:22 compute-0 nova_compute[182935]: 2026-01-22 00:42:22.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:42:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:42:23 compute-0 nova_compute[182935]: 2026-01-22 00:42:23.541 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:23 compute-0 nova_compute[182935]: 2026-01-22 00:42:23.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:42:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:42:25.915 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:42:26 compute-0 nova_compute[182935]: 2026-01-22 00:42:26.403 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:26 compute-0 podman[245955]: 2026-01-22 00:42:26.676713898 +0000 UTC m=+0.049357323 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:42:26 compute-0 podman[245957]: 2026-01-22 00:42:26.699570375 +0000 UTC m=+0.060516570 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:42:26 compute-0 podman[245956]: 2026-01-22 00:42:26.729748358 +0000 UTC m=+0.094690639 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 22 00:42:28 compute-0 nova_compute[182935]: 2026-01-22 00:42:28.544 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:31 compute-0 nova_compute[182935]: 2026-01-22 00:42:31.405 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:33 compute-0 sshd-session[246023]: Invalid user nginx from 188.166.69.60 port 56550
Jan 22 00:42:33 compute-0 sshd-session[246023]: Connection closed by invalid user nginx 188.166.69.60 port 56550 [preauth]
Jan 22 00:42:33 compute-0 nova_compute[182935]: 2026-01-22 00:42:33.583 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:34 compute-0 podman[246025]: 2026-01-22 00:42:34.678698097 +0000 UTC m=+0.056370191 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:42:36 compute-0 nova_compute[182935]: 2026-01-22 00:42:36.405 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:38 compute-0 nova_compute[182935]: 2026-01-22 00:42:38.585 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:39 compute-0 podman[246045]: 2026-01-22 00:42:39.679626226 +0000 UTC m=+0.057367964 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter)
Jan 22 00:42:39 compute-0 podman[246046]: 2026-01-22 00:42:39.691001849 +0000 UTC m=+0.064922515 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:42:41 compute-0 nova_compute[182935]: 2026-01-22 00:42:41.407 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:43 compute-0 nova_compute[182935]: 2026-01-22 00:42:43.587 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:46 compute-0 nova_compute[182935]: 2026-01-22 00:42:46.560 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:48 compute-0 nova_compute[182935]: 2026-01-22 00:42:48.589 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:51 compute-0 nova_compute[182935]: 2026-01-22 00:42:51.562 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:53 compute-0 nova_compute[182935]: 2026-01-22 00:42:53.592 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:56 compute-0 nova_compute[182935]: 2026-01-22 00:42:56.565 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:57 compute-0 podman[246090]: 2026-01-22 00:42:57.684729608 +0000 UTC m=+0.054311002 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:42:57 compute-0 podman[246092]: 2026-01-22 00:42:57.754775025 +0000 UTC m=+0.108213973 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:42:57 compute-0 podman[246091]: 2026-01-22 00:42:57.764113088 +0000 UTC m=+0.124944062 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 22 00:42:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:42:57.946 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:42:57 compute-0 nova_compute[182935]: 2026-01-22 00:42:57.946 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:57 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:42:57.947 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:42:58 compute-0 nova_compute[182935]: 2026-01-22 00:42:58.595 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:42:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:42:59.948 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:43:01 compute-0 nova_compute[182935]: 2026-01-22 00:43:01.613 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:03.242 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:43:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:03.243 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:43:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:03.243 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:43:03 compute-0 nova_compute[182935]: 2026-01-22 00:43:03.597 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:05 compute-0 podman[246164]: 2026-01-22 00:43:05.698154271 +0000 UTC m=+0.069398875 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:43:05 compute-0 nova_compute[182935]: 2026-01-22 00:43:05.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:05 compute-0 nova_compute[182935]: 2026-01-22 00:43:05.817 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:43:05 compute-0 nova_compute[182935]: 2026-01-22 00:43:05.817 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:43:05 compute-0 nova_compute[182935]: 2026-01-22 00:43:05.817 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:43:05 compute-0 nova_compute[182935]: 2026-01-22 00:43:05.817 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:43:05 compute-0 nova_compute[182935]: 2026-01-22 00:43:05.945 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:43:05 compute-0 nova_compute[182935]: 2026-01-22 00:43:05.946 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5709MB free_disk=73.12282943725586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:43:05 compute-0 nova_compute[182935]: 2026-01-22 00:43:05.946 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:43:05 compute-0 nova_compute[182935]: 2026-01-22 00:43:05.946 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:43:06 compute-0 nova_compute[182935]: 2026-01-22 00:43:06.002 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:43:06 compute-0 nova_compute[182935]: 2026-01-22 00:43:06.002 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:43:06 compute-0 nova_compute[182935]: 2026-01-22 00:43:06.025 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:43:06 compute-0 nova_compute[182935]: 2026-01-22 00:43:06.039 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:43:06 compute-0 nova_compute[182935]: 2026-01-22 00:43:06.040 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:43:06 compute-0 nova_compute[182935]: 2026-01-22 00:43:06.040 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:43:06 compute-0 nova_compute[182935]: 2026-01-22 00:43:06.615 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:08 compute-0 nova_compute[182935]: 2026-01-22 00:43:08.599 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:10 compute-0 nova_compute[182935]: 2026-01-22 00:43:10.041 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:10 compute-0 nova_compute[182935]: 2026-01-22 00:43:10.042 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:43:10 compute-0 nova_compute[182935]: 2026-01-22 00:43:10.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:10 compute-0 nova_compute[182935]: 2026-01-22 00:43:10.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:43:10 compute-0 nova_compute[182935]: 2026-01-22 00:43:10.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:43:11 compute-0 nova_compute[182935]: 2026-01-22 00:43:11.147 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:43:11 compute-0 podman[246184]: 2026-01-22 00:43:11.235144013 +0000 UTC m=+0.052808154 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:43:11 compute-0 podman[246183]: 2026-01-22 00:43:11.235583925 +0000 UTC m=+0.056715650 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-type=git, config_id=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6)
Jan 22 00:43:11 compute-0 nova_compute[182935]: 2026-01-22 00:43:11.657 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:13 compute-0 nova_compute[182935]: 2026-01-22 00:43:13.602 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:13 compute-0 nova_compute[182935]: 2026-01-22 00:43:13.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:16 compute-0 nova_compute[182935]: 2026-01-22 00:43:16.659 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:18 compute-0 sshd-session[246222]: Connection closed by authenticating user operator 188.166.69.60 port 49082 [preauth]
Jan 22 00:43:18 compute-0 nova_compute[182935]: 2026-01-22 00:43:18.606 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:18 compute-0 nova_compute[182935]: 2026-01-22 00:43:18.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:19 compute-0 nova_compute[182935]: 2026-01-22 00:43:19.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:20 compute-0 nova_compute[182935]: 2026-01-22 00:43:20.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:21 compute-0 nova_compute[182935]: 2026-01-22 00:43:21.660 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:23 compute-0 nova_compute[182935]: 2026-01-22 00:43:23.609 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:23 compute-0 nova_compute[182935]: 2026-01-22 00:43:23.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:24 compute-0 nova_compute[182935]: 2026-01-22 00:43:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:26.451 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2 2001:db8::f816:3eff:fee0:9fdc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee0:9fdc/64', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad392942-0b6b-462d-a3a5-d979f385a143, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e429e99d-d544-4554-bbe2-f8538fbd55b8) old=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:43:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:26.453 104408 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e429e99d-d544-4554-bbe2-f8538fbd55b8 in datapath bc173f9b-a39e-490e-b1d4-92abd1855016 updated
Jan 22 00:43:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:26.455 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc173f9b-a39e-490e-b1d4-92abd1855016, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:43:26 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:26.457 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[13eeb1b6-abed-429c-9a26-012e726eb44c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:43:26 compute-0 nova_compute[182935]: 2026-01-22 00:43:26.662 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:26 compute-0 nova_compute[182935]: 2026-01-22 00:43:26.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:28 compute-0 nova_compute[182935]: 2026-01-22 00:43:28.612 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:28 compute-0 podman[246224]: 2026-01-22 00:43:28.674504608 +0000 UTC m=+0.048614426 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:43:28 compute-0 podman[246226]: 2026-01-22 00:43:28.681017994 +0000 UTC m=+0.046578697 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:43:28 compute-0 podman[246225]: 2026-01-22 00:43:28.707879237 +0000 UTC m=+0.075802036 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:43:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:30.057 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2 2001:db8:0:1:f816:3eff:fee0:9fdc 2001:db8::f816:3eff:fee0:9fdc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fee0:9fdc/64 2001:db8::f816:3eff:fee0:9fdc/64', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad392942-0b6b-462d-a3a5-d979f385a143, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e429e99d-d544-4554-bbe2-f8538fbd55b8) old=Port_Binding(mac=['fa:16:3e:e0:9f:dc 10.100.0.2 2001:db8::f816:3eff:fee0:9fdc'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee0:9fdc/64', 'neutron:device_id': 'ovnmeta-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc173f9b-a39e-490e-b1d4-92abd1855016', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:43:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:30.058 104408 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e429e99d-d544-4554-bbe2-f8538fbd55b8 in datapath bc173f9b-a39e-490e-b1d4-92abd1855016 updated
Jan 22 00:43:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:30.059 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc173f9b-a39e-490e-b1d4-92abd1855016, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:43:30 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:30.060 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1b8274f3-c8d2-4f13-a358-8be0e6c8f06d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:43:31 compute-0 nova_compute[182935]: 2026-01-22 00:43:31.664 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:33 compute-0 nova_compute[182935]: 2026-01-22 00:43:33.650 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:34 compute-0 nova_compute[182935]: 2026-01-22 00:43:34.816 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:34 compute-0 nova_compute[182935]: 2026-01-22 00:43:34.816 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:43:36 compute-0 nova_compute[182935]: 2026-01-22 00:43:36.666 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:36 compute-0 podman[246296]: 2026-01-22 00:43:36.671579439 +0000 UTC m=+0.050346186 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:43:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:38.090 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:43:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:38.091 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:43:38 compute-0 nova_compute[182935]: 2026-01-22 00:43:38.090 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:38 compute-0 nova_compute[182935]: 2026-01-22 00:43:38.693 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:41 compute-0 nova_compute[182935]: 2026-01-22 00:43:41.667 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:41 compute-0 podman[246315]: 2026-01-22 00:43:41.692404744 +0000 UTC m=+0.057036435 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350)
Jan 22 00:43:41 compute-0 podman[246316]: 2026-01-22 00:43:41.699920785 +0000 UTC m=+0.060927510 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 00:43:43 compute-0 nova_compute[182935]: 2026-01-22 00:43:43.696 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:45 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:43:45.093 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:43:45 compute-0 nova_compute[182935]: 2026-01-22 00:43:45.811 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:43:45 compute-0 nova_compute[182935]: 2026-01-22 00:43:45.811 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:43:45 compute-0 nova_compute[182935]: 2026-01-22 00:43:45.826 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:43:46 compute-0 nova_compute[182935]: 2026-01-22 00:43:46.669 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:48 compute-0 nova_compute[182935]: 2026-01-22 00:43:48.699 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:51 compute-0 nova_compute[182935]: 2026-01-22 00:43:51.671 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:53 compute-0 nova_compute[182935]: 2026-01-22 00:43:53.745 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:56 compute-0 nova_compute[182935]: 2026-01-22 00:43:56.673 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:58 compute-0 nova_compute[182935]: 2026-01-22 00:43:58.748 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:43:59 compute-0 podman[246355]: 2026-01-22 00:43:59.672528966 +0000 UTC m=+0.045937551 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:43:59 compute-0 podman[246357]: 2026-01-22 00:43:59.686684755 +0000 UTC m=+0.051477554 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:43:59 compute-0 podman[246356]: 2026-01-22 00:43:59.711565531 +0000 UTC m=+0.080379235 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:44:01 compute-0 nova_compute[182935]: 2026-01-22 00:44:01.675 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:01 compute-0 sshd-session[246429]: Connection closed by authenticating user operator 188.166.69.60 port 58660 [preauth]
Jan 22 00:44:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:44:03.244 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:44:03.245 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:44:03.245 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:03 compute-0 nova_compute[182935]: 2026-01-22 00:44:03.751 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:06 compute-0 nova_compute[182935]: 2026-01-22 00:44:06.677 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:07 compute-0 podman[246431]: 2026-01-22 00:44:07.670833519 +0000 UTC m=+0.048993475 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:44:07 compute-0 nova_compute[182935]: 2026-01-22 00:44:07.810 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:07 compute-0 nova_compute[182935]: 2026-01-22 00:44:07.833 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:07 compute-0 nova_compute[182935]: 2026-01-22 00:44:07.834 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:07 compute-0 nova_compute[182935]: 2026-01-22 00:44:07.834 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:07 compute-0 nova_compute[182935]: 2026-01-22 00:44:07.834 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:44:07 compute-0 nova_compute[182935]: 2026-01-22 00:44:07.976 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:44:07 compute-0 nova_compute[182935]: 2026-01-22 00:44:07.977 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5719MB free_disk=73.12282943725586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:44:07 compute-0 nova_compute[182935]: 2026-01-22 00:44:07.977 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:44:07 compute-0 nova_compute[182935]: 2026-01-22 00:44:07.977 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:44:08 compute-0 nova_compute[182935]: 2026-01-22 00:44:08.037 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:44:08 compute-0 nova_compute[182935]: 2026-01-22 00:44:08.038 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:44:08 compute-0 nova_compute[182935]: 2026-01-22 00:44:08.062 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:44:08 compute-0 nova_compute[182935]: 2026-01-22 00:44:08.075 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:44:08 compute-0 nova_compute[182935]: 2026-01-22 00:44:08.076 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:44:08 compute-0 nova_compute[182935]: 2026-01-22 00:44:08.077 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:44:08 compute-0 nova_compute[182935]: 2026-01-22 00:44:08.755 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:11 compute-0 nova_compute[182935]: 2026-01-22 00:44:11.060 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:11 compute-0 nova_compute[182935]: 2026-01-22 00:44:11.060 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:44:11 compute-0 nova_compute[182935]: 2026-01-22 00:44:11.060 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:44:11 compute-0 nova_compute[182935]: 2026-01-22 00:44:11.075 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:44:11 compute-0 nova_compute[182935]: 2026-01-22 00:44:11.076 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:11 compute-0 nova_compute[182935]: 2026-01-22 00:44:11.076 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:44:11 compute-0 nova_compute[182935]: 2026-01-22 00:44:11.679 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:12 compute-0 podman[246452]: 2026-01-22 00:44:12.731560038 +0000 UTC m=+0.055601172 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, vendor=Red Hat, Inc., config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 00:44:12 compute-0 podman[246453]: 2026-01-22 00:44:12.747026378 +0000 UTC m=+0.061159435 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 00:44:13 compute-0 nova_compute[182935]: 2026-01-22 00:44:13.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:13 compute-0 nova_compute[182935]: 2026-01-22 00:44:13.795 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:16 compute-0 nova_compute[182935]: 2026-01-22 00:44:16.681 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:18 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:44:18.085 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:44:18 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:44:18.085 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:44:18 compute-0 nova_compute[182935]: 2026-01-22 00:44:18.086 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:18 compute-0 nova_compute[182935]: 2026-01-22 00:44:18.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:18 compute-0 nova_compute[182935]: 2026-01-22 00:44:18.796 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:19 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:44:19.087 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:44:21 compute-0 nova_compute[182935]: 2026-01-22 00:44:21.683 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:21 compute-0 nova_compute[182935]: 2026-01-22 00:44:21.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:21 compute-0 nova_compute[182935]: 2026-01-22 00:44:21.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:22 compute-0 nova_compute[182935]: 2026-01-22 00:44:22.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:44:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:44:23 compute-0 nova_compute[182935]: 2026-01-22 00:44:23.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:23 compute-0 nova_compute[182935]: 2026-01-22 00:44:23.799 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:25 compute-0 nova_compute[182935]: 2026-01-22 00:44:25.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:44:26 compute-0 nova_compute[182935]: 2026-01-22 00:44:26.710 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:28 compute-0 nova_compute[182935]: 2026-01-22 00:44:28.842 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:30 compute-0 podman[246493]: 2026-01-22 00:44:30.685603306 +0000 UTC m=+0.056861143 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:44:30 compute-0 podman[246500]: 2026-01-22 00:44:30.693309771 +0000 UTC m=+0.052841626 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:44:30 compute-0 podman[246494]: 2026-01-22 00:44:30.737717584 +0000 UTC m=+0.103623012 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 00:44:31 compute-0 nova_compute[182935]: 2026-01-22 00:44:31.712 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:33 compute-0 nova_compute[182935]: 2026-01-22 00:44:33.846 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:36 compute-0 nova_compute[182935]: 2026-01-22 00:44:36.714 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:38 compute-0 podman[246566]: 2026-01-22 00:44:38.714947361 +0000 UTC m=+0.068680295 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:44:38 compute-0 nova_compute[182935]: 2026-01-22 00:44:38.850 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:41 compute-0 nova_compute[182935]: 2026-01-22 00:44:41.717 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:43 compute-0 podman[246585]: 2026-01-22 00:44:43.678051245 +0000 UTC m=+0.056633166 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 22 00:44:43 compute-0 podman[246586]: 2026-01-22 00:44:43.709626632 +0000 UTC m=+0.083350187 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 22 00:44:43 compute-0 nova_compute[182935]: 2026-01-22 00:44:43.900 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:45 compute-0 sshd-session[246626]: Connection closed by authenticating user operator 188.166.69.60 port 36542 [preauth]
Jan 22 00:44:46 compute-0 nova_compute[182935]: 2026-01-22 00:44:46.718 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:48 compute-0 nova_compute[182935]: 2026-01-22 00:44:48.903 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:51 compute-0 nova_compute[182935]: 2026-01-22 00:44:51.719 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:53 compute-0 nova_compute[182935]: 2026-01-22 00:44:53.906 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:56 compute-0 nova_compute[182935]: 2026-01-22 00:44:56.767 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:44:58 compute-0 nova_compute[182935]: 2026-01-22 00:44:58.909 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:01 compute-0 podman[246630]: 2026-01-22 00:45:01.680100964 +0000 UTC m=+0.048628136 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:45:01 compute-0 podman[246628]: 2026-01-22 00:45:01.680198496 +0000 UTC m=+0.053806170 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:45:01 compute-0 podman[246629]: 2026-01-22 00:45:01.705763668 +0000 UTC m=+0.076660247 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:45:01 compute-0 nova_compute[182935]: 2026-01-22 00:45:01.769 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:45:03.245 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:45:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:45:03.245 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:45:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:45:03.246 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:45:03 compute-0 nova_compute[182935]: 2026-01-22 00:45:03.912 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:06 compute-0 nova_compute[182935]: 2026-01-22 00:45:06.770 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:09 compute-0 nova_compute[182935]: 2026-01-22 00:45:09.127 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:09 compute-0 podman[246700]: 2026-01-22 00:45:09.711832535 +0000 UTC m=+0.073215544 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 22 00:45:09 compute-0 nova_compute[182935]: 2026-01-22 00:45:09.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:09 compute-0 nova_compute[182935]: 2026-01-22 00:45:09.848 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:45:09 compute-0 nova_compute[182935]: 2026-01-22 00:45:09.849 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:45:09 compute-0 nova_compute[182935]: 2026-01-22 00:45:09.849 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:45:09 compute-0 nova_compute[182935]: 2026-01-22 00:45:09.849 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:45:09 compute-0 nova_compute[182935]: 2026-01-22 00:45:09.979 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:45:09 compute-0 nova_compute[182935]: 2026-01-22 00:45:09.980 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5718MB free_disk=73.12282943725586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:45:09 compute-0 nova_compute[182935]: 2026-01-22 00:45:09.980 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:45:09 compute-0 nova_compute[182935]: 2026-01-22 00:45:09.980 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:45:10 compute-0 nova_compute[182935]: 2026-01-22 00:45:10.094 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:45:10 compute-0 nova_compute[182935]: 2026-01-22 00:45:10.094 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:45:10 compute-0 nova_compute[182935]: 2026-01-22 00:45:10.127 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:45:10 compute-0 nova_compute[182935]: 2026-01-22 00:45:10.149 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:45:10 compute-0 nova_compute[182935]: 2026-01-22 00:45:10.150 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:45:10 compute-0 nova_compute[182935]: 2026-01-22 00:45:10.150 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:45:11 compute-0 nova_compute[182935]: 2026-01-22 00:45:11.150 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:11 compute-0 nova_compute[182935]: 2026-01-22 00:45:11.151 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:45:11 compute-0 nova_compute[182935]: 2026-01-22 00:45:11.151 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:45:11 compute-0 nova_compute[182935]: 2026-01-22 00:45:11.172 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:45:11 compute-0 nova_compute[182935]: 2026-01-22 00:45:11.772 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:12 compute-0 nova_compute[182935]: 2026-01-22 00:45:12.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:12 compute-0 nova_compute[182935]: 2026-01-22 00:45:12.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:45:13 compute-0 nova_compute[182935]: 2026-01-22 00:45:13.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:14 compute-0 nova_compute[182935]: 2026-01-22 00:45:14.188 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:14 compute-0 podman[246719]: 2026-01-22 00:45:14.680193465 +0000 UTC m=+0.053439121 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Jan 22 00:45:14 compute-0 podman[246720]: 2026-01-22 00:45:14.694785774 +0000 UTC m=+0.061117665 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:45:16 compute-0 nova_compute[182935]: 2026-01-22 00:45:16.774 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:19 compute-0 nova_compute[182935]: 2026-01-22 00:45:19.199 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:20 compute-0 nova_compute[182935]: 2026-01-22 00:45:20.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:45:21.442 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:45:21 compute-0 nova_compute[182935]: 2026-01-22 00:45:21.442 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:45:21.444 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:45:21 compute-0 nova_compute[182935]: 2026-01-22 00:45:21.776 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:21 compute-0 nova_compute[182935]: 2026-01-22 00:45:21.787 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:22 compute-0 nova_compute[182935]: 2026-01-22 00:45:22.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:23 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:45:23.446 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:45:24 compute-0 nova_compute[182935]: 2026-01-22 00:45:24.269 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:24 compute-0 nova_compute[182935]: 2026-01-22 00:45:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:26 compute-0 nova_compute[182935]: 2026-01-22 00:45:26.815 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:27 compute-0 nova_compute[182935]: 2026-01-22 00:45:27.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:45:28 compute-0 sshd-session[246761]: Connection closed by authenticating user operator 188.166.69.60 port 45672 [preauth]
Jan 22 00:45:29 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:45:29.042 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:5f:6e 10.100.0.2 2001:db8::f816:3eff:fee8:5f6e'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee8:5f6e/64', 'neutron:device_id': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496d15df-9baa-43c6-8bd0-ae8566291be1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4a596305-d10e-4e9e-a8ea-d94a630e8baa) old=Port_Binding(mac=['fa:16:3e:e8:5f:6e 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:45:29 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:45:29.043 104408 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4a596305-d10e-4e9e-a8ea-d94a630e8baa in datapath 83666af9-15ce-4344-a623-7180c9b2515a updated
Jan 22 00:45:29 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:45:29.044 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 83666af9-15ce-4344-a623-7180c9b2515a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:45:29 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:45:29.046 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[948c2711-f4e9-4d46-bfab-a957818f55a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:45:29 compute-0 nova_compute[182935]: 2026-01-22 00:45:29.321 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:31 compute-0 nova_compute[182935]: 2026-01-22 00:45:31.816 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:32 compute-0 podman[246763]: 2026-01-22 00:45:32.682985068 +0000 UTC m=+0.052480598 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:45:32 compute-0 podman[246765]: 2026-01-22 00:45:32.689474054 +0000 UTC m=+0.050608374 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:45:32 compute-0 podman[246764]: 2026-01-22 00:45:32.763998408 +0000 UTC m=+0.129949903 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:45:34 compute-0 nova_compute[182935]: 2026-01-22 00:45:34.372 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:36 compute-0 nova_compute[182935]: 2026-01-22 00:45:36.858 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:39 compute-0 nova_compute[182935]: 2026-01-22 00:45:39.376 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:40 compute-0 podman[246833]: 2026-01-22 00:45:40.679770974 +0000 UTC m=+0.049076806 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:45:41 compute-0 nova_compute[182935]: 2026-01-22 00:45:41.861 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:43 compute-0 sshd-session[246853]: Received disconnect from 91.224.92.108 port 44356:11:  [preauth]
Jan 22 00:45:43 compute-0 sshd-session[246853]: Disconnected from authenticating user root 91.224.92.108 port 44356 [preauth]
Jan 22 00:45:44 compute-0 nova_compute[182935]: 2026-01-22 00:45:44.423 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:45 compute-0 podman[246856]: 2026-01-22 00:45:45.68343881 +0000 UTC m=+0.054314043 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 00:45:45 compute-0 podman[246855]: 2026-01-22 00:45:45.693624143 +0000 UTC m=+0.067212471 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Jan 22 00:45:46 compute-0 nova_compute[182935]: 2026-01-22 00:45:46.915 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:49 compute-0 nova_compute[182935]: 2026-01-22 00:45:49.426 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:51 compute-0 nova_compute[182935]: 2026-01-22 00:45:51.916 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:54 compute-0 nova_compute[182935]: 2026-01-22 00:45:54.463 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:56 compute-0 nova_compute[182935]: 2026-01-22 00:45:56.971 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:45:59 compute-0 nova_compute[182935]: 2026-01-22 00:45:59.468 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:01 compute-0 nova_compute[182935]: 2026-01-22 00:46:01.972 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:03.245 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:03.246 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:03.246 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:03 compute-0 podman[246895]: 2026-01-22 00:46:03.684899101 +0000 UTC m=+0.054368762 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:46:03 compute-0 podman[246897]: 2026-01-22 00:46:03.68566708 +0000 UTC m=+0.048962364 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:46:03 compute-0 podman[246896]: 2026-01-22 00:46:03.71364015 +0000 UTC m=+0.081721138 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:46:04 compute-0 nova_compute[182935]: 2026-01-22 00:46:04.508 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:06 compute-0 nova_compute[182935]: 2026-01-22 00:46:06.998 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:09 compute-0 nova_compute[182935]: 2026-01-22 00:46:09.511 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:10 compute-0 nova_compute[182935]: 2026-01-22 00:46:10.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:10 compute-0 nova_compute[182935]: 2026-01-22 00:46:10.823 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:10 compute-0 nova_compute[182935]: 2026-01-22 00:46:10.823 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:10 compute-0 nova_compute[182935]: 2026-01-22 00:46:10.824 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:10 compute-0 nova_compute[182935]: 2026-01-22 00:46:10.824 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:46:10 compute-0 sshd-session[246969]: Connection closed by authenticating user operator 188.166.69.60 port 40048 [preauth]
Jan 22 00:46:10 compute-0 nova_compute[182935]: 2026-01-22 00:46:10.974 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:46:10 compute-0 nova_compute[182935]: 2026-01-22 00:46:10.976 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5717MB free_disk=73.12261199951172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:46:10 compute-0 nova_compute[182935]: 2026-01-22 00:46:10.976 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:10 compute-0 nova_compute[182935]: 2026-01-22 00:46:10.977 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:11 compute-0 nova_compute[182935]: 2026-01-22 00:46:11.136 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:46:11 compute-0 nova_compute[182935]: 2026-01-22 00:46:11.136 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:46:11 compute-0 nova_compute[182935]: 2026-01-22 00:46:11.199 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:46:11 compute-0 nova_compute[182935]: 2026-01-22 00:46:11.285 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:46:11 compute-0 nova_compute[182935]: 2026-01-22 00:46:11.286 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:46:11 compute-0 nova_compute[182935]: 2026-01-22 00:46:11.300 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:46:11 compute-0 nova_compute[182935]: 2026-01-22 00:46:11.326 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:46:11 compute-0 nova_compute[182935]: 2026-01-22 00:46:11.348 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:46:11 compute-0 nova_compute[182935]: 2026-01-22 00:46:11.363 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:46:11 compute-0 nova_compute[182935]: 2026-01-22 00:46:11.364 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:46:11 compute-0 nova_compute[182935]: 2026-01-22 00:46:11.365 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.388s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:11 compute-0 podman[246971]: 2026-01-22 00:46:11.69061396 +0000 UTC m=+0.062180269 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 22 00:46:12 compute-0 nova_compute[182935]: 2026-01-22 00:46:12.000 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:13 compute-0 nova_compute[182935]: 2026-01-22 00:46:13.366 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:13 compute-0 nova_compute[182935]: 2026-01-22 00:46:13.367 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:46:13 compute-0 nova_compute[182935]: 2026-01-22 00:46:13.367 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:46:13 compute-0 nova_compute[182935]: 2026-01-22 00:46:13.385 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:46:13 compute-0 nova_compute[182935]: 2026-01-22 00:46:13.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:13 compute-0 nova_compute[182935]: 2026-01-22 00:46:13.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:13 compute-0 nova_compute[182935]: 2026-01-22 00:46:13.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.281 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.282 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.303 182939 DEBUG nova.compute.manager [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.401 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.401 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.407 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.407 182939 INFO nova.compute.claims [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.513 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.519 182939 DEBUG nova.compute.provider_tree [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.538 182939 DEBUG nova.scheduler.client.report [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.571 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.572 182939 DEBUG nova.compute.manager [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.637 182939 DEBUG nova.compute.manager [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.637 182939 DEBUG nova.network.neutron [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.655 182939 INFO nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.697 182939 DEBUG nova.compute.manager [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.812 182939 DEBUG nova.compute.manager [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.813 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.814 182939 INFO nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Creating image(s)
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.815 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "/var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.815 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.817 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "/var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.837 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.897 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.899 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.900 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.916 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.973 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:46:14 compute-0 nova_compute[182935]: 2026-01-22 00:46:14.974 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.005 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.006 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.007 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.063 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.064 182939 DEBUG nova.virt.disk.api [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Checking if we can resize image /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.064 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.119 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.120 182939 DEBUG nova.virt.disk.api [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Cannot resize image /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.121 182939 DEBUG nova.objects.instance [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.137 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.137 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Ensure instance console log exists: /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.138 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.138 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.138 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.144 182939 DEBUG nova.policy [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fd196423d94b309668ffd08655f7ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '837db8748d074b3c9179b47d30e7a1d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:46:15 compute-0 nova_compute[182935]: 2026-01-22 00:46:15.936 182939 DEBUG nova.network.neutron [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Successfully created port: e75bded3-9a57-4c88-9141-6d725875c555 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:46:16 compute-0 podman[247005]: 2026-01-22 00:46:16.689839089 +0000 UTC m=+0.060125850 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal)
Jan 22 00:46:16 compute-0 podman[247006]: 2026-01-22 00:46:16.691742654 +0000 UTC m=+0.059176707 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:46:17 compute-0 nova_compute[182935]: 2026-01-22 00:46:17.002 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:19 compute-0 nova_compute[182935]: 2026-01-22 00:46:19.515 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:20 compute-0 nova_compute[182935]: 2026-01-22 00:46:20.378 182939 DEBUG nova.network.neutron [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Successfully updated port: e75bded3-9a57-4c88-9141-6d725875c555 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:46:20 compute-0 nova_compute[182935]: 2026-01-22 00:46:20.396 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:46:20 compute-0 nova_compute[182935]: 2026-01-22 00:46:20.396 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquired lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:46:20 compute-0 nova_compute[182935]: 2026-01-22 00:46:20.396 182939 DEBUG nova.network.neutron [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:46:20 compute-0 nova_compute[182935]: 2026-01-22 00:46:20.522 182939 DEBUG nova.compute.manager [req-f1cc5b85-dbe4-4788-9046-1114ef237a03 req-4b3a7bf4-3224-4f35-a199-015a596747e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Received event network-changed-e75bded3-9a57-4c88-9141-6d725875c555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:46:20 compute-0 nova_compute[182935]: 2026-01-22 00:46:20.522 182939 DEBUG nova.compute.manager [req-f1cc5b85-dbe4-4788-9046-1114ef237a03 req-4b3a7bf4-3224-4f35-a199-015a596747e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Refreshing instance network info cache due to event network-changed-e75bded3-9a57-4c88-9141-6d725875c555. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:46:20 compute-0 nova_compute[182935]: 2026-01-22 00:46:20.523 182939 DEBUG oslo_concurrency.lockutils [req-f1cc5b85-dbe4-4788-9046-1114ef237a03 req-4b3a7bf4-3224-4f35-a199-015a596747e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:46:21 compute-0 nova_compute[182935]: 2026-01-22 00:46:21.370 182939 DEBUG nova.network.neutron [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:46:21 compute-0 nova_compute[182935]: 2026-01-22 00:46:21.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:22 compute-0 nova_compute[182935]: 2026-01-22 00:46:22.004 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:22 compute-0 nova_compute[182935]: 2026-01-22 00:46:22.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.324 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:46:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.811 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.820 182939 DEBUG nova.network.neutron [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Updating instance_info_cache with network_info: [{"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.842 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Releasing lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.843 182939 DEBUG nova.compute.manager [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Instance network_info: |[{"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.843 182939 DEBUG oslo_concurrency.lockutils [req-f1cc5b85-dbe4-4788-9046-1114ef237a03 req-4b3a7bf4-3224-4f35-a199-015a596747e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.843 182939 DEBUG nova.network.neutron [req-f1cc5b85-dbe4-4788-9046-1114ef237a03 req-4b3a7bf4-3224-4f35-a199-015a596747e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Refreshing network info cache for port e75bded3-9a57-4c88-9141-6d725875c555 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.846 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Start _get_guest_xml network_info=[{"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.850 182939 WARNING nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.857 182939 DEBUG nova.virt.libvirt.host [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.858 182939 DEBUG nova.virt.libvirt.host [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.861 182939 DEBUG nova.virt.libvirt.host [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.862 182939 DEBUG nova.virt.libvirt.host [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.863 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.863 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.864 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.864 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.864 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.864 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.864 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.865 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.865 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.865 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.865 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.866 182939 DEBUG nova.virt.hardware [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.870 182939 DEBUG nova.virt.libvirt.vif [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1092166276',display_name='tempest-TestGettingAddress-server-1092166276',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1092166276',id=184,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL3/PLQY4lAQU2yFGaoAmqWPJI5565ofTauEAmPcwEncHglgrmt+9X41pDrGx2Hzo63wjxi644i8QnD2R87vFxz3Kmnkg4MUbe27S7AT4N98N34iBfOk+UwjPX/szWkvLg==',key_name='tempest-TestGettingAddress-2046948813',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-q7vqm9ug',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:46:14Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.870 182939 DEBUG nova.network.os_vif_util [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.871 182939 DEBUG nova.network.os_vif_util [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:8f:28,bridge_name='br-int',has_traffic_filtering=True,id=e75bded3-9a57-4c88-9141-6d725875c555,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape75bded3-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.872 182939 DEBUG nova.objects.instance [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.889 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:46:23 compute-0 nova_compute[182935]:   <uuid>3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5</uuid>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   <name>instance-000000b8</name>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <nova:name>tempest-TestGettingAddress-server-1092166276</nova:name>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:46:23</nova:creationTime>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:46:23 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:46:23 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:46:23 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:46:23 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:46:23 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:46:23 compute-0 nova_compute[182935]:         <nova:user uuid="a8fd196423d94b309668ffd08655f7ed">tempest-TestGettingAddress-471729430-project-member</nova:user>
Jan 22 00:46:23 compute-0 nova_compute[182935]:         <nova:project uuid="837db8748d074b3c9179b47d30e7a1d4">tempest-TestGettingAddress-471729430</nova:project>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:46:23 compute-0 nova_compute[182935]:         <nova:port uuid="e75bded3-9a57-4c88-9141-6d725875c555">
Jan 22 00:46:23 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe95:8f28" ipVersion="6"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <system>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <entry name="serial">3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5</entry>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <entry name="uuid">3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5</entry>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     </system>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   <os>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   </os>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   <features>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   </features>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk.config"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:95:8f:28"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <target dev="tape75bded3-9a"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/console.log" append="off"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <video>
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     </video>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:46:23 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:46:23 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:46:23 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:46:23 compute-0 nova_compute[182935]: </domain>
Jan 22 00:46:23 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.890 182939 DEBUG nova.compute.manager [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Preparing to wait for external event network-vif-plugged-e75bded3-9a57-4c88-9141-6d725875c555 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.890 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.890 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.891 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.891 182939 DEBUG nova.virt.libvirt.vif [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1092166276',display_name='tempest-TestGettingAddress-server-1092166276',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1092166276',id=184,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL3/PLQY4lAQU2yFGaoAmqWPJI5565ofTauEAmPcwEncHglgrmt+9X41pDrGx2Hzo63wjxi644i8QnD2R87vFxz3Kmnkg4MUbe27S7AT4N98N34iBfOk+UwjPX/szWkvLg==',key_name='tempest-TestGettingAddress-2046948813',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-q7vqm9ug',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:46:14Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.892 182939 DEBUG nova.network.os_vif_util [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.892 182939 DEBUG nova.network.os_vif_util [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:8f:28,bridge_name='br-int',has_traffic_filtering=True,id=e75bded3-9a57-4c88-9141-6d725875c555,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape75bded3-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.893 182939 DEBUG os_vif [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:8f:28,bridge_name='br-int',has_traffic_filtering=True,id=e75bded3-9a57-4c88-9141-6d725875c555,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape75bded3-9a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.893 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.893 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.894 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.901 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.901 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape75bded3-9a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.902 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape75bded3-9a, col_values=(('external_ids', {'iface-id': 'e75bded3-9a57-4c88-9141-6d725875c555', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:8f:28', 'vm-uuid': '3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:46:23 compute-0 NetworkManager[55139]: <info>  [1769042783.9043] manager: (tape75bded3-9a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/369)
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.906 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.910 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.911 182939 INFO os_vif [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:8f:28,bridge_name='br-int',has_traffic_filtering=True,id=e75bded3-9a57-4c88-9141-6d725875c555,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape75bded3-9a')
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.967 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.967 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.967 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] No VIF found with MAC fa:16:3e:95:8f:28, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:46:23 compute-0 nova_compute[182935]: 2026-01-22 00:46:23.968 182939 INFO nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Using config drive
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.302 182939 INFO nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Creating config drive at /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk.config
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.307 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3g7o3zhp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.435 182939 DEBUG oslo_concurrency.processutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3g7o3zhp" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:46:24 compute-0 kernel: tape75bded3-9a: entered promiscuous mode
Jan 22 00:46:24 compute-0 ovn_controller[95047]: 2026-01-22T00:46:24Z|00748|binding|INFO|Claiming lport e75bded3-9a57-4c88-9141-6d725875c555 for this chassis.
Jan 22 00:46:24 compute-0 ovn_controller[95047]: 2026-01-22T00:46:24Z|00749|binding|INFO|e75bded3-9a57-4c88-9141-6d725875c555: Claiming fa:16:3e:95:8f:28 10.100.0.13 2001:db8::f816:3eff:fe95:8f28
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.503 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:24 compute-0 NetworkManager[55139]: <info>  [1769042784.5050] manager: (tape75bded3-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/370)
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.507 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.514 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:24 compute-0 NetworkManager[55139]: <info>  [1769042784.5152] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/371)
Jan 22 00:46:24 compute-0 NetworkManager[55139]: <info>  [1769042784.5158] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.520 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:8f:28 10.100.0.13 2001:db8::f816:3eff:fe95:8f28'], port_security=['fa:16:3e:95:8f:28 10.100.0.13 2001:db8::f816:3eff:fe95:8f28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe95:8f28/64', 'neutron:device_id': '3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bd69fbc7-ff38-42ce-b5d5-6559f7285ccb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496d15df-9baa-43c6-8bd0-ae8566291be1, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=e75bded3-9a57-4c88-9141-6d725875c555) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.521 104408 INFO neutron.agent.ovn.metadata.agent [-] Port e75bded3-9a57-4c88-9141-6d725875c555 in datapath 83666af9-15ce-4344-a623-7180c9b2515a bound to our chassis
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.523 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 83666af9-15ce-4344-a623-7180c9b2515a
Jan 22 00:46:24 compute-0 systemd-udevd[247062]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.541 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e6eff61e-a736-49ba-85e1-4de5942d8115]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.542 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap83666af9-11 in ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:46:24 compute-0 NetworkManager[55139]: <info>  [1769042784.5436] device (tape75bded3-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:46:24 compute-0 NetworkManager[55139]: <info>  [1769042784.5445] device (tape75bded3-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.544 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap83666af9-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.544 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[651b8963-eab1-4c0e-b262-cf7e499ef759]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.546 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cd00ada5-d75b-465a-83d3-52514a0172e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 systemd-machined[154182]: New machine qemu-92-instance-000000b8.
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.557 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[1c68b274-7b01-4705-9f15-ec42e69d42c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.581 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:24 compute-0 systemd[1]: Started Virtual Machine qemu-92-instance-000000b8.
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.589 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a0c03deb-cb8b-4ece-be9a-27ecd7155cee]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.596 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:24 compute-0 ovn_controller[95047]: 2026-01-22T00:46:24Z|00750|binding|INFO|Setting lport e75bded3-9a57-4c88-9141-6d725875c555 ovn-installed in OVS
Jan 22 00:46:24 compute-0 ovn_controller[95047]: 2026-01-22T00:46:24Z|00751|binding|INFO|Setting lport e75bded3-9a57-4c88-9141-6d725875c555 up in Southbound
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.606 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.623 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[73527644-24a7-425f-958d-36da53a92302]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.628 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dafd08c7-dc06-42a3-bcdf-a41043616032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 NetworkManager[55139]: <info>  [1769042784.6303] manager: (tap83666af9-10): new Veth device (/org/freedesktop/NetworkManager/Devices/373)
Jan 22 00:46:24 compute-0 systemd-udevd[247066]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.662 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[7e53311e-89b2-4137-a3c9-ad1e92e0f2e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.666 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[9c37a90c-4490-4697-b15e-9d6127a087b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 NetworkManager[55139]: <info>  [1769042784.6890] device (tap83666af9-10): carrier: link connected
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.696 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[f5689b53-8833-49a0-86cf-6621a6972ece]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.718 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9f375c5b-6f33-4579-91c0-e38444e7cc46]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap83666af9-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:5f:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 728742, 'reachable_time': 27504, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247097, 'error': None, 'target': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.738 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f7dacc9a-0779-45f5-b289-6d9639e9baab]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:5f6e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 728742, 'tstamp': 728742}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247098, 'error': None, 'target': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.755 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[32edf030-82a6-4f1b-9014-a20c3c3adce3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap83666af9-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:5f:6e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 728742, 'reachable_time': 27504, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247099, 'error': None, 'target': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.794 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a4069260-3808-4895-975e-6a44a45720c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.871 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cfdd9eb4-47e6-4e85-9fda-87f6c9919351]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.873 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83666af9-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.874 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.874 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap83666af9-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.876 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:24 compute-0 kernel: tap83666af9-10: entered promiscuous mode
Jan 22 00:46:24 compute-0 NetworkManager[55139]: <info>  [1769042784.8788] manager: (tap83666af9-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/374)
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.879 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap83666af9-10, col_values=(('external_ids', {'iface-id': '4a596305-d10e-4e9e-a8ea-d94a630e8baa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:46:24 compute-0 ovn_controller[95047]: 2026-01-22T00:46:24Z|00752|binding|INFO|Releasing lport 4a596305-d10e-4e9e-a8ea-d94a630e8baa from this chassis (sb_readonly=0)
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.880 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.881 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.881 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/83666af9-15ce-4344-a623-7180c9b2515a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/83666af9-15ce-4344-a623-7180c9b2515a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.882 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[1dcff47e-0aef-4ca7-9ac4-3f984af38ec1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.883 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-83666af9-15ce-4344-a623-7180c9b2515a
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/83666af9-15ce-4344-a623-7180c9b2515a.pid.haproxy
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 83666af9-15ce-4344-a623-7180c9b2515a
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:46:24 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:24.884 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'env', 'PROCESS_TAG=haproxy-83666af9-15ce-4344-a623-7180c9b2515a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/83666af9-15ce-4344-a623-7180c9b2515a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.891 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.893 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042784.8926451, 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.893 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] VM Started (Lifecycle Event)
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.902 182939 DEBUG nova.compute.manager [req-c9105428-c2d2-4df2-954f-fa77f22247a2 req-16e49f0a-4528-4ca3-9f82-b53437be94e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Received event network-vif-plugged-e75bded3-9a57-4c88-9141-6d725875c555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.902 182939 DEBUG oslo_concurrency.lockutils [req-c9105428-c2d2-4df2-954f-fa77f22247a2 req-16e49f0a-4528-4ca3-9f82-b53437be94e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.903 182939 DEBUG oslo_concurrency.lockutils [req-c9105428-c2d2-4df2-954f-fa77f22247a2 req-16e49f0a-4528-4ca3-9f82-b53437be94e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.903 182939 DEBUG oslo_concurrency.lockutils [req-c9105428-c2d2-4df2-954f-fa77f22247a2 req-16e49f0a-4528-4ca3-9f82-b53437be94e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.903 182939 DEBUG nova.compute.manager [req-c9105428-c2d2-4df2-954f-fa77f22247a2 req-16e49f0a-4528-4ca3-9f82-b53437be94e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Processing event network-vif-plugged-e75bded3-9a57-4c88-9141-6d725875c555 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.904 182939 DEBUG nova.compute.manager [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.907 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.912 182939 INFO nova.virt.libvirt.driver [-] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Instance spawned successfully.
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.913 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.916 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.920 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.941 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.941 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042784.8950598, 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.942 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] VM Paused (Lifecycle Event)
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.946 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.946 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.947 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.947 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.948 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.948 182939 DEBUG nova.virt.libvirt.driver [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.972 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.975 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042784.9066675, 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.975 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] VM Resumed (Lifecycle Event)
Jan 22 00:46:24 compute-0 nova_compute[182935]: 2026-01-22 00:46:24.997 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:46:25 compute-0 nova_compute[182935]: 2026-01-22 00:46:25.001 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:46:25 compute-0 nova_compute[182935]: 2026-01-22 00:46:25.020 182939 INFO nova.compute.manager [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Took 10.21 seconds to spawn the instance on the hypervisor.
Jan 22 00:46:25 compute-0 nova_compute[182935]: 2026-01-22 00:46:25.020 182939 DEBUG nova.compute.manager [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:46:25 compute-0 nova_compute[182935]: 2026-01-22 00:46:25.023 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:46:25 compute-0 nova_compute[182935]: 2026-01-22 00:46:25.102 182939 INFO nova.compute.manager [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Took 10.75 seconds to build instance.
Jan 22 00:46:25 compute-0 nova_compute[182935]: 2026-01-22 00:46:25.120 182939 DEBUG oslo_concurrency.lockutils [None req-acd9ab34-2d2b-4910-8e01-0fd9bba3a758 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.838s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:25 compute-0 podman[247138]: 2026-01-22 00:46:25.243045604 +0000 UTC m=+0.051607537 container create e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:46:25 compute-0 systemd[1]: Started libpod-conmon-e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb.scope.
Jan 22 00:46:25 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:46:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6869a57ac628691638ade053aaa7ba01a3f5915cab067f157bfcf9a11da84e9f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:46:25 compute-0 podman[247138]: 2026-01-22 00:46:25.309008253 +0000 UTC m=+0.117570206 container init e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:46:25 compute-0 podman[247138]: 2026-01-22 00:46:25.217615765 +0000 UTC m=+0.026177718 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:46:25 compute-0 podman[247138]: 2026-01-22 00:46:25.313923981 +0000 UTC m=+0.122485914 container start e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 22 00:46:25 compute-0 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[247151]: [NOTICE]   (247155) : New worker (247157) forked
Jan 22 00:46:25 compute-0 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[247151]: [NOTICE]   (247155) : Loading success.
Jan 22 00:46:25 compute-0 nova_compute[182935]: 2026-01-22 00:46:25.645 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:25.646 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:46:25 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:25.647 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:46:25 compute-0 nova_compute[182935]: 2026-01-22 00:46:25.722 182939 DEBUG nova.network.neutron [req-f1cc5b85-dbe4-4788-9046-1114ef237a03 req-4b3a7bf4-3224-4f35-a199-015a596747e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Updated VIF entry in instance network info cache for port e75bded3-9a57-4c88-9141-6d725875c555. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:46:25 compute-0 nova_compute[182935]: 2026-01-22 00:46:25.723 182939 DEBUG nova.network.neutron [req-f1cc5b85-dbe4-4788-9046-1114ef237a03 req-4b3a7bf4-3224-4f35-a199-015a596747e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Updating instance_info_cache with network_info: [{"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:46:25 compute-0 nova_compute[182935]: 2026-01-22 00:46:25.761 182939 DEBUG oslo_concurrency.lockutils [req-f1cc5b85-dbe4-4788-9046-1114ef237a03 req-4b3a7bf4-3224-4f35-a199-015a596747e6 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:46:27 compute-0 nova_compute[182935]: 2026-01-22 00:46:27.011 182939 DEBUG nova.compute.manager [req-45edc5ec-17d7-4bf1-8457-2c8cc188ef3c req-61f039ac-ef3b-4786-b0e9-ca9f41903475 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Received event network-vif-plugged-e75bded3-9a57-4c88-9141-6d725875c555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:46:27 compute-0 nova_compute[182935]: 2026-01-22 00:46:27.013 182939 DEBUG oslo_concurrency.lockutils [req-45edc5ec-17d7-4bf1-8457-2c8cc188ef3c req-61f039ac-ef3b-4786-b0e9-ca9f41903475 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:27 compute-0 nova_compute[182935]: 2026-01-22 00:46:27.013 182939 DEBUG oslo_concurrency.lockutils [req-45edc5ec-17d7-4bf1-8457-2c8cc188ef3c req-61f039ac-ef3b-4786-b0e9-ca9f41903475 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:27 compute-0 nova_compute[182935]: 2026-01-22 00:46:27.013 182939 DEBUG oslo_concurrency.lockutils [req-45edc5ec-17d7-4bf1-8457-2c8cc188ef3c req-61f039ac-ef3b-4786-b0e9-ca9f41903475 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:27 compute-0 nova_compute[182935]: 2026-01-22 00:46:27.013 182939 DEBUG nova.compute.manager [req-45edc5ec-17d7-4bf1-8457-2c8cc188ef3c req-61f039ac-ef3b-4786-b0e9-ca9f41903475 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] No waiting events found dispatching network-vif-plugged-e75bded3-9a57-4c88-9141-6d725875c555 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:46:27 compute-0 nova_compute[182935]: 2026-01-22 00:46:27.014 182939 WARNING nova.compute.manager [req-45edc5ec-17d7-4bf1-8457-2c8cc188ef3c req-61f039ac-ef3b-4786-b0e9-ca9f41903475 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Received unexpected event network-vif-plugged-e75bded3-9a57-4c88-9141-6d725875c555 for instance with vm_state active and task_state None.
Jan 22 00:46:27 compute-0 nova_compute[182935]: 2026-01-22 00:46:27.047 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:28 compute-0 nova_compute[182935]: 2026-01-22 00:46:28.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:46:28 compute-0 nova_compute[182935]: 2026-01-22 00:46:28.906 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:29 compute-0 nova_compute[182935]: 2026-01-22 00:46:29.144 182939 DEBUG nova.compute.manager [req-9b76de13-5c49-4ab5-b004-bbe7970ae5dc req-91d58dfb-0ca0-4091-a646-a8eb60cba8aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Received event network-changed-e75bded3-9a57-4c88-9141-6d725875c555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:46:29 compute-0 nova_compute[182935]: 2026-01-22 00:46:29.144 182939 DEBUG nova.compute.manager [req-9b76de13-5c49-4ab5-b004-bbe7970ae5dc req-91d58dfb-0ca0-4091-a646-a8eb60cba8aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Refreshing instance network info cache due to event network-changed-e75bded3-9a57-4c88-9141-6d725875c555. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:46:29 compute-0 nova_compute[182935]: 2026-01-22 00:46:29.145 182939 DEBUG oslo_concurrency.lockutils [req-9b76de13-5c49-4ab5-b004-bbe7970ae5dc req-91d58dfb-0ca0-4091-a646-a8eb60cba8aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:46:29 compute-0 nova_compute[182935]: 2026-01-22 00:46:29.145 182939 DEBUG oslo_concurrency.lockutils [req-9b76de13-5c49-4ab5-b004-bbe7970ae5dc req-91d58dfb-0ca0-4091-a646-a8eb60cba8aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:46:29 compute-0 nova_compute[182935]: 2026-01-22 00:46:29.145 182939 DEBUG nova.network.neutron [req-9b76de13-5c49-4ab5-b004-bbe7970ae5dc req-91d58dfb-0ca0-4091-a646-a8eb60cba8aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Refreshing network info cache for port e75bded3-9a57-4c88-9141-6d725875c555 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:46:30 compute-0 nova_compute[182935]: 2026-01-22 00:46:30.608 182939 DEBUG nova.network.neutron [req-9b76de13-5c49-4ab5-b004-bbe7970ae5dc req-91d58dfb-0ca0-4091-a646-a8eb60cba8aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Updated VIF entry in instance network info cache for port e75bded3-9a57-4c88-9141-6d725875c555. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:46:30 compute-0 nova_compute[182935]: 2026-01-22 00:46:30.609 182939 DEBUG nova.network.neutron [req-9b76de13-5c49-4ab5-b004-bbe7970ae5dc req-91d58dfb-0ca0-4091-a646-a8eb60cba8aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Updating instance_info_cache with network_info: [{"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:46:30 compute-0 nova_compute[182935]: 2026-01-22 00:46:30.634 182939 DEBUG oslo_concurrency.lockutils [req-9b76de13-5c49-4ab5-b004-bbe7970ae5dc req-91d58dfb-0ca0-4091-a646-a8eb60cba8aa 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:46:32 compute-0 nova_compute[182935]: 2026-01-22 00:46:32.049 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:33 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:33.649 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:46:33 compute-0 nova_compute[182935]: 2026-01-22 00:46:33.909 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:34 compute-0 podman[247167]: 2026-01-22 00:46:34.718593802 +0000 UTC m=+0.063303926 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:46:34 compute-0 podman[247169]: 2026-01-22 00:46:34.724423892 +0000 UTC m=+0.057674662 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:46:34 compute-0 podman[247168]: 2026-01-22 00:46:34.740102898 +0000 UTC m=+0.082739193 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:46:37 compute-0 nova_compute[182935]: 2026-01-22 00:46:37.053 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:37 compute-0 ovn_controller[95047]: 2026-01-22T00:46:37Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:8f:28 10.100.0.13
Jan 22 00:46:37 compute-0 ovn_controller[95047]: 2026-01-22T00:46:37Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:8f:28 10.100.0.13
Jan 22 00:46:38 compute-0 nova_compute[182935]: 2026-01-22 00:46:38.913 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:42 compute-0 nova_compute[182935]: 2026-01-22 00:46:42.056 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:42 compute-0 podman[247250]: 2026-01-22 00:46:42.673500495 +0000 UTC m=+0.046115456 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:46:43 compute-0 nova_compute[182935]: 2026-01-22 00:46:43.916 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.854 182939 DEBUG nova.compute.manager [req-44b648d8-0eee-4f75-97e4-1e5caf202caa req-a883b88d-faca-4a76-8225-d3f49216a197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Received event network-changed-e75bded3-9a57-4c88-9141-6d725875c555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.854 182939 DEBUG nova.compute.manager [req-44b648d8-0eee-4f75-97e4-1e5caf202caa req-a883b88d-faca-4a76-8225-d3f49216a197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Refreshing instance network info cache due to event network-changed-e75bded3-9a57-4c88-9141-6d725875c555. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.854 182939 DEBUG oslo_concurrency.lockutils [req-44b648d8-0eee-4f75-97e4-1e5caf202caa req-a883b88d-faca-4a76-8225-d3f49216a197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.854 182939 DEBUG oslo_concurrency.lockutils [req-44b648d8-0eee-4f75-97e4-1e5caf202caa req-a883b88d-faca-4a76-8225-d3f49216a197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.854 182939 DEBUG nova.network.neutron [req-44b648d8-0eee-4f75-97e4-1e5caf202caa req-a883b88d-faca-4a76-8225-d3f49216a197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Refreshing network info cache for port e75bded3-9a57-4c88-9141-6d725875c555 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.936 182939 DEBUG oslo_concurrency.lockutils [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.936 182939 DEBUG oslo_concurrency.lockutils [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.937 182939 DEBUG oslo_concurrency.lockutils [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.937 182939 DEBUG oslo_concurrency.lockutils [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.937 182939 DEBUG oslo_concurrency.lockutils [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.950 182939 INFO nova.compute.manager [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Terminating instance
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.961 182939 DEBUG nova.compute.manager [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:46:45 compute-0 kernel: tape75bded3-9a (unregistering): left promiscuous mode
Jan 22 00:46:45 compute-0 NetworkManager[55139]: <info>  [1769042805.9833] device (tape75bded3-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:46:45 compute-0 ovn_controller[95047]: 2026-01-22T00:46:45Z|00753|binding|INFO|Releasing lport e75bded3-9a57-4c88-9141-6d725875c555 from this chassis (sb_readonly=0)
Jan 22 00:46:45 compute-0 ovn_controller[95047]: 2026-01-22T00:46:45Z|00754|binding|INFO|Setting lport e75bded3-9a57-4c88-9141-6d725875c555 down in Southbound
Jan 22 00:46:45 compute-0 nova_compute[182935]: 2026-01-22 00:46:45.992 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:45 compute-0 ovn_controller[95047]: 2026-01-22T00:46:45Z|00755|binding|INFO|Removing iface tape75bded3-9a ovn-installed in OVS
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.005 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:8f:28 10.100.0.13 2001:db8::f816:3eff:fe95:8f28'], port_security=['fa:16:3e:95:8f:28 10.100.0.13 2001:db8::f816:3eff:fe95:8f28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe95:8f28/64', 'neutron:device_id': '3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-83666af9-15ce-4344-a623-7180c9b2515a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '837db8748d074b3c9179b47d30e7a1d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'bd69fbc7-ff38-42ce-b5d5-6559f7285ccb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=496d15df-9baa-43c6-8bd0-ae8566291be1, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=e75bded3-9a57-4c88-9141-6d725875c555) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.007 104408 INFO neutron.agent.ovn.metadata.agent [-] Port e75bded3-9a57-4c88-9141-6d725875c555 in datapath 83666af9-15ce-4344-a623-7180c9b2515a unbound from our chassis
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.008 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.008 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 83666af9-15ce-4344-a623-7180c9b2515a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.009 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[22ec2731-0da7-4e21-9805-3570b9c27475]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.009 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a namespace which is not needed anymore
Jan 22 00:46:46 compute-0 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000b8.scope: Deactivated successfully.
Jan 22 00:46:46 compute-0 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000b8.scope: Consumed 13.722s CPU time.
Jan 22 00:46:46 compute-0 systemd-machined[154182]: Machine qemu-92-instance-000000b8 terminated.
Jan 22 00:46:46 compute-0 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[247151]: [NOTICE]   (247155) : haproxy version is 2.8.14-c23fe91
Jan 22 00:46:46 compute-0 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[247151]: [NOTICE]   (247155) : path to executable is /usr/sbin/haproxy
Jan 22 00:46:46 compute-0 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[247151]: [WARNING]  (247155) : Exiting Master process...
Jan 22 00:46:46 compute-0 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[247151]: [ALERT]    (247155) : Current worker (247157) exited with code 143 (Terminated)
Jan 22 00:46:46 compute-0 neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a[247151]: [WARNING]  (247155) : All workers exited. Exiting... (0)
Jan 22 00:46:46 compute-0 systemd[1]: libpod-e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb.scope: Deactivated successfully.
Jan 22 00:46:46 compute-0 podman[247295]: 2026-01-22 00:46:46.135007747 +0000 UTC m=+0.045116181 container died e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:46:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb-userdata-shm.mount: Deactivated successfully.
Jan 22 00:46:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-6869a57ac628691638ade053aaa7ba01a3f5915cab067f157bfcf9a11da84e9f-merged.mount: Deactivated successfully.
Jan 22 00:46:46 compute-0 podman[247295]: 2026-01-22 00:46:46.167142247 +0000 UTC m=+0.077250671 container cleanup e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:46:46 compute-0 systemd[1]: libpod-conmon-e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb.scope: Deactivated successfully.
Jan 22 00:46:46 compute-0 podman[247323]: 2026-01-22 00:46:46.231365524 +0000 UTC m=+0.041477124 container remove e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.232 182939 INFO nova.virt.libvirt.driver [-] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Instance destroyed successfully.
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.233 182939 DEBUG nova.objects.instance [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lazy-loading 'resources' on Instance uuid 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.237 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[51b6ca44-19a5-4292-9e03-afca2eb0bd58]: (4, ('Thu Jan 22 12:46:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a (e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb)\ne33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb\nThu Jan 22 12:46:46 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a (e33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb)\ne33cb8b349ada3f8c763699069def27846d61b731c770ff5ba2faead0307b3fb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.239 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a080273b-0876-4cd7-823a-f52e3f0287cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.240 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83666af9-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.241 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:46 compute-0 kernel: tap83666af9-10: left promiscuous mode
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.249 182939 DEBUG nova.virt.libvirt.vif [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1092166276',display_name='tempest-TestGettingAddress-server-1092166276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1092166276',id=184,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL3/PLQY4lAQU2yFGaoAmqWPJI5565ofTauEAmPcwEncHglgrmt+9X41pDrGx2Hzo63wjxi644i8QnD2R87vFxz3Kmnkg4MUbe27S7AT4N98N34iBfOk+UwjPX/szWkvLg==',key_name='tempest-TestGettingAddress-2046948813',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:46:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='837db8748d074b3c9179b47d30e7a1d4',ramdisk_id='',reservation_id='r-q7vqm9ug',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-471729430',owner_user_name='tempest-TestGettingAddress-471729430-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:46:25Z,user_data=None,user_id='a8fd196423d94b309668ffd08655f7ed',uuid=3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.249 182939 DEBUG nova.network.os_vif_util [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converting VIF {"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.250 182939 DEBUG nova.network.os_vif_util [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:8f:28,bridge_name='br-int',has_traffic_filtering=True,id=e75bded3-9a57-4c88-9141-6d725875c555,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape75bded3-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.250 182939 DEBUG os_vif [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:8f:28,bridge_name='br-int',has_traffic_filtering=True,id=e75bded3-9a57-4c88-9141-6d725875c555,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape75bded3-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.252 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.253 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape75bded3-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.254 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.256 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.258 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5d0f3f3e-b4a4-4b82-9736-f016cc5115ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.259 182939 INFO os_vif [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:8f:28,bridge_name='br-int',has_traffic_filtering=True,id=e75bded3-9a57-4c88-9141-6d725875c555,network=Network(83666af9-15ce-4344-a623-7180c9b2515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape75bded3-9a')
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.260 182939 INFO nova.virt.libvirt.driver [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Deleting instance files /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5_del
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.261 182939 INFO nova.virt.libvirt.driver [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Deletion of /var/lib/nova/instances/3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5_del complete
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.284 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[984e25c3-dcf8-4b63-aed4-01b1cb4f7146]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.285 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[40b7dabb-2017-448e-a910-7db0a6ed3bca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.301 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[6efa83e2-f04f-48fe-80ae-20857591ccd6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 728735, 'reachable_time': 42437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247354, 'error': None, 'target': 'ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d83666af9\x2d15ce\x2d4344\x2da623\x2d7180c9b2515a.mount: Deactivated successfully.
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.305 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-83666af9-15ce-4344-a623-7180c9b2515a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:46:46 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:46:46.305 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[22e2ac4e-a774-4f1f-9039-1c6cdf531374]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.332 182939 INFO nova.compute.manager [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.333 182939 DEBUG oslo.service.loopingcall [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.333 182939 DEBUG nova.compute.manager [-] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:46:46 compute-0 nova_compute[182935]: 2026-01-22 00:46:46.334 182939 DEBUG nova.network.neutron [-] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.058 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.423 182939 DEBUG nova.network.neutron [-] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.444 182939 INFO nova.compute.manager [-] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Took 1.11 seconds to deallocate network for instance.
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.523 182939 DEBUG nova.compute.manager [req-16c1d54c-0685-40a7-b234-8bc7c9047c08 req-88083022-6421-4be4-a533-f645bfe37b5d 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Received event network-vif-deleted-e75bded3-9a57-4c88-9141-6d725875c555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.525 182939 DEBUG oslo_concurrency.lockutils [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.526 182939 DEBUG oslo_concurrency.lockutils [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.594 182939 DEBUG nova.compute.provider_tree [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.612 182939 DEBUG nova.scheduler.client.report [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.635 182939 DEBUG oslo_concurrency.lockutils [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.664 182939 INFO nova.scheduler.client.report [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Deleted allocations for instance 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5
Jan 22 00:46:47 compute-0 podman[247355]: 2026-01-22 00:46:47.693391727 +0000 UTC m=+0.063257645 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Jan 22 00:46:47 compute-0 podman[247356]: 2026-01-22 00:46:47.695052247 +0000 UTC m=+0.063985143 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.745 182939 DEBUG oslo_concurrency.lockutils [None req-6b2d3d7b-d833-499a-8a6c-db237536d0c6 a8fd196423d94b309668ffd08655f7ed 837db8748d074b3c9179b47d30e7a1d4 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.972 182939 DEBUG nova.compute.manager [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Received event network-vif-unplugged-e75bded3-9a57-4c88-9141-6d725875c555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.972 182939 DEBUG oslo_concurrency.lockutils [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.973 182939 DEBUG oslo_concurrency.lockutils [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.973 182939 DEBUG oslo_concurrency.lockutils [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.973 182939 DEBUG nova.compute.manager [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] No waiting events found dispatching network-vif-unplugged-e75bded3-9a57-4c88-9141-6d725875c555 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.974 182939 WARNING nova.compute.manager [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Received unexpected event network-vif-unplugged-e75bded3-9a57-4c88-9141-6d725875c555 for instance with vm_state deleted and task_state None.
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.974 182939 DEBUG nova.compute.manager [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Received event network-vif-plugged-e75bded3-9a57-4c88-9141-6d725875c555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.974 182939 DEBUG oslo_concurrency.lockutils [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.974 182939 DEBUG oslo_concurrency.lockutils [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.974 182939 DEBUG oslo_concurrency.lockutils [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.975 182939 DEBUG nova.compute.manager [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] No waiting events found dispatching network-vif-plugged-e75bded3-9a57-4c88-9141-6d725875c555 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:46:47 compute-0 nova_compute[182935]: 2026-01-22 00:46:47.975 182939 WARNING nova.compute.manager [req-aefc148d-665e-4c4e-a305-89654241195d req-3f985e66-0a72-4223-8837-cbec90fb6d09 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Received unexpected event network-vif-plugged-e75bded3-9a57-4c88-9141-6d725875c555 for instance with vm_state deleted and task_state None.
Jan 22 00:46:48 compute-0 nova_compute[182935]: 2026-01-22 00:46:48.050 182939 DEBUG nova.network.neutron [req-44b648d8-0eee-4f75-97e4-1e5caf202caa req-a883b88d-faca-4a76-8225-d3f49216a197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Updated VIF entry in instance network info cache for port e75bded3-9a57-4c88-9141-6d725875c555. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:46:48 compute-0 nova_compute[182935]: 2026-01-22 00:46:48.050 182939 DEBUG nova.network.neutron [req-44b648d8-0eee-4f75-97e4-1e5caf202caa req-a883b88d-faca-4a76-8225-d3f49216a197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Updating instance_info_cache with network_info: [{"id": "e75bded3-9a57-4c88-9141-6d725875c555", "address": "fa:16:3e:95:8f:28", "network": {"id": "83666af9-15ce-4344-a623-7180c9b2515a", "bridge": "br-int", "label": "tempest-network-smoke--333028759", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe95:8f28", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "837db8748d074b3c9179b47d30e7a1d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape75bded3-9a", "ovs_interfaceid": "e75bded3-9a57-4c88-9141-6d725875c555", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:46:48 compute-0 nova_compute[182935]: 2026-01-22 00:46:48.070 182939 DEBUG oslo_concurrency.lockutils [req-44b648d8-0eee-4f75-97e4-1e5caf202caa req-a883b88d-faca-4a76-8225-d3f49216a197 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:46:51 compute-0 nova_compute[182935]: 2026-01-22 00:46:51.254 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:52 compute-0 nova_compute[182935]: 2026-01-22 00:46:52.060 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:53 compute-0 sshd-session[247396]: Connection closed by authenticating user operator 188.166.69.60 port 52212 [preauth]
Jan 22 00:46:56 compute-0 nova_compute[182935]: 2026-01-22 00:46:56.257 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:57 compute-0 nova_compute[182935]: 2026-01-22 00:46:57.062 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:57 compute-0 nova_compute[182935]: 2026-01-22 00:46:57.447 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:46:57 compute-0 nova_compute[182935]: 2026-01-22 00:46:57.515 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:01 compute-0 nova_compute[182935]: 2026-01-22 00:47:01.230 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042806.230256, 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:47:01 compute-0 nova_compute[182935]: 2026-01-22 00:47:01.231 182939 INFO nova.compute.manager [-] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] VM Stopped (Lifecycle Event)
Jan 22 00:47:01 compute-0 nova_compute[182935]: 2026-01-22 00:47:01.256 182939 DEBUG nova.compute.manager [None req-8e69e0e3-5bbf-489f-8e5d-dc81615700c4 - - - - - -] [instance: 3aee85cf-aabc-45a7-9be5-bfcfdb5c70b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:47:01 compute-0 nova_compute[182935]: 2026-01-22 00:47:01.259 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:02 compute-0 nova_compute[182935]: 2026-01-22 00:47:02.062 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:03.247 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:03.248 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:03.248 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:05 compute-0 podman[247399]: 2026-01-22 00:47:05.693854533 +0000 UTC m=+0.070277563 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:47:05 compute-0 podman[247401]: 2026-01-22 00:47:05.705792279 +0000 UTC m=+0.069863313 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:47:05 compute-0 podman[247400]: 2026-01-22 00:47:05.731519365 +0000 UTC m=+0.101954882 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 22 00:47:06 compute-0 nova_compute[182935]: 2026-01-22 00:47:06.261 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:07 compute-0 nova_compute[182935]: 2026-01-22 00:47:07.064 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:10 compute-0 nova_compute[182935]: 2026-01-22 00:47:10.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:10 compute-0 nova_compute[182935]: 2026-01-22 00:47:10.828 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:10 compute-0 nova_compute[182935]: 2026-01-22 00:47:10.830 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:10 compute-0 nova_compute[182935]: 2026-01-22 00:47:10.830 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:10 compute-0 nova_compute[182935]: 2026-01-22 00:47:10.830 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:47:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:10.869 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:47:10 compute-0 nova_compute[182935]: 2026-01-22 00:47:10.869 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:10 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:10.870 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:47:10 compute-0 nova_compute[182935]: 2026-01-22 00:47:10.995 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:47:10 compute-0 nova_compute[182935]: 2026-01-22 00:47:10.996 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5717MB free_disk=73.12208938598633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:47:10 compute-0 nova_compute[182935]: 2026-01-22 00:47:10.997 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:10 compute-0 nova_compute[182935]: 2026-01-22 00:47:10.997 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:11 compute-0 nova_compute[182935]: 2026-01-22 00:47:11.073 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:47:11 compute-0 nova_compute[182935]: 2026-01-22 00:47:11.074 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:47:11 compute-0 nova_compute[182935]: 2026-01-22 00:47:11.099 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:47:11 compute-0 nova_compute[182935]: 2026-01-22 00:47:11.113 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:47:11 compute-0 nova_compute[182935]: 2026-01-22 00:47:11.138 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:47:11 compute-0 nova_compute[182935]: 2026-01-22 00:47:11.138 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:11 compute-0 nova_compute[182935]: 2026-01-22 00:47:11.263 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:11 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:11.872 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:12 compute-0 nova_compute[182935]: 2026-01-22 00:47:12.065 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:13 compute-0 podman[247472]: 2026-01-22 00:47:13.6982299 +0000 UTC m=+0.061228198 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:47:15 compute-0 nova_compute[182935]: 2026-01-22 00:47:15.139 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:15 compute-0 nova_compute[182935]: 2026-01-22 00:47:15.139 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:47:15 compute-0 nova_compute[182935]: 2026-01-22 00:47:15.139 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:47:15 compute-0 nova_compute[182935]: 2026-01-22 00:47:15.154 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:47:15 compute-0 nova_compute[182935]: 2026-01-22 00:47:15.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:15 compute-0 nova_compute[182935]: 2026-01-22 00:47:15.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:15 compute-0 nova_compute[182935]: 2026-01-22 00:47:15.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:47:16 compute-0 nova_compute[182935]: 2026-01-22 00:47:16.264 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:17 compute-0 nova_compute[182935]: 2026-01-22 00:47:17.067 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:18 compute-0 podman[247492]: 2026-01-22 00:47:18.698464243 +0000 UTC m=+0.063652405 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Jan 22 00:47:18 compute-0 podman[247493]: 2026-01-22 00:47:18.722750345 +0000 UTC m=+0.082595489 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:47:21 compute-0 nova_compute[182935]: 2026-01-22 00:47:21.267 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:22 compute-0 nova_compute[182935]: 2026-01-22 00:47:22.069 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:22 compute-0 nova_compute[182935]: 2026-01-22 00:47:22.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:24 compute-0 nova_compute[182935]: 2026-01-22 00:47:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:24 compute-0 nova_compute[182935]: 2026-01-22 00:47:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:25 compute-0 nova_compute[182935]: 2026-01-22 00:47:25.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:26 compute-0 nova_compute[182935]: 2026-01-22 00:47:26.268 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:27 compute-0 nova_compute[182935]: 2026-01-22 00:47:27.110 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:29 compute-0 nova_compute[182935]: 2026-01-22 00:47:29.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:47:31 compute-0 nova_compute[182935]: 2026-01-22 00:47:31.269 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:32 compute-0 nova_compute[182935]: 2026-01-22 00:47:32.111 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:32 compute-0 nova_compute[182935]: 2026-01-22 00:47:32.845 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "240ee2c3-f964-4de0-a440-9ce354fca15c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:32 compute-0 nova_compute[182935]: 2026-01-22 00:47:32.846 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:32 compute-0 nova_compute[182935]: 2026-01-22 00:47:32.861 182939 DEBUG nova.compute.manager [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:47:32 compute-0 nova_compute[182935]: 2026-01-22 00:47:32.955 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:32 compute-0 nova_compute[182935]: 2026-01-22 00:47:32.955 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:32 compute-0 nova_compute[182935]: 2026-01-22 00:47:32.960 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:47:32 compute-0 nova_compute[182935]: 2026-01-22 00:47:32.960 182939 INFO nova.compute.claims [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.077 182939 DEBUG nova.compute.provider_tree [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.092 182939 DEBUG nova.scheduler.client.report [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.126 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.126 182939 DEBUG nova.compute.manager [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.188 182939 DEBUG nova.compute.manager [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.189 182939 DEBUG nova.network.neutron [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.209 182939 INFO nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.228 182939 DEBUG nova.compute.manager [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.352 182939 DEBUG nova.compute.manager [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.353 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.353 182939 INFO nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Creating image(s)
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.354 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "/var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.354 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "/var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.355 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "/var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.368 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.427 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.428 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.429 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.439 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.494 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.495 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.527 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.528 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.529 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.587 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.588 182939 DEBUG nova.virt.disk.api [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Checking if we can resize image /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.588 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.643 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.644 182939 DEBUG nova.virt.disk.api [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Cannot resize image /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.644 182939 DEBUG nova.objects.instance [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lazy-loading 'migration_context' on Instance uuid 240ee2c3-f964-4de0-a440-9ce354fca15c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.657 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.658 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Ensure instance console log exists: /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.658 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.659 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:33 compute-0 nova_compute[182935]: 2026-01-22 00:47:33.659 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:34 compute-0 nova_compute[182935]: 2026-01-22 00:47:34.340 182939 DEBUG nova.network.neutron [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Successfully created port: b1f67e42-3c9d-47fc-bc33-b7f51901cb78 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:47:35 compute-0 nova_compute[182935]: 2026-01-22 00:47:35.164 182939 DEBUG nova.network.neutron [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Successfully updated port: b1f67e42-3c9d-47fc-bc33-b7f51901cb78 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:47:35 compute-0 nova_compute[182935]: 2026-01-22 00:47:35.180 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "refresh_cache-240ee2c3-f964-4de0-a440-9ce354fca15c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:47:35 compute-0 nova_compute[182935]: 2026-01-22 00:47:35.181 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquired lock "refresh_cache-240ee2c3-f964-4de0-a440-9ce354fca15c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:47:35 compute-0 nova_compute[182935]: 2026-01-22 00:47:35.181 182939 DEBUG nova.network.neutron [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:47:35 compute-0 nova_compute[182935]: 2026-01-22 00:47:35.254 182939 DEBUG nova.compute.manager [req-5cc454b4-e432-4e52-9c00-0c209ec75e70 req-2fba8665-ec1f-4c76-a738-df5189b5f3e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Received event network-changed-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:47:35 compute-0 nova_compute[182935]: 2026-01-22 00:47:35.255 182939 DEBUG nova.compute.manager [req-5cc454b4-e432-4e52-9c00-0c209ec75e70 req-2fba8665-ec1f-4c76-a738-df5189b5f3e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Refreshing instance network info cache due to event network-changed-b1f67e42-3c9d-47fc-bc33-b7f51901cb78. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:47:35 compute-0 nova_compute[182935]: 2026-01-22 00:47:35.255 182939 DEBUG oslo_concurrency.lockutils [req-5cc454b4-e432-4e52-9c00-0c209ec75e70 req-2fba8665-ec1f-4c76-a738-df5189b5f3e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-240ee2c3-f964-4de0-a440-9ce354fca15c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:47:35 compute-0 nova_compute[182935]: 2026-01-22 00:47:35.380 182939 DEBUG nova.network.neutron [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.271 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.313 182939 DEBUG nova.network.neutron [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Updating instance_info_cache with network_info: [{"id": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "address": "fa:16:3e:f4:1e:a8", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f67e42-3c", "ovs_interfaceid": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.331 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Releasing lock "refresh_cache-240ee2c3-f964-4de0-a440-9ce354fca15c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.332 182939 DEBUG nova.compute.manager [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Instance network_info: |[{"id": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "address": "fa:16:3e:f4:1e:a8", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f67e42-3c", "ovs_interfaceid": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.332 182939 DEBUG oslo_concurrency.lockutils [req-5cc454b4-e432-4e52-9c00-0c209ec75e70 req-2fba8665-ec1f-4c76-a738-df5189b5f3e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-240ee2c3-f964-4de0-a440-9ce354fca15c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.332 182939 DEBUG nova.network.neutron [req-5cc454b4-e432-4e52-9c00-0c209ec75e70 req-2fba8665-ec1f-4c76-a738-df5189b5f3e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Refreshing network info cache for port b1f67e42-3c9d-47fc-bc33-b7f51901cb78 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.335 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Start _get_guest_xml network_info=[{"id": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "address": "fa:16:3e:f4:1e:a8", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f67e42-3c", "ovs_interfaceid": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.339 182939 WARNING nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.343 182939 DEBUG nova.virt.libvirt.host [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.343 182939 DEBUG nova.virt.libvirt.host [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.346 182939 DEBUG nova.virt.libvirt.host [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.346 182939 DEBUG nova.virt.libvirt.host [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.348 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.348 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.348 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.349 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.349 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.349 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.349 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.350 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.350 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.350 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.350 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.350 182939 DEBUG nova.virt.hardware [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.354 182939 DEBUG nova.virt.libvirt.vif [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:47:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-2128242320',display_name='tempest-TestServerMultinode-server-2128242320',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-2128242320',id=187,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ae0051f15c46809f70ec5299cfb2c6',ramdisk_id='',reservation_id='r-0b8b6959',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-385846676',owner_user_name='tempest-TestServerMultinode-3858
46676-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:47:33Z,user_data=None,user_id='8fb6fa8c5dd241fb975d0e13ddb107f4',uuid=240ee2c3-f964-4de0-a440-9ce354fca15c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "address": "fa:16:3e:f4:1e:a8", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f67e42-3c", "ovs_interfaceid": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.355 182939 DEBUG nova.network.os_vif_util [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converting VIF {"id": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "address": "fa:16:3e:f4:1e:a8", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f67e42-3c", "ovs_interfaceid": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.355 182939 DEBUG nova.network.os_vif_util [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:1e:a8,bridge_name='br-int',has_traffic_filtering=True,id=b1f67e42-3c9d-47fc-bc33-b7f51901cb78,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f67e42-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.356 182939 DEBUG nova.objects.instance [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 240ee2c3-f964-4de0-a440-9ce354fca15c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.384 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:47:36 compute-0 nova_compute[182935]:   <uuid>240ee2c3-f964-4de0-a440-9ce354fca15c</uuid>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   <name>instance-000000bb</name>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <nova:name>tempest-TestServerMultinode-server-2128242320</nova:name>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:47:36</nova:creationTime>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:47:36 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:47:36 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:47:36 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:47:36 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:47:36 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:47:36 compute-0 nova_compute[182935]:         <nova:user uuid="8fb6fa8c5dd241fb975d0e13ddb107f4">tempest-TestServerMultinode-385846676-project-admin</nova:user>
Jan 22 00:47:36 compute-0 nova_compute[182935]:         <nova:project uuid="38ae0051f15c46809f70ec5299cfb2c6">tempest-TestServerMultinode-385846676</nova:project>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:47:36 compute-0 nova_compute[182935]:         <nova:port uuid="b1f67e42-3c9d-47fc-bc33-b7f51901cb78">
Jan 22 00:47:36 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <system>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <entry name="serial">240ee2c3-f964-4de0-a440-9ce354fca15c</entry>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <entry name="uuid">240ee2c3-f964-4de0-a440-9ce354fca15c</entry>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     </system>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   <os>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   </os>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   <features>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   </features>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk.config"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:f4:1e:a8"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <target dev="tapb1f67e42-3c"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/console.log" append="off"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <video>
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     </video>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:47:36 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:47:36 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:47:36 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:47:36 compute-0 nova_compute[182935]: </domain>
Jan 22 00:47:36 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.385 182939 DEBUG nova.compute.manager [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Preparing to wait for external event network-vif-plugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.386 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.386 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.386 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.387 182939 DEBUG nova.virt.libvirt.vif [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:47:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-2128242320',display_name='tempest-TestServerMultinode-server-2128242320',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-2128242320',id=187,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38ae0051f15c46809f70ec5299cfb2c6',ramdisk_id='',reservation_id='r-0b8b6959',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-385846676',owner_user_name='tempest-TestServerMult
inode-385846676-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:47:33Z,user_data=None,user_id='8fb6fa8c5dd241fb975d0e13ddb107f4',uuid=240ee2c3-f964-4de0-a440-9ce354fca15c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "address": "fa:16:3e:f4:1e:a8", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f67e42-3c", "ovs_interfaceid": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.387 182939 DEBUG nova.network.os_vif_util [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converting VIF {"id": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "address": "fa:16:3e:f4:1e:a8", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f67e42-3c", "ovs_interfaceid": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.388 182939 DEBUG nova.network.os_vif_util [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:1e:a8,bridge_name='br-int',has_traffic_filtering=True,id=b1f67e42-3c9d-47fc-bc33-b7f51901cb78,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f67e42-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.388 182939 DEBUG os_vif [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:1e:a8,bridge_name='br-int',has_traffic_filtering=True,id=b1f67e42-3c9d-47fc-bc33-b7f51901cb78,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f67e42-3c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.388 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.389 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.389 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.392 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.392 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1f67e42-3c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.393 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb1f67e42-3c, col_values=(('external_ids', {'iface-id': 'b1f67e42-3c9d-47fc-bc33-b7f51901cb78', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:1e:a8', 'vm-uuid': '240ee2c3-f964-4de0-a440-9ce354fca15c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:36 compute-0 NetworkManager[55139]: <info>  [1769042856.3954] manager: (tapb1f67e42-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/375)
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.394 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.399 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.401 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.402 182939 INFO os_vif [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:1e:a8,bridge_name='br-int',has_traffic_filtering=True,id=b1f67e42-3c9d-47fc-bc33-b7f51901cb78,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f67e42-3c')
Jan 22 00:47:36 compute-0 sshd-session[247549]: Connection closed by authenticating user operator 188.166.69.60 port 45084 [preauth]
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.483 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.483 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.483 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] No VIF found with MAC fa:16:3e:f4:1e:a8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:47:36 compute-0 nova_compute[182935]: 2026-01-22 00:47:36.484 182939 INFO nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Using config drive
Jan 22 00:47:36 compute-0 podman[247553]: 2026-01-22 00:47:36.674296971 +0000 UTC m=+0.048201876 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:47:36 compute-0 podman[247555]: 2026-01-22 00:47:36.679549297 +0000 UTC m=+0.048331259 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:47:36 compute-0 podman[247554]: 2026-01-22 00:47:36.69928704 +0000 UTC m=+0.070529811 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:47:37 compute-0 nova_compute[182935]: 2026-01-22 00:47:37.113 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:37 compute-0 nova_compute[182935]: 2026-01-22 00:47:37.672 182939 INFO nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Creating config drive at /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk.config
Jan 22 00:47:37 compute-0 nova_compute[182935]: 2026-01-22 00:47:37.676 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmn5e6i6h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:47:37 compute-0 nova_compute[182935]: 2026-01-22 00:47:37.810 182939 DEBUG oslo_concurrency.processutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmn5e6i6h" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:47:37 compute-0 kernel: tapb1f67e42-3c: entered promiscuous mode
Jan 22 00:47:37 compute-0 NetworkManager[55139]: <info>  [1769042857.8774] manager: (tapb1f67e42-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/376)
Jan 22 00:47:37 compute-0 ovn_controller[95047]: 2026-01-22T00:47:37Z|00756|binding|INFO|Claiming lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 for this chassis.
Jan 22 00:47:37 compute-0 ovn_controller[95047]: 2026-01-22T00:47:37Z|00757|binding|INFO|b1f67e42-3c9d-47fc-bc33-b7f51901cb78: Claiming fa:16:3e:f4:1e:a8 10.100.0.8
Jan 22 00:47:37 compute-0 nova_compute[182935]: 2026-01-22 00:47:37.877 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:37 compute-0 nova_compute[182935]: 2026-01-22 00:47:37.880 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.894 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:1e:a8 10.100.0.8'], port_security=['fa:16:3e:f4:1e:a8 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '240ee2c3-f964-4de0-a440-9ce354fca15c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae0051f15c46809f70ec5299cfb2c6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '09d74f28-816b-485a-851f-3d27c0c9555a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b227cd0-f221-40a4-86c3-ce27482fa492, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=b1f67e42-3c9d-47fc-bc33-b7f51901cb78) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.895 104408 INFO neutron.agent.ovn.metadata.agent [-] Port b1f67e42-3c9d-47fc-bc33-b7f51901cb78 in datapath c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 bound to our chassis
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.896 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.909 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f821158c-f9c3-4e39-a2c1-aec821f34cfb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.909 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc27f16e8-e1 in ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.911 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc27f16e8-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.912 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[35421998-6b0a-47df-a8c1-16a85fc8e3b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.912 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9944fc-6dec-4071-83a0-d01178093f4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:37 compute-0 systemd-udevd[247645]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:47:37 compute-0 systemd-machined[154182]: New machine qemu-93-instance-000000bb.
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.923 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa84427-3f34-48fe-99b0-b84c808f5a88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:37 compute-0 NetworkManager[55139]: <info>  [1769042857.9266] device (tapb1f67e42-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:47:37 compute-0 NetworkManager[55139]: <info>  [1769042857.9274] device (tapb1f67e42-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:47:37 compute-0 nova_compute[182935]: 2026-01-22 00:47:37.938 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:37 compute-0 systemd[1]: Started Virtual Machine qemu-93-instance-000000bb.
Jan 22 00:47:37 compute-0 nova_compute[182935]: 2026-01-22 00:47:37.944 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:37 compute-0 ovn_controller[95047]: 2026-01-22T00:47:37Z|00758|binding|INFO|Setting lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 ovn-installed in OVS
Jan 22 00:47:37 compute-0 ovn_controller[95047]: 2026-01-22T00:47:37Z|00759|binding|INFO|Setting lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 up in Southbound
Jan 22 00:47:37 compute-0 nova_compute[182935]: 2026-01-22 00:47:37.948 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.951 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[c56d2ae4-3281-4b7c-80a0-f46f0410caed]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.979 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[2caef3d6-21ca-437d-a5a8-602495678ca5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:37 compute-0 NetworkManager[55139]: <info>  [1769042857.9865] manager: (tapc27f16e8-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/377)
Jan 22 00:47:37 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:37.985 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d682bedb-e5b5-4c36-86a0-58cf34d7a702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.024 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[6ffb1a52-5ae8-4fc5-94fd-c5d0ee137a24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.030 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb7a94a-a062-4c2d-8571-dfaedb243c5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:38 compute-0 NetworkManager[55139]: <info>  [1769042858.0543] device (tapc27f16e8-e0): carrier: link connected
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.062 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[388a6550-9bc8-4674-aca9-43556f00cb86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.080 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[039abb31-5e26-410c-9216-c306ffa49d6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc27f16e8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:14:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736079, 'reachable_time': 15780, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247677, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.094 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7627c1f4-d817-480d-ba1e-3b46786c1768]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:144f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 736079, 'tstamp': 736079}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247678, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.120 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b07ef939-e811-4bc5-af05-be8bf1e5ac8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc27f16e8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:14:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736079, 'reachable_time': 15780, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247679, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.157 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[48c2819a-7a48-4760-a84f-e7b21d240164]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.216 182939 DEBUG nova.compute.manager [req-4ef39509-1256-46ff-9308-c10c02a6ef9a req-89ec6770-4a02-438a-be18-268b0b6eecd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Received event network-vif-plugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.216 182939 DEBUG oslo_concurrency.lockutils [req-4ef39509-1256-46ff-9308-c10c02a6ef9a req-89ec6770-4a02-438a-be18-268b0b6eecd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.216 182939 DEBUG oslo_concurrency.lockutils [req-4ef39509-1256-46ff-9308-c10c02a6ef9a req-89ec6770-4a02-438a-be18-268b0b6eecd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.217 182939 DEBUG oslo_concurrency.lockutils [req-4ef39509-1256-46ff-9308-c10c02a6ef9a req-89ec6770-4a02-438a-be18-268b0b6eecd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.217 182939 DEBUG nova.compute.manager [req-4ef39509-1256-46ff-9308-c10c02a6ef9a req-89ec6770-4a02-438a-be18-268b0b6eecd3 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Processing event network-vif-plugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.219 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b70a4a25-8696-4e6a-95c7-2070fe9723d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.220 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc27f16e8-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.220 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.221 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc27f16e8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.223 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:38 compute-0 kernel: tapc27f16e8-e0: entered promiscuous mode
Jan 22 00:47:38 compute-0 NetworkManager[55139]: <info>  [1769042858.2239] manager: (tapc27f16e8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/378)
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.225 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.226 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc27f16e8-e0, col_values=(('external_ids', {'iface-id': '8c4d0320-cbc0-4761-8fbc-cd4251890b14'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.227 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:38 compute-0 ovn_controller[95047]: 2026-01-22T00:47:38Z|00760|binding|INFO|Releasing lport 8c4d0320-cbc0-4761-8fbc-cd4251890b14 from this chassis (sb_readonly=0)
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.228 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.228 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.229 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ccfb7ed1-f259-403d-a19f-efcd5bd14d1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.229 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.pid.haproxy
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:47:38 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:38.230 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'env', 'PROCESS_TAG=haproxy-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.238 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.292 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042858.2923481, 240ee2c3-f964-4de0-a440-9ce354fca15c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.293 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] VM Started (Lifecycle Event)
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.295 182939 DEBUG nova.compute.manager [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.299 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.302 182939 INFO nova.virt.libvirt.driver [-] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Instance spawned successfully.
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.302 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.318 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.320 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.329 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.329 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.330 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.330 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.331 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.331 182939 DEBUG nova.virt.libvirt.driver [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.336 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.337 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042858.2933354, 240ee2c3-f964-4de0-a440-9ce354fca15c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.337 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] VM Paused (Lifecycle Event)
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.358 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.362 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042858.3005598, 240ee2c3-f964-4de0-a440-9ce354fca15c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.362 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] VM Resumed (Lifecycle Event)
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.399 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.402 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.428 182939 INFO nova.compute.manager [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Took 5.08 seconds to spawn the instance on the hypervisor.
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.429 182939 DEBUG nova.compute.manager [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.437 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:47:38 compute-0 podman[247717]: 2026-01-22 00:47:38.58851554 +0000 UTC m=+0.045461929 container create 16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 22 00:47:38 compute-0 systemd[1]: Started libpod-conmon-16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d.scope.
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.644 182939 DEBUG nova.network.neutron [req-5cc454b4-e432-4e52-9c00-0c209ec75e70 req-2fba8665-ec1f-4c76-a738-df5189b5f3e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Updated VIF entry in instance network info cache for port b1f67e42-3c9d-47fc-bc33-b7f51901cb78. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.645 182939 DEBUG nova.network.neutron [req-5cc454b4-e432-4e52-9c00-0c209ec75e70 req-2fba8665-ec1f-4c76-a738-df5189b5f3e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Updating instance_info_cache with network_info: [{"id": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "address": "fa:16:3e:f4:1e:a8", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f67e42-3c", "ovs_interfaceid": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:47:38 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:47:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/580f63a4bca923fc658c70ccc4df4c38aa8159d8064ab23be6787a9490a1ddae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:47:38 compute-0 podman[247717]: 2026-01-22 00:47:38.565196762 +0000 UTC m=+0.022143151 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:47:38 compute-0 podman[247717]: 2026-01-22 00:47:38.668219229 +0000 UTC m=+0.125165628 container init 16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 00:47:38 compute-0 podman[247717]: 2026-01-22 00:47:38.673243969 +0000 UTC m=+0.130190338 container start 16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:47:38 compute-0 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[247731]: [NOTICE]   (247735) : New worker (247737) forked
Jan 22 00:47:38 compute-0 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[247731]: [NOTICE]   (247735) : Loading success.
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.728 182939 DEBUG oslo_concurrency.lockutils [req-5cc454b4-e432-4e52-9c00-0c209ec75e70 req-2fba8665-ec1f-4c76-a738-df5189b5f3e2 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-240ee2c3-f964-4de0-a440-9ce354fca15c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.772 182939 INFO nova.compute.manager [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Took 5.86 seconds to build instance.
Jan 22 00:47:38 compute-0 nova_compute[182935]: 2026-01-22 00:47:38.791 182939 DEBUG oslo_concurrency.lockutils [None req-f0feb78e-c18b-4319-a8fe-20117359f9ce 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.945s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:40 compute-0 nova_compute[182935]: 2026-01-22 00:47:40.391 182939 DEBUG nova.compute.manager [req-74b2b7a7-8650-437e-a465-b6fe0755d6e8 req-f2939826-5c60-4fed-a1ee-e953181fda2e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Received event network-vif-plugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:47:40 compute-0 nova_compute[182935]: 2026-01-22 00:47:40.392 182939 DEBUG oslo_concurrency.lockutils [req-74b2b7a7-8650-437e-a465-b6fe0755d6e8 req-f2939826-5c60-4fed-a1ee-e953181fda2e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:40 compute-0 nova_compute[182935]: 2026-01-22 00:47:40.392 182939 DEBUG oslo_concurrency.lockutils [req-74b2b7a7-8650-437e-a465-b6fe0755d6e8 req-f2939826-5c60-4fed-a1ee-e953181fda2e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:40 compute-0 nova_compute[182935]: 2026-01-22 00:47:40.393 182939 DEBUG oslo_concurrency.lockutils [req-74b2b7a7-8650-437e-a465-b6fe0755d6e8 req-f2939826-5c60-4fed-a1ee-e953181fda2e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:40 compute-0 nova_compute[182935]: 2026-01-22 00:47:40.393 182939 DEBUG nova.compute.manager [req-74b2b7a7-8650-437e-a465-b6fe0755d6e8 req-f2939826-5c60-4fed-a1ee-e953181fda2e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] No waiting events found dispatching network-vif-plugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:47:40 compute-0 nova_compute[182935]: 2026-01-22 00:47:40.393 182939 WARNING nova.compute.manager [req-74b2b7a7-8650-437e-a465-b6fe0755d6e8 req-f2939826-5c60-4fed-a1ee-e953181fda2e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Received unexpected event network-vif-plugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 for instance with vm_state active and task_state None.
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.260 182939 DEBUG oslo_concurrency.lockutils [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "240ee2c3-f964-4de0-a440-9ce354fca15c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.260 182939 DEBUG oslo_concurrency.lockutils [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.261 182939 DEBUG oslo_concurrency.lockutils [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.263 182939 DEBUG oslo_concurrency.lockutils [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.264 182939 DEBUG oslo_concurrency.lockutils [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.278 182939 INFO nova.compute.manager [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Terminating instance
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.290 182939 DEBUG nova.compute.manager [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 22 00:47:41 compute-0 kernel: tapb1f67e42-3c (unregistering): left promiscuous mode
Jan 22 00:47:41 compute-0 NetworkManager[55139]: <info>  [1769042861.3179] device (tapb1f67e42-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00761|binding|INFO|Releasing lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 from this chassis (sb_readonly=0)
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00762|binding|INFO|Setting lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 down in Southbound
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00763|binding|INFO|Removing iface tapb1f67e42-3c ovn-installed in OVS
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.334 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.335 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.344 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:1e:a8 10.100.0.8'], port_security=['fa:16:3e:f4:1e:a8 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '240ee2c3-f964-4de0-a440-9ce354fca15c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae0051f15c46809f70ec5299cfb2c6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '09d74f28-816b-485a-851f-3d27c0c9555a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b227cd0-f221-40a4-86c3-ce27482fa492, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=b1f67e42-3c9d-47fc-bc33-b7f51901cb78) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.346 104408 INFO neutron.agent.ovn.metadata.agent [-] Port b1f67e42-3c9d-47fc-bc33-b7f51901cb78 in datapath c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 unbound from our chassis
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.348 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.349 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[8e114e22-3c64-49f7-8ecc-5cfa5a8eeb0b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.350 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 namespace which is not needed anymore
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.356 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000bb.scope: Deactivated successfully.
Jan 22 00:47:41 compute-0 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000bb.scope: Consumed 3.311s CPU time.
Jan 22 00:47:41 compute-0 systemd-machined[154182]: Machine qemu-93-instance-000000bb terminated.
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.395 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[247731]: [NOTICE]   (247735) : haproxy version is 2.8.14-c23fe91
Jan 22 00:47:41 compute-0 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[247731]: [NOTICE]   (247735) : path to executable is /usr/sbin/haproxy
Jan 22 00:47:41 compute-0 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[247731]: [WARNING]  (247735) : Exiting Master process...
Jan 22 00:47:41 compute-0 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[247731]: [ALERT]    (247735) : Current worker (247737) exited with code 143 (Terminated)
Jan 22 00:47:41 compute-0 neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2[247731]: [WARNING]  (247735) : All workers exited. Exiting... (0)
Jan 22 00:47:41 compute-0 systemd[1]: libpod-16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d.scope: Deactivated successfully.
Jan 22 00:47:41 compute-0 podman[247770]: 2026-01-22 00:47:41.49515639 +0000 UTC m=+0.046094135 container died 16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 22 00:47:41 compute-0 systemd-udevd[247750]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:47:41 compute-0 kernel: tapb1f67e42-3c: entered promiscuous mode
Jan 22 00:47:41 compute-0 NetworkManager[55139]: <info>  [1769042861.5114] manager: (tapb1f67e42-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/379)
Jan 22 00:47:41 compute-0 kernel: tapb1f67e42-3c (unregistering): left promiscuous mode
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.514 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00764|binding|INFO|Claiming lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 for this chassis.
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00765|binding|INFO|b1f67e42-3c9d-47fc-bc33-b7f51901cb78: Claiming fa:16:3e:f4:1e:a8 10.100.0.8
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.525 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:1e:a8 10.100.0.8'], port_security=['fa:16:3e:f4:1e:a8 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '240ee2c3-f964-4de0-a440-9ce354fca15c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae0051f15c46809f70ec5299cfb2c6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '09d74f28-816b-485a-851f-3d27c0c9555a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b227cd0-f221-40a4-86c3-ce27482fa492, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=b1f67e42-3c9d-47fc-bc33-b7f51901cb78) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00766|binding|INFO|Setting lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 ovn-installed in OVS
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00767|binding|INFO|Setting lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 up in Southbound
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00768|binding|INFO|Releasing lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 from this chassis (sb_readonly=1)
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00769|if_status|INFO|Dropped 7 log messages in last 2681 seconds (most recently, 2681 seconds ago) due to excessive rate
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00770|if_status|INFO|Not setting lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 down as sb is readonly
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.541 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00771|binding|INFO|Removing iface tapb1f67e42-3c ovn-installed in OVS
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.544 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d-userdata-shm.mount: Deactivated successfully.
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00772|binding|INFO|Releasing lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 from this chassis (sb_readonly=0)
Jan 22 00:47:41 compute-0 ovn_controller[95047]: 2026-01-22T00:47:41Z|00773|binding|INFO|Setting lport b1f67e42-3c9d-47fc-bc33-b7f51901cb78 down in Southbound
Jan 22 00:47:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-580f63a4bca923fc658c70ccc4df4c38aa8159d8064ab23be6787a9490a1ddae-merged.mount: Deactivated successfully.
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.555 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.559 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:1e:a8 10.100.0.8'], port_security=['fa:16:3e:f4:1e:a8 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '240ee2c3-f964-4de0-a440-9ce354fca15c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38ae0051f15c46809f70ec5299cfb2c6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '09d74f28-816b-485a-851f-3d27c0c9555a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b227cd0-f221-40a4-86c3-ce27482fa492, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=b1f67e42-3c9d-47fc-bc33-b7f51901cb78) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:47:41 compute-0 podman[247770]: 2026-01-22 00:47:41.565171336 +0000 UTC m=+0.116109071 container cleanup 16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.569 182939 INFO nova.virt.libvirt.driver [-] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Instance destroyed successfully.
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.570 182939 DEBUG nova.objects.instance [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lazy-loading 'resources' on Instance uuid 240ee2c3-f964-4de0-a440-9ce354fca15c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:47:41 compute-0 systemd[1]: libpod-conmon-16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d.scope: Deactivated successfully.
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.590 182939 DEBUG nova.virt.libvirt.vif [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:47:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-2128242320',display_name='tempest-TestServerMultinode-server-2128242320',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-2128242320',id=187,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:47:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='38ae0051f15c46809f70ec5299cfb2c6',ramdisk_id='',reservation_id='r-0b8b6959',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-385846676',owner_user_name='tempest-TestServerMultinode-385846676-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:47:38Z,user_data=None,user_id='8fb6fa8c5dd241fb975d0e13ddb107f4',uuid=240ee2c3-f964-4de0-a440-9ce354fca15c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "address": "fa:16:3e:f4:1e:a8", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f67e42-3c", "ovs_interfaceid": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.590 182939 DEBUG nova.network.os_vif_util [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converting VIF {"id": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "address": "fa:16:3e:f4:1e:a8", "network": {"id": "c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2", "bridge": "br-int", "label": "tempest-TestServerMultinode-485848203-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "991deac0047e45c598f6b4e9ae868ad3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f67e42-3c", "ovs_interfaceid": "b1f67e42-3c9d-47fc-bc33-b7f51901cb78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.591 182939 DEBUG nova.network.os_vif_util [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:1e:a8,bridge_name='br-int',has_traffic_filtering=True,id=b1f67e42-3c9d-47fc-bc33-b7f51901cb78,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f67e42-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.591 182939 DEBUG os_vif [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:1e:a8,bridge_name='br-int',has_traffic_filtering=True,id=b1f67e42-3c9d-47fc-bc33-b7f51901cb78,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f67e42-3c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.594 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.594 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1f67e42-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.596 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.597 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.600 182939 INFO os_vif [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:1e:a8,bridge_name='br-int',has_traffic_filtering=True,id=b1f67e42-3c9d-47fc-bc33-b7f51901cb78,network=Network(c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb1f67e42-3c')
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.601 182939 INFO nova.virt.libvirt.driver [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Deleting instance files /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c_del
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.601 182939 INFO nova.virt.libvirt.driver [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Deletion of /var/lib/nova/instances/240ee2c3-f964-4de0-a440-9ce354fca15c_del complete
Jan 22 00:47:41 compute-0 podman[247809]: 2026-01-22 00:47:41.631007752 +0000 UTC m=+0.042530279 container remove 16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.636 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3dda59-4718-4bfd-9545-9f8e05713d42]: (4, ('Thu Jan 22 12:47:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 (16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d)\n16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d\nThu Jan 22 12:47:41 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 (16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d)\n16942ab4d0a9130412a67c1a77d18fafb0053fb037f68ec967e91167e27b0a8d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.638 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[fb8b86ec-a9af-440e-924e-3a9fb671d2a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.639 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc27f16e8-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.641 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 kernel: tapc27f16e8-e0: left promiscuous mode
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.657 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.661 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[426eda2a-6ba6-483a-bf8c-25f65b880701]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.687 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[5a61d930-5248-4558-9f51-834cf889a3d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.689 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[7d746add-cfad-4a57-9a01-781d42943461]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.707 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3037f85d-629b-4877-b974-9ee5cc47a00f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 736071, 'reachable_time': 31362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247824, 'error': None, 'target': 'ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.710 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.710 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c52ab7-2baf-43e0-9aae-17ba6ca8e8b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:41 compute-0 systemd[1]: run-netns-ovnmeta\x2dc27f16e8\x2de7ea\x2d4ce6\x2d8fc8\x2d52a4d97170f2.mount: Deactivated successfully.
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.711 104408 INFO neutron.agent.ovn.metadata.agent [-] Port b1f67e42-3c9d-47fc-bc33-b7f51901cb78 in datapath c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 unbound from our chassis
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.713 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.713 182939 INFO nova.compute.manager [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.714 182939 DEBUG oslo.service.loopingcall [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.714 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[a0dba313-262d-4dba-9bba-8a637555dd7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.714 182939 DEBUG nova.compute.manager [-] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 22 00:47:41 compute-0 nova_compute[182935]: 2026-01-22 00:47:41.715 182939 DEBUG nova.network.neutron [-] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.715 104408 INFO neutron.agent.ovn.metadata.agent [-] Port b1f67e42-3c9d-47fc-bc33-b7f51901cb78 in datapath c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2 unbound from our chassis
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.715 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c27f16e8-e7ea-4ce6-8fc8-52a4d97170f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:47:41 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:47:41.716 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f89e3411-a63f-47a3-a9cb-be4c96aae807]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:47:42 compute-0 nova_compute[182935]: 2026-01-22 00:47:42.115 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:44 compute-0 podman[247825]: 2026-01-22 00:47:44.701689219 +0000 UTC m=+0.070345016 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 22 00:47:45 compute-0 nova_compute[182935]: 2026-01-22 00:47:45.729 182939 DEBUG nova.compute.manager [req-b47bb2fb-3634-46f8-96a6-105a9f0066b4 req-f5f56d3f-eb58-46fc-921b-d3d1c4b432ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Received event network-vif-unplugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:47:45 compute-0 nova_compute[182935]: 2026-01-22 00:47:45.730 182939 DEBUG oslo_concurrency.lockutils [req-b47bb2fb-3634-46f8-96a6-105a9f0066b4 req-f5f56d3f-eb58-46fc-921b-d3d1c4b432ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:45 compute-0 nova_compute[182935]: 2026-01-22 00:47:45.730 182939 DEBUG oslo_concurrency.lockutils [req-b47bb2fb-3634-46f8-96a6-105a9f0066b4 req-f5f56d3f-eb58-46fc-921b-d3d1c4b432ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:45 compute-0 nova_compute[182935]: 2026-01-22 00:47:45.730 182939 DEBUG oslo_concurrency.lockutils [req-b47bb2fb-3634-46f8-96a6-105a9f0066b4 req-f5f56d3f-eb58-46fc-921b-d3d1c4b432ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:45 compute-0 nova_compute[182935]: 2026-01-22 00:47:45.731 182939 DEBUG nova.compute.manager [req-b47bb2fb-3634-46f8-96a6-105a9f0066b4 req-f5f56d3f-eb58-46fc-921b-d3d1c4b432ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] No waiting events found dispatching network-vif-unplugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:47:45 compute-0 nova_compute[182935]: 2026-01-22 00:47:45.731 182939 DEBUG nova.compute.manager [req-b47bb2fb-3634-46f8-96a6-105a9f0066b4 req-f5f56d3f-eb58-46fc-921b-d3d1c4b432ef 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Received event network-vif-unplugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 22 00:47:45 compute-0 nova_compute[182935]: 2026-01-22 00:47:45.981 182939 DEBUG nova.network.neutron [-] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:47:46 compute-0 nova_compute[182935]: 2026-01-22 00:47:46.016 182939 INFO nova.compute.manager [-] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Took 4.30 seconds to deallocate network for instance.
Jan 22 00:47:46 compute-0 nova_compute[182935]: 2026-01-22 00:47:46.115 182939 DEBUG oslo_concurrency.lockutils [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:46 compute-0 nova_compute[182935]: 2026-01-22 00:47:46.116 182939 DEBUG oslo_concurrency.lockutils [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:46 compute-0 nova_compute[182935]: 2026-01-22 00:47:46.199 182939 DEBUG nova.compute.provider_tree [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:47:46 compute-0 nova_compute[182935]: 2026-01-22 00:47:46.224 182939 DEBUG nova.scheduler.client.report [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:47:46 compute-0 nova_compute[182935]: 2026-01-22 00:47:46.260 182939 DEBUG oslo_concurrency.lockutils [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:46 compute-0 nova_compute[182935]: 2026-01-22 00:47:46.294 182939 INFO nova.scheduler.client.report [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Deleted allocations for instance 240ee2c3-f964-4de0-a440-9ce354fca15c
Jan 22 00:47:46 compute-0 nova_compute[182935]: 2026-01-22 00:47:46.597 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:47 compute-0 nova_compute[182935]: 2026-01-22 00:47:47.117 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:49 compute-0 podman[247847]: 2026-01-22 00:47:49.719635015 +0000 UTC m=+0.080673713 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Jan 22 00:47:49 compute-0 podman[247848]: 2026-01-22 00:47:49.734704146 +0000 UTC m=+0.096449970 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:47:50 compute-0 nova_compute[182935]: 2026-01-22 00:47:50.099 182939 DEBUG nova.compute.manager [req-3afa7488-cf1e-4f15-9304-3412758f36f5 req-dfc10c79-7e30-4f7e-ab6e-2f8a022fee5e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Received event network-vif-deleted-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:47:50 compute-0 nova_compute[182935]: 2026-01-22 00:47:50.099 182939 DEBUG nova.compute.manager [req-3afa7488-cf1e-4f15-9304-3412758f36f5 req-dfc10c79-7e30-4f7e-ab6e-2f8a022fee5e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Received event network-vif-plugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:47:50 compute-0 nova_compute[182935]: 2026-01-22 00:47:50.100 182939 DEBUG oslo_concurrency.lockutils [req-3afa7488-cf1e-4f15-9304-3412758f36f5 req-dfc10c79-7e30-4f7e-ab6e-2f8a022fee5e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:47:50 compute-0 nova_compute[182935]: 2026-01-22 00:47:50.100 182939 DEBUG oslo_concurrency.lockutils [req-3afa7488-cf1e-4f15-9304-3412758f36f5 req-dfc10c79-7e30-4f7e-ab6e-2f8a022fee5e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:47:50 compute-0 nova_compute[182935]: 2026-01-22 00:47:50.100 182939 DEBUG oslo_concurrency.lockutils [req-3afa7488-cf1e-4f15-9304-3412758f36f5 req-dfc10c79-7e30-4f7e-ab6e-2f8a022fee5e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:50 compute-0 nova_compute[182935]: 2026-01-22 00:47:50.100 182939 DEBUG nova.compute.manager [req-3afa7488-cf1e-4f15-9304-3412758f36f5 req-dfc10c79-7e30-4f7e-ab6e-2f8a022fee5e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] No waiting events found dispatching network-vif-plugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:47:50 compute-0 nova_compute[182935]: 2026-01-22 00:47:50.101 182939 WARNING nova.compute.manager [req-3afa7488-cf1e-4f15-9304-3412758f36f5 req-dfc10c79-7e30-4f7e-ab6e-2f8a022fee5e 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Received unexpected event network-vif-plugged-b1f67e42-3c9d-47fc-bc33-b7f51901cb78 for instance with vm_state deleted and task_state None.
Jan 22 00:47:50 compute-0 nova_compute[182935]: 2026-01-22 00:47:50.207 182939 DEBUG oslo_concurrency.lockutils [None req-c1835f64-9541-40e8-ad48-2f3405dda086 8fb6fa8c5dd241fb975d0e13ddb107f4 38ae0051f15c46809f70ec5299cfb2c6 - - default default] Lock "240ee2c3-f964-4de0-a440-9ce354fca15c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.947s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:47:50 compute-0 sshd-session[247845]: Invalid user ubuntu from 203.83.238.251 port 36842
Jan 22 00:47:50 compute-0 sshd-session[247845]: Received disconnect from 203.83.238.251 port 36842:11:  [preauth]
Jan 22 00:47:50 compute-0 sshd-session[247845]: Disconnected from invalid user ubuntu 203.83.238.251 port 36842 [preauth]
Jan 22 00:47:51 compute-0 nova_compute[182935]: 2026-01-22 00:47:51.600 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:52 compute-0 nova_compute[182935]: 2026-01-22 00:47:52.164 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:56 compute-0 nova_compute[182935]: 2026-01-22 00:47:56.568 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042861.5670633, 240ee2c3-f964-4de0-a440-9ce354fca15c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:47:56 compute-0 nova_compute[182935]: 2026-01-22 00:47:56.568 182939 INFO nova.compute.manager [-] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] VM Stopped (Lifecycle Event)
Jan 22 00:47:56 compute-0 nova_compute[182935]: 2026-01-22 00:47:56.604 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:57 compute-0 nova_compute[182935]: 2026-01-22 00:47:57.167 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:47:58 compute-0 nova_compute[182935]: 2026-01-22 00:47:58.629 182939 DEBUG nova.compute.manager [None req-3d353cd0-6ba4-4193-b8c8-bbfff23b0782 - - - - - -] [instance: 240ee2c3-f964-4de0-a440-9ce354fca15c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:48:01 compute-0 nova_compute[182935]: 2026-01-22 00:48:01.607 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:02 compute-0 nova_compute[182935]: 2026-01-22 00:48:02.205 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:02.389 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:48:02 compute-0 nova_compute[182935]: 2026-01-22 00:48:02.389 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:02.390 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:48:02 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:02.391 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:48:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:03.247 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:03.248 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:03.248 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:06 compute-0 nova_compute[182935]: 2026-01-22 00:48:06.030 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:06 compute-0 nova_compute[182935]: 2026-01-22 00:48:06.610 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:07 compute-0 nova_compute[182935]: 2026-01-22 00:48:07.207 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:07 compute-0 podman[247888]: 2026-01-22 00:48:07.68654392 +0000 UTC m=+0.055111520 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:48:07 compute-0 podman[247890]: 2026-01-22 00:48:07.701717344 +0000 UTC m=+0.060178552 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:48:07 compute-0 podman[247889]: 2026-01-22 00:48:07.720158735 +0000 UTC m=+0.083761507 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 22 00:48:10 compute-0 nova_compute[182935]: 2026-01-22 00:48:10.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:10 compute-0 nova_compute[182935]: 2026-01-22 00:48:10.847 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:10 compute-0 nova_compute[182935]: 2026-01-22 00:48:10.848 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:10 compute-0 nova_compute[182935]: 2026-01-22 00:48:10.848 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:10 compute-0 nova_compute[182935]: 2026-01-22 00:48:10.848 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:48:10 compute-0 nova_compute[182935]: 2026-01-22 00:48:10.996 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:48:10 compute-0 nova_compute[182935]: 2026-01-22 00:48:10.997 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5702MB free_disk=73.1220817565918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:48:10 compute-0 nova_compute[182935]: 2026-01-22 00:48:10.997 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:10 compute-0 nova_compute[182935]: 2026-01-22 00:48:10.998 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:11 compute-0 nova_compute[182935]: 2026-01-22 00:48:11.125 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:48:11 compute-0 nova_compute[182935]: 2026-01-22 00:48:11.126 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:48:11 compute-0 nova_compute[182935]: 2026-01-22 00:48:11.157 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:48:11 compute-0 nova_compute[182935]: 2026-01-22 00:48:11.174 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:48:11 compute-0 nova_compute[182935]: 2026-01-22 00:48:11.205 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:48:11 compute-0 nova_compute[182935]: 2026-01-22 00:48:11.205 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:12 compute-0 nova_compute[182935]: 2026-01-22 00:48:12.210 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:12 compute-0 nova_compute[182935]: 2026-01-22 00:48:12.212 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:12 compute-0 nova_compute[182935]: 2026-01-22 00:48:12.212 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:48:12 compute-0 nova_compute[182935]: 2026-01-22 00:48:12.212 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:48:12 compute-0 nova_compute[182935]: 2026-01-22 00:48:12.381 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:12 compute-0 nova_compute[182935]: 2026-01-22 00:48:12.382 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:48:15 compute-0 nova_compute[182935]: 2026-01-22 00:48:15.207 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:15 compute-0 nova_compute[182935]: 2026-01-22 00:48:15.208 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:48:15 compute-0 nova_compute[182935]: 2026-01-22 00:48:15.208 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:48:15 compute-0 nova_compute[182935]: 2026-01-22 00:48:15.234 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:48:15 compute-0 podman[247963]: 2026-01-22 00:48:15.678756865 +0000 UTC m=+0.048467771 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 22 00:48:15 compute-0 nova_compute[182935]: 2026-01-22 00:48:15.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:15 compute-0 nova_compute[182935]: 2026-01-22 00:48:15.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:48:16 compute-0 nova_compute[182935]: 2026-01-22 00:48:16.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:17 compute-0 nova_compute[182935]: 2026-01-22 00:48:17.383 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:19 compute-0 sshd-session[247982]: Connection closed by authenticating user operator 188.166.69.60 port 40036 [preauth]
Jan 22 00:48:20 compute-0 podman[247984]: 2026-01-22 00:48:20.671904179 +0000 UTC m=+0.049272301 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Jan 22 00:48:20 compute-0 podman[247985]: 2026-01-22 00:48:20.679506541 +0000 UTC m=+0.053721957 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:48:22 compute-0 nova_compute[182935]: 2026-01-22 00:48:22.385 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:22 compute-0 nova_compute[182935]: 2026-01-22 00:48:22.386 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:22 compute-0 nova_compute[182935]: 2026-01-22 00:48:22.386 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:48:22 compute-0 nova_compute[182935]: 2026-01-22 00:48:22.386 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:48:22 compute-0 nova_compute[182935]: 2026-01-22 00:48:22.387 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:48:22 compute-0 nova_compute[182935]: 2026-01-22 00:48:22.387 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:48:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:48:23 compute-0 nova_compute[182935]: 2026-01-22 00:48:23.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:24 compute-0 nova_compute[182935]: 2026-01-22 00:48:24.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:26 compute-0 nova_compute[182935]: 2026-01-22 00:48:26.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:26 compute-0 nova_compute[182935]: 2026-01-22 00:48:26.851 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:26 compute-0 nova_compute[182935]: 2026-01-22 00:48:26.852 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:27 compute-0 nova_compute[182935]: 2026-01-22 00:48:27.064 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:27 compute-0 nova_compute[182935]: 2026-01-22 00:48:27.388 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:31 compute-0 nova_compute[182935]: 2026-01-22 00:48:31.816 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:32 compute-0 nova_compute[182935]: 2026-01-22 00:48:32.390 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:33 compute-0 nova_compute[182935]: 2026-01-22 00:48:33.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:37 compute-0 nova_compute[182935]: 2026-01-22 00:48:37.392 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:38 compute-0 podman[248029]: 2026-01-22 00:48:38.703041472 +0000 UTC m=+0.059018425 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:48:38 compute-0 podman[248027]: 2026-01-22 00:48:38.709429644 +0000 UTC m=+0.074996986 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:48:38 compute-0 podman[248028]: 2026-01-22 00:48:38.747839594 +0000 UTC m=+0.097673329 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:48:40 compute-0 nova_compute[182935]: 2026-01-22 00:48:40.865 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:40 compute-0 nova_compute[182935]: 2026-01-22 00:48:40.866 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:48:41 compute-0 ovn_controller[95047]: 2026-01-22T00:48:41Z|00774|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 22 00:48:42 compute-0 nova_compute[182935]: 2026-01-22 00:48:42.394 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:42 compute-0 nova_compute[182935]: 2026-01-22 00:48:42.396 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:42 compute-0 nova_compute[182935]: 2026-01-22 00:48:42.397 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:48:42 compute-0 nova_compute[182935]: 2026-01-22 00:48:42.398 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:48:42 compute-0 nova_compute[182935]: 2026-01-22 00:48:42.432 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:42 compute-0 nova_compute[182935]: 2026-01-22 00:48:42.433 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:48:46 compute-0 podman[248103]: 2026-01-22 00:48:46.693226708 +0000 UTC m=+0.067980748 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:48:47 compute-0 nova_compute[182935]: 2026-01-22 00:48:47.433 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:47 compute-0 nova_compute[182935]: 2026-01-22 00:48:47.435 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:51 compute-0 podman[248123]: 2026-01-22 00:48:51.700736397 +0000 UTC m=+0.063415140 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 22 00:48:51 compute-0 podman[248122]: 2026-01-22 00:48:51.713898971 +0000 UTC m=+0.081698226 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Jan 22 00:48:52 compute-0 nova_compute[182935]: 2026-01-22 00:48:52.436 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:52 compute-0 nova_compute[182935]: 2026-01-22 00:48:52.438 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:52 compute-0 nova_compute[182935]: 2026-01-22 00:48:52.438 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 00:48:52 compute-0 nova_compute[182935]: 2026-01-22 00:48:52.438 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:48:52 compute-0 nova_compute[182935]: 2026-01-22 00:48:52.477 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:52 compute-0 nova_compute[182935]: 2026-01-22 00:48:52.478 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.079 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.079 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.095 182939 DEBUG nova.compute.manager [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.191 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.192 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.200 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.201 182939 INFO nova.compute.claims [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Claim successful on node compute-0.ctlplane.example.com
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.321 182939 DEBUG nova.compute.provider_tree [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.336 182939 DEBUG nova.scheduler.client.report [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.359 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.360 182939 DEBUG nova.compute.manager [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.418 182939 DEBUG nova.compute.manager [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.419 182939 DEBUG nova.network.neutron [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.437 182939 INFO nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.453 182939 DEBUG nova.compute.manager [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.580 182939 DEBUG nova.compute.manager [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.581 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.581 182939 INFO nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Creating image(s)
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.582 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.582 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.583 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.595 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.649 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.650 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.651 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.662 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.713 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.715 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.745 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474,backing_fmt=raw /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.746 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.746 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.798 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.799 182939 DEBUG nova.virt.disk.api [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Checking if we can resize image /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.800 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.814 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.814 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.830 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.853 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.854 182939 DEBUG nova.virt.disk.api [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Cannot resize image /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.854 182939 DEBUG nova.objects.instance [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'migration_context' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.868 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.868 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Ensure instance console log exists: /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.869 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.869 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.869 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:53 compute-0 nova_compute[182935]: 2026-01-22 00:48:53.979 182939 DEBUG nova.policy [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f96259409b0747b6ac866ebe79dcf160', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9d05c3cf062a4f6ebb5083b35d40286e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 22 00:48:55 compute-0 nova_compute[182935]: 2026-01-22 00:48:55.628 182939 DEBUG nova.network.neutron [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Successfully created port: ee3eb2da-6644-4c49-952b-d4fd939223d9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 22 00:48:56 compute-0 nova_compute[182935]: 2026-01-22 00:48:56.858 182939 DEBUG nova.network.neutron [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Successfully updated port: ee3eb2da-6644-4c49-952b-d4fd939223d9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 22 00:48:56 compute-0 nova_compute[182935]: 2026-01-22 00:48:56.887 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:48:56 compute-0 nova_compute[182935]: 2026-01-22 00:48:56.888 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquired lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:48:56 compute-0 nova_compute[182935]: 2026-01-22 00:48:56.888 182939 DEBUG nova.network.neutron [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:48:57 compute-0 nova_compute[182935]: 2026-01-22 00:48:57.017 182939 DEBUG nova.compute.manager [req-568e0bbe-62d2-47e9-8525-02894973d6b6 req-71712c0a-3815-4ea9-b4d4-48e9a7827080 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-changed-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:48:57 compute-0 nova_compute[182935]: 2026-01-22 00:48:57.018 182939 DEBUG nova.compute.manager [req-568e0bbe-62d2-47e9-8525-02894973d6b6 req-71712c0a-3815-4ea9-b4d4-48e9a7827080 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Refreshing instance network info cache due to event network-changed-ee3eb2da-6644-4c49-952b-d4fd939223d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:48:57 compute-0 nova_compute[182935]: 2026-01-22 00:48:57.018 182939 DEBUG oslo_concurrency.lockutils [req-568e0bbe-62d2-47e9-8525-02894973d6b6 req-71712c0a-3815-4ea9-b4d4-48e9a7827080 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:48:57 compute-0 nova_compute[182935]: 2026-01-22 00:48:57.437 182939 DEBUG nova.network.neutron [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 22 00:48:57 compute-0 nova_compute[182935]: 2026-01-22 00:48:57.478 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.459 182939 DEBUG nova.network.neutron [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updating instance_info_cache with network_info: [{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.484 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Releasing lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.485 182939 DEBUG nova.compute.manager [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Instance network_info: |[{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.485 182939 DEBUG oslo_concurrency.lockutils [req-568e0bbe-62d2-47e9-8525-02894973d6b6 req-71712c0a-3815-4ea9-b4d4-48e9a7827080 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.485 182939 DEBUG nova.network.neutron [req-568e0bbe-62d2-47e9-8525-02894973d6b6 req-71712c0a-3815-4ea9-b4d4-48e9a7827080 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Refreshing network info cache for port ee3eb2da-6644-4c49-952b-d4fd939223d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.488 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Start _get_guest_xml network_info=[{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'encryption_options': None, 'encryption_secret_uuid': None, 'image_id': '9cd98f02-a505-4543-a7ad-04e9a377b456'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.492 182939 WARNING nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.496 182939 DEBUG nova.virt.libvirt.host [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.497 182939 DEBUG nova.virt.libvirt.host [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.504 182939 DEBUG nova.virt.libvirt.host [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.504 182939 DEBUG nova.virt.libvirt.host [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.505 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.506 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-21T23:42:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c3389c03-89c4-4ff5-9e03-1a99d41713d4',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-21T23:42:50Z,direct_url=<?>,disk_format='qcow2',id=9cd98f02-a505-4543-a7ad-04e9a377b456,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='43b70c4e837343859ac97b6b2397ba1b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-21T23:42:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.506 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.506 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.507 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.507 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.507 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.507 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.507 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.508 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.508 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.508 182939 DEBUG nova.virt.hardware [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.512 182939 DEBUG nova.virt.libvirt.vif [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:48:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-229309001',display_name='tempest-TestShelveInstance-server-229309001',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-229309001',id=188,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOgA5ZzfKbfeycJFDDkwZB6fgaNy8JupDR87kTs8Udzuy0Hdmm9UofiFEnY5+8X6yn18pBjwI+0V0Npbtx57RV5bhVB+OuvvOnztrIeeQxpfJd0y9DR5TAlaf0wFpgpxfw==',key_name='tempest-TestShelveInstance-1678551224',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d05c3cf062a4f6ebb5083b35d40286e',ramdisk_id='',reservation_id='r-urkgbm2c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1694031060',owner_user_name='tempest-TestShelveInstance-1694031060-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:48:53Z,user_data=None,user_id='f96259409b0747b6ac866ebe79dcf160',uuid=07d46432-944a-49b9-9862-65d4e541e750,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.512 182939 DEBUG nova.network.os_vif_util [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converting VIF {"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.513 182939 DEBUG nova.network.os_vif_util [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.514 182939 DEBUG nova.objects.instance [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'pci_devices' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.535 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] End _get_guest_xml xml=<domain type="kvm">
Jan 22 00:48:58 compute-0 nova_compute[182935]:   <uuid>07d46432-944a-49b9-9862-65d4e541e750</uuid>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   <name>instance-000000bc</name>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   <memory>131072</memory>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   <vcpu>1</vcpu>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   <metadata>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <nova:name>tempest-TestShelveInstance-server-229309001</nova:name>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <nova:creationTime>2026-01-22 00:48:58</nova:creationTime>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <nova:flavor name="m1.nano">
Jan 22 00:48:58 compute-0 nova_compute[182935]:         <nova:memory>128</nova:memory>
Jan 22 00:48:58 compute-0 nova_compute[182935]:         <nova:disk>1</nova:disk>
Jan 22 00:48:58 compute-0 nova_compute[182935]:         <nova:swap>0</nova:swap>
Jan 22 00:48:58 compute-0 nova_compute[182935]:         <nova:ephemeral>0</nova:ephemeral>
Jan 22 00:48:58 compute-0 nova_compute[182935]:         <nova:vcpus>1</nova:vcpus>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       </nova:flavor>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <nova:owner>
Jan 22 00:48:58 compute-0 nova_compute[182935]:         <nova:user uuid="f96259409b0747b6ac866ebe79dcf160">tempest-TestShelveInstance-1694031060-project-member</nova:user>
Jan 22 00:48:58 compute-0 nova_compute[182935]:         <nova:project uuid="9d05c3cf062a4f6ebb5083b35d40286e">tempest-TestShelveInstance-1694031060</nova:project>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       </nova:owner>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <nova:root type="image" uuid="9cd98f02-a505-4543-a7ad-04e9a377b456"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <nova:ports>
Jan 22 00:48:58 compute-0 nova_compute[182935]:         <nova:port uuid="ee3eb2da-6644-4c49-952b-d4fd939223d9">
Jan 22 00:48:58 compute-0 nova_compute[182935]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:         </nova:port>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       </nova:ports>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     </nova:instance>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   </metadata>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   <sysinfo type="smbios">
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <system>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <entry name="manufacturer">RDO</entry>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <entry name="product">OpenStack Compute</entry>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <entry name="serial">07d46432-944a-49b9-9862-65d4e541e750</entry>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <entry name="uuid">07d46432-944a-49b9-9862-65d4e541e750</entry>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <entry name="family">Virtual Machine</entry>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     </system>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   </sysinfo>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   <os>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <boot dev="hd"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <smbios mode="sysinfo"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   </os>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   <features>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <acpi/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <apic/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <vmcoreinfo/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   </features>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   <clock offset="utc">
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <timer name="pit" tickpolicy="delay"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <timer name="hpet" present="no"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   </clock>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   <cpu mode="custom" match="exact">
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <model>Nehalem</model>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <topology sockets="1" cores="1" threads="1"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   </cpu>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   <devices>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <disk type="file" device="disk">
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <target dev="vda" bus="virtio"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <disk type="file" device="cdrom">
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <driver name="qemu" type="raw" cache="none"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <source file="/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.config"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <target dev="sda" bus="sata"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     </disk>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <interface type="ethernet">
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <mac address="fa:16:3e:6d:46:7b"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <driver name="vhost" rx_queue_size="512"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <mtu size="1442"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <target dev="tapee3eb2da-66"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     </interface>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <serial type="pty">
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <log file="/var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/console.log" append="off"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     </serial>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <video>
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <model type="virtio"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     </video>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <input type="tablet" bus="usb"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <rng model="virtio">
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <backend model="random">/dev/urandom</backend>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     </rng>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="pci" model="pcie-root-port"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <controller type="usb" index="0"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     <memballoon model="virtio">
Jan 22 00:48:58 compute-0 nova_compute[182935]:       <stats period="10"/>
Jan 22 00:48:58 compute-0 nova_compute[182935]:     </memballoon>
Jan 22 00:48:58 compute-0 nova_compute[182935]:   </devices>
Jan 22 00:48:58 compute-0 nova_compute[182935]: </domain>
Jan 22 00:48:58 compute-0 nova_compute[182935]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.536 182939 DEBUG nova.compute.manager [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Preparing to wait for external event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.536 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.537 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.537 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.538 182939 DEBUG nova.virt.libvirt.vif [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-22T00:48:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-229309001',display_name='tempest-TestShelveInstance-server-229309001',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-229309001',id=188,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOgA5ZzfKbfeycJFDDkwZB6fgaNy8JupDR87kTs8Udzuy0Hdmm9UofiFEnY5+8X6yn18pBjwI+0V0Npbtx57RV5bhVB+OuvvOnztrIeeQxpfJd0y9DR5TAlaf0wFpgpxfw==',key_name='tempest-TestShelveInstance-1678551224',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d05c3cf062a4f6ebb5083b35d40286e',ramdisk_id='',reservation_id='r-urkgbm2c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1694031060',owner_user_name='tempest-TestShelveInstance-1694031060-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-22T00:48:53Z,user_data=None,user_id='f96259409b0747b6ac866ebe79dcf160',uuid=07d46432-944a-49b9-9862-65d4e541e750,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.538 182939 DEBUG nova.network.os_vif_util [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converting VIF {"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.539 182939 DEBUG nova.network.os_vif_util [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.540 182939 DEBUG os_vif [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.540 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.541 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.541 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.545 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.546 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee3eb2da-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.546 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapee3eb2da-66, col_values=(('external_ids', {'iface-id': 'ee3eb2da-6644-4c49-952b-d4fd939223d9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:46:7b', 'vm-uuid': '07d46432-944a-49b9-9862-65d4e541e750'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:48:58 compute-0 NetworkManager[55139]: <info>  [1769042938.5969] manager: (tapee3eb2da-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/380)
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.596 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.598 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.603 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.603 182939 INFO os_vif [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66')
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.672 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.673 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.673 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] No VIF found with MAC fa:16:3e:6d:46:7b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 22 00:48:58 compute-0 nova_compute[182935]: 2026-01-22 00:48:58.673 182939 INFO nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Using config drive
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.108 182939 INFO nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Creating config drive at /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.config
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.113 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpimv6z9ei execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.246 182939 DEBUG oslo_concurrency.processutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpimv6z9ei" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:48:59 compute-0 kernel: tapee3eb2da-66: entered promiscuous mode
Jan 22 00:48:59 compute-0 NetworkManager[55139]: <info>  [1769042939.3240] manager: (tapee3eb2da-66): new Tun device (/org/freedesktop/NetworkManager/Devices/381)
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.323 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:59 compute-0 ovn_controller[95047]: 2026-01-22T00:48:59Z|00775|binding|INFO|Claiming lport ee3eb2da-6644-4c49-952b-d4fd939223d9 for this chassis.
Jan 22 00:48:59 compute-0 ovn_controller[95047]: 2026-01-22T00:48:59Z|00776|binding|INFO|ee3eb2da-6644-4c49-952b-d4fd939223d9: Claiming fa:16:3e:6d:46:7b 10.100.0.4
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.332 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.347 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:46:7b 10.100.0.4'], port_security=['fa:16:3e:6d:46:7b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '07d46432-944a-49b9-9862-65d4e541e750', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d05c3cf062a4f6ebb5083b35d40286e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '58135b34-ec05-462e-8563-87deac605474', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d85ad74f-8de0-427b-84fe-c5395634422f, chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=ee3eb2da-6644-4c49-952b-d4fd939223d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.348 104408 INFO neutron.agent.ovn.metadata.agent [-] Port ee3eb2da-6644-4c49-952b-d4fd939223d9 in datapath 7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 bound to our chassis
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.349 104408 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7f04cd1e-fc0c-46c7-9d75-03b818ec99e2
Jan 22 00:48:59 compute-0 systemd-udevd[248190]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.368 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[9072a92e-95be-495f-a669-dc77a2dde88e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.369 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7f04cd1e-f1 in ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 22 00:48:59 compute-0 NetworkManager[55139]: <info>  [1769042939.3702] device (tapee3eb2da-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 22 00:48:59 compute-0 NetworkManager[55139]: <info>  [1769042939.3711] device (tapee3eb2da-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.373 211917 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7f04cd1e-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.373 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[bea0f990-6587-48fd-beb4-ed675c4cf12d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.375 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[450f3c58-9865-4b71-8ff5-0a72f8dcbba5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 systemd-machined[154182]: New machine qemu-94-instance-000000bc.
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.387 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[cadb3fc1-219e-49f1-9a42-0defeaf6b4b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.396 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:59 compute-0 ovn_controller[95047]: 2026-01-22T00:48:59Z|00777|binding|INFO|Setting lport ee3eb2da-6644-4c49-952b-d4fd939223d9 ovn-installed in OVS
Jan 22 00:48:59 compute-0 ovn_controller[95047]: 2026-01-22T00:48:59Z|00778|binding|INFO|Setting lport ee3eb2da-6644-4c49-952b-d4fd939223d9 up in Southbound
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.400 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.402 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[36f49382-daa7-4cc6-8e6b-7b0927d5cda1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 systemd[1]: Started Virtual Machine qemu-94-instance-000000bc.
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.430 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d6065dba-82f5-4bce-9acb-928df465a1d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 NetworkManager[55139]: <info>  [1769042939.4381] manager: (tap7f04cd1e-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/382)
Jan 22 00:48:59 compute-0 systemd-udevd[248195]: Network interface NamePolicy= disabled on kernel command line.
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.437 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[dd1b74e5-8fd8-44fd-98d5-aef6cfdd93e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.472 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b6b33022-8c79-4f13-89de-8d3255218eb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.476 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[b57cb501-1a25-44d7-bc70-431ba3deb281]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 NetworkManager[55139]: <info>  [1769042939.4983] device (tap7f04cd1e-f0): carrier: link connected
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.503 211938 DEBUG oslo.privsep.daemon [-] privsep: reply[d913b211-ad73-4e6c-a9c7-de56834140ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.523 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[79c8edc1-e962-4639-bba5-1cc1e5f7c871]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f04cd1e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:ee:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 244], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 744223, 'reachable_time': 42646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248226, 'error': None, 'target': 'ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.541 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[d0cd2d53-7829-4699-8a7c-67fd647e4d87]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:eec2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 744223, 'tstamp': 744223}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248227, 'error': None, 'target': 'ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.561 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[73a99857-1de0-4b85-a6fa-71b3671d7dfd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7f04cd1e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:de:ee:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 244], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 744223, 'reachable_time': 42646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248228, 'error': None, 'target': 'ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.597 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[ceef506a-d24a-4b7e-a874-595ce46f2ac0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.661 182939 DEBUG nova.compute.manager [req-2584c63f-91fe-405d-9bd1-23441fb6f574 req-ca41df40-b3a8-481f-ae11-8abfdefe7396 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.661 182939 DEBUG oslo_concurrency.lockutils [req-2584c63f-91fe-405d-9bd1-23441fb6f574 req-ca41df40-b3a8-481f-ae11-8abfdefe7396 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.662 182939 DEBUG oslo_concurrency.lockutils [req-2584c63f-91fe-405d-9bd1-23441fb6f574 req-ca41df40-b3a8-481f-ae11-8abfdefe7396 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.662 182939 DEBUG oslo_concurrency.lockutils [req-2584c63f-91fe-405d-9bd1-23441fb6f574 req-ca41df40-b3a8-481f-ae11-8abfdefe7396 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.662 182939 DEBUG nova.compute.manager [req-2584c63f-91fe-405d-9bd1-23441fb6f574 req-ca41df40-b3a8-481f-ae11-8abfdefe7396 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Processing event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.668 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[441611a4-310f-4b88-bc3c-e9b308677cbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.669 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f04cd1e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.669 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.670 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f04cd1e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:48:59 compute-0 NetworkManager[55139]: <info>  [1769042939.7057] manager: (tap7f04cd1e-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/383)
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.705 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:59 compute-0 kernel: tap7f04cd1e-f0: entered promiscuous mode
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.707 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.708 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7f04cd1e-f0, col_values=(('external_ids', {'iface-id': 'f2779172-88a7-44be-ac20-583c93a461c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.709 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:59 compute-0 ovn_controller[95047]: 2026-01-22T00:48:59Z|00779|binding|INFO|Releasing lport f2779172-88a7-44be-ac20-583c93a461c0 from this chassis (sb_readonly=0)
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.710 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.711 104408 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7f04cd1e-fc0c-46c7-9d75-03b818ec99e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7f04cd1e-fc0c-46c7-9d75-03b818ec99e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.711 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[4647e351-86c1-40c6-a7e7-2403a34f16f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.712 104408 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: global
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     log         /dev/log local0 debug
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     log-tag     haproxy-metadata-proxy-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     user        root
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     group       root
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     maxconn     1024
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     pidfile     /var/lib/neutron/external/pids/7f04cd1e-fc0c-46c7-9d75-03b818ec99e2.pid.haproxy
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     daemon
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: defaults
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     log global
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     mode http
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     option httplog
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     option dontlognull
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     option http-server-close
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     option forwardfor
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     retries                 3
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     timeout http-request    30s
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     timeout connect         30s
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     timeout client          32s
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     timeout server          32s
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     timeout http-keep-alive 30s
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: listen listener
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     bind 169.254.169.254:80
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     server metadata /var/lib/neutron/metadata_proxy
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:     http-request add-header X-OVN-Network-ID 7f04cd1e-fc0c-46c7-9d75-03b818ec99e2
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 22 00:48:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:48:59.712 104408 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'env', 'PROCESS_TAG=haproxy-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7f04cd1e-fc0c-46c7-9d75-03b818ec99e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.721 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.945 182939 DEBUG nova.network.neutron [req-568e0bbe-62d2-47e9-8525-02894973d6b6 req-71712c0a-3815-4ea9-b4d4-48e9a7827080 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updated VIF entry in instance network info cache for port ee3eb2da-6644-4c49-952b-d4fd939223d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.946 182939 DEBUG nova.network.neutron [req-568e0bbe-62d2-47e9-8525-02894973d6b6 req-71712c0a-3815-4ea9-b4d4-48e9a7827080 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updating instance_info_cache with network_info: [{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:48:59 compute-0 nova_compute[182935]: 2026-01-22 00:48:59.962 182939 DEBUG oslo_concurrency.lockutils [req-568e0bbe-62d2-47e9-8525-02894973d6b6 req-71712c0a-3815-4ea9-b4d4-48e9a7827080 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:49:00 compute-0 podman[248265]: 2026-01-22 00:49:00.05629712 +0000 UTC m=+0.048331637 container create c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.062 182939 DEBUG nova.compute.manager [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.064 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042940.0632586, 07d46432-944a-49b9-9862-65d4e541e750 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.065 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] VM Started (Lifecycle Event)
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.069 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.074 182939 INFO nova.virt.libvirt.driver [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Instance spawned successfully.
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.074 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.088 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.096 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.098 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.099 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.099 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.100 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.100 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.100 182939 DEBUG nova.virt.libvirt.driver [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 22 00:49:00 compute-0 systemd[1]: Started libpod-conmon-c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da.scope.
Jan 22 00:49:00 compute-0 podman[248265]: 2026-01-22 00:49:00.028346421 +0000 UTC m=+0.020380968 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.129 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.130 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042940.0634074, 07d46432-944a-49b9-9862-65d4e541e750 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.130 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] VM Paused (Lifecycle Event)
Jan 22 00:49:00 compute-0 systemd[1]: Started libcrun container.
Jan 22 00:49:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e102038a28ca53e3664a5039eb0db938c01931fc96e62108a3d3bad2da53f610/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 22 00:49:00 compute-0 podman[248265]: 2026-01-22 00:49:00.147230967 +0000 UTC m=+0.139265504 container init c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.150 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:49:00 compute-0 podman[248265]: 2026-01-22 00:49:00.153660402 +0000 UTC m=+0.145694919 container start c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.154 182939 DEBUG nova.virt.driver [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] Emitting event <LifecycleEvent: 1769042940.0687044, 07d46432-944a-49b9-9862-65d4e541e750 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.154 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] VM Resumed (Lifecycle Event)
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.172 182939 INFO nova.compute.manager [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Took 6.59 seconds to spawn the instance on the hypervisor.
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.173 182939 DEBUG nova.compute.manager [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.177 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:49:00 compute-0 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[248281]: [NOTICE]   (248285) : New worker (248287) forked
Jan 22 00:49:00 compute-0 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[248281]: [NOTICE]   (248285) : Loading success.
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.185 182939 DEBUG nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.226 182939 INFO nova.compute.manager [None req-9b9af8c5-cd08-4ad5-b97f-d1faeb6ddfb6 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.279 182939 INFO nova.compute.manager [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Took 7.13 seconds to build instance.
Jan 22 00:49:00 compute-0 nova_compute[182935]: 2026-01-22 00:49:00.298 182939 DEBUG oslo_concurrency.lockutils [None req-fb793c5e-52be-4f93-93b8-a4d7c53fcd81 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:01 compute-0 nova_compute[182935]: 2026-01-22 00:49:01.811 182939 DEBUG nova.compute.manager [req-b6aa842e-2484-4f32-8eb0-72180573d839 req-e0395794-3a27-430b-9333-ad8140141586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:49:01 compute-0 nova_compute[182935]: 2026-01-22 00:49:01.813 182939 DEBUG oslo_concurrency.lockutils [req-b6aa842e-2484-4f32-8eb0-72180573d839 req-e0395794-3a27-430b-9333-ad8140141586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:49:01 compute-0 nova_compute[182935]: 2026-01-22 00:49:01.813 182939 DEBUG oslo_concurrency.lockutils [req-b6aa842e-2484-4f32-8eb0-72180573d839 req-e0395794-3a27-430b-9333-ad8140141586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:49:01 compute-0 nova_compute[182935]: 2026-01-22 00:49:01.814 182939 DEBUG oslo_concurrency.lockutils [req-b6aa842e-2484-4f32-8eb0-72180573d839 req-e0395794-3a27-430b-9333-ad8140141586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:01 compute-0 nova_compute[182935]: 2026-01-22 00:49:01.814 182939 DEBUG nova.compute.manager [req-b6aa842e-2484-4f32-8eb0-72180573d839 req-e0395794-3a27-430b-9333-ad8140141586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] No waiting events found dispatching network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:49:01 compute-0 nova_compute[182935]: 2026-01-22 00:49:01.815 182939 WARNING nova.compute.manager [req-b6aa842e-2484-4f32-8eb0-72180573d839 req-e0395794-3a27-430b-9333-ad8140141586 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received unexpected event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 for instance with vm_state active and task_state None.
Jan 22 00:49:02 compute-0 nova_compute[182935]: 2026-01-22 00:49:02.481 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:02 compute-0 sshd-session[248296]: Connection closed by authenticating user operator 188.166.69.60 port 55364 [preauth]
Jan 22 00:49:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:03.249 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:49:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:03.249 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:49:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:03.250 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:03 compute-0 nova_compute[182935]: 2026-01-22 00:49:03.596 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:07 compute-0 nova_compute[182935]: 2026-01-22 00:49:07.484 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:08 compute-0 nova_compute[182935]: 2026-01-22 00:49:08.600 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:08 compute-0 nova_compute[182935]: 2026-01-22 00:49:08.682 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:08 compute-0 NetworkManager[55139]: <info>  [1769042948.6920] manager: (patch-br-int-to-provnet-99bcf688-d143-4306-a11c-956e66fbe227): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/384)
Jan 22 00:49:08 compute-0 NetworkManager[55139]: <info>  [1769042948.6928] manager: (patch-provnet-99bcf688-d143-4306-a11c-956e66fbe227-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/385)
Jan 22 00:49:08 compute-0 nova_compute[182935]: 2026-01-22 00:49:08.772 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:08 compute-0 ovn_controller[95047]: 2026-01-22T00:49:08Z|00780|binding|INFO|Releasing lport f2779172-88a7-44be-ac20-583c93a461c0 from this chassis (sb_readonly=0)
Jan 22 00:49:08 compute-0 nova_compute[182935]: 2026-01-22 00:49:08.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:09 compute-0 nova_compute[182935]: 2026-01-22 00:49:09.231 182939 DEBUG nova.compute.manager [req-a0e83f58-b811-4a29-9dc0-10fea6f90168 req-fd67e042-21cd-44b9-8f41-cd5d5d9674de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-changed-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:49:09 compute-0 nova_compute[182935]: 2026-01-22 00:49:09.232 182939 DEBUG nova.compute.manager [req-a0e83f58-b811-4a29-9dc0-10fea6f90168 req-fd67e042-21cd-44b9-8f41-cd5d5d9674de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Refreshing instance network info cache due to event network-changed-ee3eb2da-6644-4c49-952b-d4fd939223d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:49:09 compute-0 nova_compute[182935]: 2026-01-22 00:49:09.232 182939 DEBUG oslo_concurrency.lockutils [req-a0e83f58-b811-4a29-9dc0-10fea6f90168 req-fd67e042-21cd-44b9-8f41-cd5d5d9674de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:49:09 compute-0 nova_compute[182935]: 2026-01-22 00:49:09.232 182939 DEBUG oslo_concurrency.lockutils [req-a0e83f58-b811-4a29-9dc0-10fea6f90168 req-fd67e042-21cd-44b9-8f41-cd5d5d9674de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:49:09 compute-0 nova_compute[182935]: 2026-01-22 00:49:09.233 182939 DEBUG nova.network.neutron [req-a0e83f58-b811-4a29-9dc0-10fea6f90168 req-fd67e042-21cd-44b9-8f41-cd5d5d9674de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Refreshing network info cache for port ee3eb2da-6644-4c49-952b-d4fd939223d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:49:09 compute-0 podman[248299]: 2026-01-22 00:49:09.687765641 +0000 UTC m=+0.054818263 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:49:09 compute-0 podman[248301]: 2026-01-22 00:49:09.6914653 +0000 UTC m=+0.050972381 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:49:09 compute-0 podman[248300]: 2026-01-22 00:49:09.748144418 +0000 UTC m=+0.112163078 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller)
Jan 22 00:49:10 compute-0 nova_compute[182935]: 2026-01-22 00:49:10.639 182939 DEBUG nova.network.neutron [req-a0e83f58-b811-4a29-9dc0-10fea6f90168 req-fd67e042-21cd-44b9-8f41-cd5d5d9674de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updated VIF entry in instance network info cache for port ee3eb2da-6644-4c49-952b-d4fd939223d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:49:10 compute-0 nova_compute[182935]: 2026-01-22 00:49:10.639 182939 DEBUG nova.network.neutron [req-a0e83f58-b811-4a29-9dc0-10fea6f90168 req-fd67e042-21cd-44b9-8f41-cd5d5d9674de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updating instance_info_cache with network_info: [{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:49:10 compute-0 nova_compute[182935]: 2026-01-22 00:49:10.670 182939 DEBUG oslo_concurrency.lockutils [req-a0e83f58-b811-4a29-9dc0-10fea6f90168 req-fd67e042-21cd-44b9-8f41-cd5d5d9674de 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:49:11 compute-0 nova_compute[182935]: 2026-01-22 00:49:11.809 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:11 compute-0 nova_compute[182935]: 2026-01-22 00:49:11.835 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:49:11 compute-0 nova_compute[182935]: 2026-01-22 00:49:11.835 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:49:11 compute-0 nova_compute[182935]: 2026-01-22 00:49:11.836 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:11 compute-0 nova_compute[182935]: 2026-01-22 00:49:11.836 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:49:11 compute-0 nova_compute[182935]: 2026-01-22 00:49:11.913 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:49:11 compute-0 nova_compute[182935]: 2026-01-22 00:49:11.976 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:49:11 compute-0 nova_compute[182935]: 2026-01-22 00:49:11.977 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.034 182939 DEBUG oslo_concurrency.processutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.162 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.164 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5545MB free_disk=73.0982894897461GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.164 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.165 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.255 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Instance 07d46432-944a-49b9-9862-65d4e541e750 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.256 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.256 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.290 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.304 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.321 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.322 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:12 compute-0 nova_compute[182935]: 2026-01-22 00:49:12.488 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:13 compute-0 ovn_controller[95047]: 2026-01-22T00:49:13Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6d:46:7b 10.100.0.4
Jan 22 00:49:13 compute-0 ovn_controller[95047]: 2026-01-22T00:49:13Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6d:46:7b 10.100.0.4
Jan 22 00:49:13 compute-0 nova_compute[182935]: 2026-01-22 00:49:13.603 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:16 compute-0 nova_compute[182935]: 2026-01-22 00:49:16.306 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:16 compute-0 nova_compute[182935]: 2026-01-22 00:49:16.307 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:49:16 compute-0 nova_compute[182935]: 2026-01-22 00:49:16.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:16 compute-0 nova_compute[182935]: 2026-01-22 00:49:16.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:49:16 compute-0 nova_compute[182935]: 2026-01-22 00:49:16.797 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:49:17 compute-0 nova_compute[182935]: 2026-01-22 00:49:17.430 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:49:17 compute-0 nova_compute[182935]: 2026-01-22 00:49:17.430 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquired lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:49:17 compute-0 nova_compute[182935]: 2026-01-22 00:49:17.430 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 22 00:49:17 compute-0 nova_compute[182935]: 2026-01-22 00:49:17.430 182939 DEBUG nova.objects.instance [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:49:17 compute-0 nova_compute[182935]: 2026-01-22 00:49:17.492 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:17 compute-0 podman[248389]: 2026-01-22 00:49:17.671415532 +0000 UTC m=+0.049320101 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 22 00:49:18 compute-0 nova_compute[182935]: 2026-01-22 00:49:18.605 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:19 compute-0 nova_compute[182935]: 2026-01-22 00:49:19.452 182939 DEBUG nova.network.neutron [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updating instance_info_cache with network_info: [{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:49:19 compute-0 nova_compute[182935]: 2026-01-22 00:49:19.497 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Releasing lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:49:19 compute-0 nova_compute[182935]: 2026-01-22 00:49:19.498 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 22 00:49:19 compute-0 nova_compute[182935]: 2026-01-22 00:49:19.499 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:22 compute-0 nova_compute[182935]: 2026-01-22 00:49:22.494 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:22 compute-0 podman[248407]: 2026-01-22 00:49:22.703862806 +0000 UTC m=+0.078814699 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Jan 22 00:49:22 compute-0 podman[248408]: 2026-01-22 00:49:22.716047077 +0000 UTC m=+0.085101088 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:49:23 compute-0 nova_compute[182935]: 2026-01-22 00:49:23.608 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:25 compute-0 nova_compute[182935]: 2026-01-22 00:49:25.568 182939 DEBUG oslo_concurrency.lockutils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:49:25 compute-0 nova_compute[182935]: 2026-01-22 00:49:25.569 182939 DEBUG oslo_concurrency.lockutils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:49:25 compute-0 nova_compute[182935]: 2026-01-22 00:49:25.569 182939 INFO nova.compute.manager [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Shelving
Jan 22 00:49:25 compute-0 nova_compute[182935]: 2026-01-22 00:49:25.613 182939 DEBUG nova.virt.libvirt.driver [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 22 00:49:25 compute-0 nova_compute[182935]: 2026-01-22 00:49:25.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:25 compute-0 nova_compute[182935]: 2026-01-22 00:49:25.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:27 compute-0 nova_compute[182935]: 2026-01-22 00:49:27.495 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:27 compute-0 nova_compute[182935]: 2026-01-22 00:49:27.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:27 compute-0 kernel: tapee3eb2da-66 (unregistering): left promiscuous mode
Jan 22 00:49:27 compute-0 NetworkManager[55139]: <info>  [1769042967.8099] device (tapee3eb2da-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 22 00:49:27 compute-0 nova_compute[182935]: 2026-01-22 00:49:27.819 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:27 compute-0 ovn_controller[95047]: 2026-01-22T00:49:27Z|00781|binding|INFO|Releasing lport ee3eb2da-6644-4c49-952b-d4fd939223d9 from this chassis (sb_readonly=0)
Jan 22 00:49:27 compute-0 ovn_controller[95047]: 2026-01-22T00:49:27Z|00782|binding|INFO|Setting lport ee3eb2da-6644-4c49-952b-d4fd939223d9 down in Southbound
Jan 22 00:49:27 compute-0 ovn_controller[95047]: 2026-01-22T00:49:27Z|00783|binding|INFO|Removing iface tapee3eb2da-66 ovn-installed in OVS
Jan 22 00:49:27 compute-0 nova_compute[182935]: 2026-01-22 00:49:27.821 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:27.827 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:46:7b 10.100.0.4'], port_security=['fa:16:3e:6d:46:7b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '07d46432-944a-49b9-9862-65d4e541e750', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d05c3cf062a4f6ebb5083b35d40286e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '58135b34-ec05-462e-8563-87deac605474', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.219'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d85ad74f-8de0-427b-84fe-c5395634422f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>], logical_port=ee3eb2da-6644-4c49-952b-d4fd939223d9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9e44de39a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:49:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:27.829 104408 INFO neutron.agent.ovn.metadata.agent [-] Port ee3eb2da-6644-4c49-952b-d4fd939223d9 in datapath 7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 unbound from our chassis
Jan 22 00:49:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:27.831 104408 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 22 00:49:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:27.832 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c29d02-9891-4f30-a36d-b04029a0a400]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:49:27 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:27.833 104408 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 namespace which is not needed anymore
Jan 22 00:49:27 compute-0 nova_compute[182935]: 2026-01-22 00:49:27.840 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:27 compute-0 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000bc.scope: Deactivated successfully.
Jan 22 00:49:27 compute-0 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000bc.scope: Consumed 13.782s CPU time.
Jan 22 00:49:27 compute-0 systemd-machined[154182]: Machine qemu-94-instance-000000bc terminated.
Jan 22 00:49:27 compute-0 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[248281]: [NOTICE]   (248285) : haproxy version is 2.8.14-c23fe91
Jan 22 00:49:27 compute-0 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[248281]: [NOTICE]   (248285) : path to executable is /usr/sbin/haproxy
Jan 22 00:49:27 compute-0 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[248281]: [WARNING]  (248285) : Exiting Master process...
Jan 22 00:49:27 compute-0 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[248281]: [WARNING]  (248285) : Exiting Master process...
Jan 22 00:49:27 compute-0 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[248281]: [ALERT]    (248285) : Current worker (248287) exited with code 143 (Terminated)
Jan 22 00:49:27 compute-0 neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2[248281]: [WARNING]  (248285) : All workers exited. Exiting... (0)
Jan 22 00:49:27 compute-0 systemd[1]: libpod-c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da.scope: Deactivated successfully.
Jan 22 00:49:27 compute-0 podman[248472]: 2026-01-22 00:49:27.990274269 +0000 UTC m=+0.052987119 container died c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:49:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da-userdata-shm.mount: Deactivated successfully.
Jan 22 00:49:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-e102038a28ca53e3664a5039eb0db938c01931fc96e62108a3d3bad2da53f610-merged.mount: Deactivated successfully.
Jan 22 00:49:28 compute-0 podman[248472]: 2026-01-22 00:49:28.035170444 +0000 UTC m=+0.097883274 container cleanup c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 22 00:49:28 compute-0 systemd[1]: libpod-conmon-c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da.scope: Deactivated successfully.
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.062 182939 DEBUG nova.compute.manager [req-1c7f3ac9-4fa8-4a7b-a4cf-137fb6339812 req-1409df1f-88ba-472b-a6b6-1a3f8dfe6885 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-vif-unplugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.063 182939 DEBUG oslo_concurrency.lockutils [req-1c7f3ac9-4fa8-4a7b-a4cf-137fb6339812 req-1409df1f-88ba-472b-a6b6-1a3f8dfe6885 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.063 182939 DEBUG oslo_concurrency.lockutils [req-1c7f3ac9-4fa8-4a7b-a4cf-137fb6339812 req-1409df1f-88ba-472b-a6b6-1a3f8dfe6885 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.063 182939 DEBUG oslo_concurrency.lockutils [req-1c7f3ac9-4fa8-4a7b-a4cf-137fb6339812 req-1409df1f-88ba-472b-a6b6-1a3f8dfe6885 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.064 182939 DEBUG nova.compute.manager [req-1c7f3ac9-4fa8-4a7b-a4cf-137fb6339812 req-1409df1f-88ba-472b-a6b6-1a3f8dfe6885 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] No waiting events found dispatching network-vif-unplugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.064 182939 WARNING nova.compute.manager [req-1c7f3ac9-4fa8-4a7b-a4cf-137fb6339812 req-1409df1f-88ba-472b-a6b6-1a3f8dfe6885 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received unexpected event network-vif-unplugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 for instance with vm_state active and task_state shelving.
Jan 22 00:49:28 compute-0 podman[248508]: 2026-01-22 00:49:28.105622152 +0000 UTC m=+0.044934907 container remove c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 22 00:49:28 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:28.113 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b33954bd-975e-4696-88fb-088c713a93a9]: (4, ('Thu Jan 22 12:49:27 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 (c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da)\nc59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da\nThu Jan 22 12:49:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 (c59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da)\nc59c2e3128dd8ab4c1be520e7ecc0217539afe678f5cc037f501ee927a79f7da\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:49:28 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:28.115 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b9f91306-24ff-42ee-8136-f6a974b4efb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:49:28 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:28.115 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f04cd1e-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.117 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:28 compute-0 kernel: tap7f04cd1e-f0: left promiscuous mode
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.131 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:28 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:28.133 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[b610baed-f21c-488c-8c4b-f8b0b0a5d574]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:49:28 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:28.149 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[3b127239-9a04-42c3-87e8-8ca349037f9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:49:28 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:28.150 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb1a9c0-355a-4683-a25d-469bb58eda83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:49:28 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:28.166 211917 DEBUG oslo.privsep.daemon [-] privsep: reply[e2fdcf7c-cb98-4558-8c95-f832e6746e31]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 744216, 'reachable_time': 20722, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248539, 'error': None, 'target': 'ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:49:28 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:28.168 104855 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7f04cd1e-fc0c-46c7-9d75-03b818ec99e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 22 00:49:28 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:28.168 104855 DEBUG oslo.privsep.daemon [-] privsep: reply[44969bc7-a565-46a0-ab7b-78f32f3c0d68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 22 00:49:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d7f04cd1e\x2dfc0c\x2d46c7\x2d9d75\x2d03b818ec99e2.mount: Deactivated successfully.
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.610 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.630 182939 INFO nova.virt.libvirt.driver [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Instance shutdown successfully after 3 seconds.
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.635 182939 INFO nova.virt.libvirt.driver [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Instance destroyed successfully.
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.635 182939 DEBUG nova.objects.instance [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'numa_topology' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:28 compute-0 nova_compute[182935]: 2026-01-22 00:49:28.916 182939 INFO nova.virt.libvirt.driver [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Beginning cold snapshot process
Jan 22 00:49:29 compute-0 nova_compute[182935]: 2026-01-22 00:49:29.148 182939 DEBUG nova.privsep.utils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 22 00:49:29 compute-0 nova_compute[182935]: 2026-01-22 00:49:29.149 182939 DEBUG oslo_concurrency.processutils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk /var/lib/nova/instances/snapshots/tmpj44xwad7/eca22511e6f14021be8baa7168d09211 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 00:49:29 compute-0 nova_compute[182935]: 2026-01-22 00:49:29.490 182939 DEBUG oslo_concurrency.processutils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750/disk /var/lib/nova/instances/snapshots/tmpj44xwad7/eca22511e6f14021be8baa7168d09211" returned: 0 in 0.341s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 00:49:29 compute-0 nova_compute[182935]: 2026-01-22 00:49:29.491 182939 INFO nova.virt.libvirt.driver [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Snapshot extracted, beginning image upload
Jan 22 00:49:30 compute-0 nova_compute[182935]: 2026-01-22 00:49:30.150 182939 DEBUG nova.compute.manager [req-dd33cd7b-7b3a-4089-8981-e331e1d21a83 req-29e4b77f-b46e-46f8-b7f1-8f0e36dc85c1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:49:30 compute-0 nova_compute[182935]: 2026-01-22 00:49:30.150 182939 DEBUG oslo_concurrency.lockutils [req-dd33cd7b-7b3a-4089-8981-e331e1d21a83 req-29e4b77f-b46e-46f8-b7f1-8f0e36dc85c1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "07d46432-944a-49b9-9862-65d4e541e750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:49:30 compute-0 nova_compute[182935]: 2026-01-22 00:49:30.151 182939 DEBUG oslo_concurrency.lockutils [req-dd33cd7b-7b3a-4089-8981-e331e1d21a83 req-29e4b77f-b46e-46f8-b7f1-8f0e36dc85c1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:49:30 compute-0 nova_compute[182935]: 2026-01-22 00:49:30.151 182939 DEBUG oslo_concurrency.lockutils [req-dd33cd7b-7b3a-4089-8981-e331e1d21a83 req-29e4b77f-b46e-46f8-b7f1-8f0e36dc85c1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:30 compute-0 nova_compute[182935]: 2026-01-22 00:49:30.151 182939 DEBUG nova.compute.manager [req-dd33cd7b-7b3a-4089-8981-e331e1d21a83 req-29e4b77f-b46e-46f8-b7f1-8f0e36dc85c1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] No waiting events found dispatching network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 22 00:49:30 compute-0 nova_compute[182935]: 2026-01-22 00:49:30.151 182939 WARNING nova.compute.manager [req-dd33cd7b-7b3a-4089-8981-e331e1d21a83 req-29e4b77f-b46e-46f8-b7f1-8f0e36dc85c1 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received unexpected event network-vif-plugged-ee3eb2da-6644-4c49-952b-d4fd939223d9 for instance with vm_state active and task_state shelving_image_uploading.
Jan 22 00:49:31 compute-0 nova_compute[182935]: 2026-01-22 00:49:31.988 182939 INFO nova.virt.libvirt.driver [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Snapshot image upload complete
Jan 22 00:49:31 compute-0 nova_compute[182935]: 2026-01-22 00:49:31.989 182939 DEBUG nova.compute.manager [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:49:32 compute-0 nova_compute[182935]: 2026-01-22 00:49:32.078 182939 INFO nova.compute.manager [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Shelve offloading
Jan 22 00:49:32 compute-0 nova_compute[182935]: 2026-01-22 00:49:32.099 182939 INFO nova.virt.libvirt.driver [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Instance destroyed successfully.
Jan 22 00:49:32 compute-0 nova_compute[182935]: 2026-01-22 00:49:32.099 182939 DEBUG nova.compute.manager [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:49:32 compute-0 nova_compute[182935]: 2026-01-22 00:49:32.102 182939 DEBUG oslo_concurrency.lockutils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:49:32 compute-0 nova_compute[182935]: 2026-01-22 00:49:32.102 182939 DEBUG oslo_concurrency.lockutils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquired lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:49:32 compute-0 nova_compute[182935]: 2026-01-22 00:49:32.103 182939 DEBUG nova.network.neutron [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 22 00:49:32 compute-0 nova_compute[182935]: 2026-01-22 00:49:32.497 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:33 compute-0 nova_compute[182935]: 2026-01-22 00:49:33.467 182939 DEBUG nova.network.neutron [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updating instance_info_cache with network_info: [{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:49:33 compute-0 nova_compute[182935]: 2026-01-22 00:49:33.490 182939 DEBUG oslo_concurrency.lockutils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Releasing lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:49:33 compute-0 nova_compute[182935]: 2026-01-22 00:49:33.613 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:33 compute-0 nova_compute[182935]: 2026-01-22 00:49:33.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.481 182939 INFO nova.virt.libvirt.driver [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Instance destroyed successfully.
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.482 182939 DEBUG nova.objects.instance [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lazy-loading 'resources' on Instance uuid 07d46432-944a-49b9-9862-65d4e541e750 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.497 182939 DEBUG nova.virt.libvirt.vif [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-22T00:48:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-229309001',display_name='tempest-TestShelveInstance-server-229309001',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-229309001',id=188,image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOgA5ZzfKbfeycJFDDkwZB6fgaNy8JupDR87kTs8Udzuy0Hdmm9UofiFEnY5+8X6yn18pBjwI+0V0Npbtx57RV5bhVB+OuvvOnztrIeeQxpfJd0y9DR5TAlaf0wFpgpxfw==',key_name='tempest-TestShelveInstance-1678551224',keypairs=<?>,launch_index=0,launched_at=2026-01-22T00:49:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='9d05c3cf062a4f6ebb5083b35d40286e',ramdisk_id='',reservation_id='r-urkgbm2c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='9cd98f02-a505-4543-a7ad-04e9a377b456',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1694031060',owner_user_name='tempest-TestShelveInstance-1694031060-project-member',shelved_at='2026-01-22T00:49:31.989100',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='01af4101-f702-46bd-aa85-cba557b6a17e'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-22T00:49:29Z,user_data=None,user_id='f96259409b0747b6ac866ebe79dcf160',uuid=07d46432-944a-49b9-9862-65d4e541e750,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.498 182939 DEBUG nova.network.os_vif_util [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converting VIF {"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": "br-int", "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapee3eb2da-66", "ovs_interfaceid": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.500 182939 DEBUG nova.network.os_vif_util [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.500 182939 DEBUG os_vif [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.503 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.503 182939 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee3eb2da-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.506 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.508 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.513 182939 INFO os_vif [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:46:7b,bridge_name='br-int',has_traffic_filtering=True,id=ee3eb2da-6644-4c49-952b-d4fd939223d9,network=Network(7f04cd1e-fc0c-46c7-9d75-03b818ec99e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapee3eb2da-66')
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.514 182939 INFO nova.virt.libvirt.driver [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Deleting instance files /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750_del
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.524 182939 INFO nova.virt.libvirt.driver [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Deletion of /var/lib/nova/instances/07d46432-944a-49b9-9862-65d4e541e750_del complete
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.574 182939 DEBUG nova.compute.manager [req-ac5039e3-def6-4001-b211-350d7d49b4c7 req-d1a2cd6d-f730-44b0-9383-8ab1d67a4769 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Received event network-changed-ee3eb2da-6644-4c49-952b-d4fd939223d9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.575 182939 DEBUG nova.compute.manager [req-ac5039e3-def6-4001-b211-350d7d49b4c7 req-d1a2cd6d-f730-44b0-9383-8ab1d67a4769 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Refreshing instance network info cache due to event network-changed-ee3eb2da-6644-4c49-952b-d4fd939223d9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.576 182939 DEBUG oslo_concurrency.lockutils [req-ac5039e3-def6-4001-b211-350d7d49b4c7 req-d1a2cd6d-f730-44b0-9383-8ab1d67a4769 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquiring lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.576 182939 DEBUG oslo_concurrency.lockutils [req-ac5039e3-def6-4001-b211-350d7d49b4c7 req-d1a2cd6d-f730-44b0-9383-8ab1d67a4769 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Acquired lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.576 182939 DEBUG nova.network.neutron [req-ac5039e3-def6-4001-b211-350d7d49b4c7 req-d1a2cd6d-f730-44b0-9383-8ab1d67a4769 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Refreshing network info cache for port ee3eb2da-6644-4c49-952b-d4fd939223d9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.647 182939 INFO nova.scheduler.client.report [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Deleted allocations for instance 07d46432-944a-49b9-9862-65d4e541e750
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.736 182939 DEBUG oslo_concurrency.lockutils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.736 182939 DEBUG oslo_concurrency.lockutils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.771 182939 DEBUG nova.compute.provider_tree [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.792 182939 DEBUG nova.scheduler.client.report [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.812 182939 DEBUG oslo_concurrency.lockutils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:34 compute-0 nova_compute[182935]: 2026-01-22 00:49:34.887 182939 DEBUG oslo_concurrency.lockutils [None req-d8db90a7-65af-42bd-b212-2f316d18fdc4 f96259409b0747b6ac866ebe79dcf160 9d05c3cf062a4f6ebb5083b35d40286e - - default default] Lock "07d46432-944a-49b9-9862-65d4e541e750" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 9.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:49:35 compute-0 nova_compute[182935]: 2026-01-22 00:49:35.811 182939 DEBUG nova.network.neutron [req-ac5039e3-def6-4001-b211-350d7d49b4c7 req-d1a2cd6d-f730-44b0-9383-8ab1d67a4769 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updated VIF entry in instance network info cache for port ee3eb2da-6644-4c49-952b-d4fd939223d9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 22 00:49:35 compute-0 nova_compute[182935]: 2026-01-22 00:49:35.811 182939 DEBUG nova.network.neutron [req-ac5039e3-def6-4001-b211-350d7d49b4c7 req-d1a2cd6d-f730-44b0-9383-8ab1d67a4769 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Updating instance_info_cache with network_info: [{"id": "ee3eb2da-6644-4c49-952b-d4fd939223d9", "address": "fa:16:3e:6d:46:7b", "network": {"id": "7f04cd1e-fc0c-46c7-9d75-03b818ec99e2", "bridge": null, "label": "tempest-TestShelveInstance-1867443717-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d05c3cf062a4f6ebb5083b35d40286e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapee3eb2da-66", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 22 00:49:35 compute-0 nova_compute[182935]: 2026-01-22 00:49:35.830 182939 DEBUG oslo_concurrency.lockutils [req-ac5039e3-def6-4001-b211-350d7d49b4c7 req-d1a2cd6d-f730-44b0-9383-8ab1d67a4769 206b6fd4585b49148e7c53235ba2435b e28ed94946cf41d7be8d4c802def2956 - - default default] Releasing lock "refresh_cache-07d46432-944a-49b9-9862-65d4e541e750" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 22 00:49:37 compute-0 nova_compute[182935]: 2026-01-22 00:49:37.533 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:39 compute-0 nova_compute[182935]: 2026-01-22 00:49:39.507 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:40.047 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:49:40 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:40.047 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:49:40 compute-0 nova_compute[182935]: 2026-01-22 00:49:40.080 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:40 compute-0 podman[248551]: 2026-01-22 00:49:40.686477236 +0000 UTC m=+0.058335548 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:49:40 compute-0 podman[248553]: 2026-01-22 00:49:40.709723383 +0000 UTC m=+0.075667193 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:49:40 compute-0 podman[248552]: 2026-01-22 00:49:40.713640677 +0000 UTC m=+0.083042930 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 00:49:42 compute-0 nova_compute[182935]: 2026-01-22 00:49:42.593 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:43 compute-0 nova_compute[182935]: 2026-01-22 00:49:43.087 182939 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769042968.085189, 07d46432-944a-49b9-9862-65d4e541e750 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 22 00:49:43 compute-0 nova_compute[182935]: 2026-01-22 00:49:43.087 182939 INFO nova.compute.manager [-] [instance: 07d46432-944a-49b9-9862-65d4e541e750] VM Stopped (Lifecycle Event)
Jan 22 00:49:43 compute-0 nova_compute[182935]: 2026-01-22 00:49:43.111 182939 DEBUG nova.compute.manager [None req-1ecb7ea8-5c58-4367-b72c-56c0b7c159b0 - - - - - -] [instance: 07d46432-944a-49b9-9862-65d4e541e750] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 22 00:49:44 compute-0 nova_compute[182935]: 2026-01-22 00:49:44.511 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:45 compute-0 sshd-session[248623]: Invalid user developer from 188.166.69.60 port 37180
Jan 22 00:49:45 compute-0 sshd-session[248623]: Connection closed by invalid user developer 188.166.69.60 port 37180 [preauth]
Jan 22 00:49:47 compute-0 nova_compute[182935]: 2026-01-22 00:49:47.640 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:48 compute-0 podman[248625]: 2026-01-22 00:49:48.671488329 +0000 UTC m=+0.046542226 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 00:49:49 compute-0 nova_compute[182935]: 2026-01-22 00:49:49.514 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:50 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:49:50.049 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:49:52 compute-0 nova_compute[182935]: 2026-01-22 00:49:52.687 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:53 compute-0 podman[248645]: 2026-01-22 00:49:53.684626682 +0000 UTC m=+0.059274861 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 22 00:49:53 compute-0 podman[248644]: 2026-01-22 00:49:53.712198862 +0000 UTC m=+0.090441797 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, version=9.6, architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 00:49:54 compute-0 nova_compute[182935]: 2026-01-22 00:49:54.517 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:57 compute-0 nova_compute[182935]: 2026-01-22 00:49:57.727 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:49:59 compute-0 nova_compute[182935]: 2026-01-22 00:49:59.521 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:02 compute-0 nova_compute[182935]: 2026-01-22 00:50:02.729 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:50:03.250 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:50:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:50:03.250 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:50:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:50:03.250 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:50:04 compute-0 ovn_controller[95047]: 2026-01-22T00:50:04Z|00784|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Jan 22 00:50:04 compute-0 nova_compute[182935]: 2026-01-22 00:50:04.525 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:07 compute-0 nova_compute[182935]: 2026-01-22 00:50:07.778 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:09 compute-0 nova_compute[182935]: 2026-01-22 00:50:09.528 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:11 compute-0 podman[248685]: 2026-01-22 00:50:11.685892998 +0000 UTC m=+0.062068817 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:50:11 compute-0 podman[248687]: 2026-01-22 00:50:11.695934918 +0000 UTC m=+0.064986116 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:50:11 compute-0 podman[248686]: 2026-01-22 00:50:11.717361992 +0000 UTC m=+0.089337480 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:50:12 compute-0 nova_compute[182935]: 2026-01-22 00:50:12.782 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:12 compute-0 nova_compute[182935]: 2026-01-22 00:50:12.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:12 compute-0 nova_compute[182935]: 2026-01-22 00:50:12.829 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:50:12 compute-0 nova_compute[182935]: 2026-01-22 00:50:12.829 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:50:12 compute-0 nova_compute[182935]: 2026-01-22 00:50:12.830 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:50:12 compute-0 nova_compute[182935]: 2026-01-22 00:50:12.830 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:50:12 compute-0 nova_compute[182935]: 2026-01-22 00:50:12.975 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:50:12 compute-0 nova_compute[182935]: 2026-01-22 00:50:12.977 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5711MB free_disk=73.12207412719727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:50:12 compute-0 nova_compute[182935]: 2026-01-22 00:50:12.977 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:50:12 compute-0 nova_compute[182935]: 2026-01-22 00:50:12.977 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:50:13 compute-0 nova_compute[182935]: 2026-01-22 00:50:13.032 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:50:13 compute-0 nova_compute[182935]: 2026-01-22 00:50:13.033 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:50:13 compute-0 nova_compute[182935]: 2026-01-22 00:50:13.056 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:50:13 compute-0 nova_compute[182935]: 2026-01-22 00:50:13.069 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:50:13 compute-0 nova_compute[182935]: 2026-01-22 00:50:13.087 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:50:13 compute-0 nova_compute[182935]: 2026-01-22 00:50:13.087 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:50:14 compute-0 nova_compute[182935]: 2026-01-22 00:50:14.530 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:14 compute-0 nova_compute[182935]: 2026-01-22 00:50:14.643 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:14 compute-0 nova_compute[182935]: 2026-01-22 00:50:14.748 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:16 compute-0 nova_compute[182935]: 2026-01-22 00:50:16.088 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:16 compute-0 nova_compute[182935]: 2026-01-22 00:50:16.088 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:50:16 compute-0 nova_compute[182935]: 2026-01-22 00:50:16.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:17 compute-0 nova_compute[182935]: 2026-01-22 00:50:17.785 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:18 compute-0 nova_compute[182935]: 2026-01-22 00:50:18.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:18 compute-0 nova_compute[182935]: 2026-01-22 00:50:18.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:50:18 compute-0 nova_compute[182935]: 2026-01-22 00:50:18.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:50:18 compute-0 nova_compute[182935]: 2026-01-22 00:50:18.870 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:50:19 compute-0 nova_compute[182935]: 2026-01-22 00:50:19.533 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:19 compute-0 podman[248759]: 2026-01-22 00:50:19.67585718 +0000 UTC m=+0.052023356 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 22 00:50:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:50:21.243 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 00:50:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:50:21.243 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 00:50:21 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:50:21.244 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 00:50:21 compute-0 nova_compute[182935]: 2026-01-22 00:50:21.283 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:22 compute-0 nova_compute[182935]: 2026-01-22 00:50:22.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:50:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:50:24 compute-0 nova_compute[182935]: 2026-01-22 00:50:24.536 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:24 compute-0 podman[248778]: 2026-01-22 00:50:24.705868996 +0000 UTC m=+0.066364271 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public)
Jan 22 00:50:24 compute-0 podman[248779]: 2026-01-22 00:50:24.711357497 +0000 UTC m=+0.079346131 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 00:50:25 compute-0 nova_compute[182935]: 2026-01-22 00:50:25.864 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:27 compute-0 sshd-session[248819]: Invalid user developer from 188.166.69.60 port 33458
Jan 22 00:50:27 compute-0 sshd-session[248819]: Connection closed by invalid user developer 188.166.69.60 port 33458 [preauth]
Jan 22 00:50:27 compute-0 nova_compute[182935]: 2026-01-22 00:50:27.787 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:27 compute-0 nova_compute[182935]: 2026-01-22 00:50:27.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:28 compute-0 nova_compute[182935]: 2026-01-22 00:50:28.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:28 compute-0 nova_compute[182935]: 2026-01-22 00:50:28.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:29 compute-0 nova_compute[182935]: 2026-01-22 00:50:29.539 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:31 compute-0 nova_compute[182935]: 2026-01-22 00:50:31.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:32 compute-0 nova_compute[182935]: 2026-01-22 00:50:32.788 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:33 compute-0 nova_compute[182935]: 2026-01-22 00:50:33.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:50:34 compute-0 nova_compute[182935]: 2026-01-22 00:50:34.542 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:37 compute-0 nova_compute[182935]: 2026-01-22 00:50:37.820 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:39 compute-0 nova_compute[182935]: 2026-01-22 00:50:39.545 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:42 compute-0 podman[248822]: 2026-01-22 00:50:42.684743277 +0000 UTC m=+0.053886401 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:50:42 compute-0 podman[248824]: 2026-01-22 00:50:42.68486049 +0000 UTC m=+0.051343500 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:50:42 compute-0 podman[248823]: 2026-01-22 00:50:42.708348953 +0000 UTC m=+0.077798655 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 22 00:50:42 compute-0 nova_compute[182935]: 2026-01-22 00:50:42.821 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:44 compute-0 nova_compute[182935]: 2026-01-22 00:50:44.549 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:47 compute-0 nova_compute[182935]: 2026-01-22 00:50:47.871 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:49 compute-0 nova_compute[182935]: 2026-01-22 00:50:49.552 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:50 compute-0 ovn_controller[95047]: 2026-01-22T00:50:50Z|00785|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 22 00:50:50 compute-0 podman[248892]: 2026-01-22 00:50:50.670125488 +0000 UTC m=+0.047743704 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:50:52 compute-0 nova_compute[182935]: 2026-01-22 00:50:52.917 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:54 compute-0 nova_compute[182935]: 2026-01-22 00:50:54.555 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:55 compute-0 podman[248911]: 2026-01-22 00:50:55.691309224 +0000 UTC m=+0.060864179 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 00:50:55 compute-0 podman[248912]: 2026-01-22 00:50:55.698343623 +0000 UTC m=+0.065196153 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 22 00:50:57 compute-0 nova_compute[182935]: 2026-01-22 00:50:57.972 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:50:59 compute-0 nova_compute[182935]: 2026-01-22 00:50:59.558 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:03 compute-0 nova_compute[182935]: 2026-01-22 00:51:03.028 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:51:03.251 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:51:03.251 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:51:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:51:03.251 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:51:04 compute-0 nova_compute[182935]: 2026-01-22 00:51:04.561 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:08 compute-0 nova_compute[182935]: 2026-01-22 00:51:08.029 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:09 compute-0 nova_compute[182935]: 2026-01-22 00:51:09.565 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:10 compute-0 sshd-session[248953]: Invalid user developer from 188.166.69.60 port 34806
Jan 22 00:51:10 compute-0 sshd-session[248953]: Connection closed by invalid user developer 188.166.69.60 port 34806 [preauth]
Jan 22 00:51:12 compute-0 nova_compute[182935]: 2026-01-22 00:51:12.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:12 compute-0 nova_compute[182935]: 2026-01-22 00:51:12.829 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:51:12 compute-0 nova_compute[182935]: 2026-01-22 00:51:12.830 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:51:12 compute-0 nova_compute[182935]: 2026-01-22 00:51:12.830 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:51:12 compute-0 nova_compute[182935]: 2026-01-22 00:51:12.830 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:51:12 compute-0 nova_compute[182935]: 2026-01-22 00:51:12.972 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:51:12 compute-0 nova_compute[182935]: 2026-01-22 00:51:12.973 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5714MB free_disk=73.11816787719727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:51:12 compute-0 nova_compute[182935]: 2026-01-22 00:51:12.973 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:51:12 compute-0 nova_compute[182935]: 2026-01-22 00:51:12.973 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.068 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.161 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.161 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.245 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.338 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.338 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.354 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.373 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.397 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.427 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.428 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:51:13 compute-0 nova_compute[182935]: 2026-01-22 00:51:13.429 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:51:13 compute-0 podman[248955]: 2026-01-22 00:51:13.700606574 +0000 UTC m=+0.062638951 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:51:13 compute-0 podman[248957]: 2026-01-22 00:51:13.721908864 +0000 UTC m=+0.085709263 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:51:13 compute-0 podman[248956]: 2026-01-22 00:51:13.737066407 +0000 UTC m=+0.101089331 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 22 00:51:14 compute-0 nova_compute[182935]: 2026-01-22 00:51:14.567 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:17 compute-0 nova_compute[182935]: 2026-01-22 00:51:17.429 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:17 compute-0 nova_compute[182935]: 2026-01-22 00:51:17.429 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:51:18 compute-0 nova_compute[182935]: 2026-01-22 00:51:18.124 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:18 compute-0 nova_compute[182935]: 2026-01-22 00:51:18.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:18 compute-0 nova_compute[182935]: 2026-01-22 00:51:18.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:51:18 compute-0 nova_compute[182935]: 2026-01-22 00:51:18.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:51:18 compute-0 nova_compute[182935]: 2026-01-22 00:51:18.826 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:51:18 compute-0 nova_compute[182935]: 2026-01-22 00:51:18.827 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:19 compute-0 nova_compute[182935]: 2026-01-22 00:51:19.570 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:21 compute-0 podman[249027]: 2026-01-22 00:51:21.701726503 +0000 UTC m=+0.076959733 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 22 00:51:23 compute-0 nova_compute[182935]: 2026-01-22 00:51:23.165 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:24 compute-0 nova_compute[182935]: 2026-01-22 00:51:24.573 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:24 compute-0 sshd-session[249046]: Accepted publickey for zuul from 192.168.122.10 port 43262 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 22 00:51:24 compute-0 systemd-logind[784]: New session 63 of user zuul.
Jan 22 00:51:24 compute-0 systemd[1]: Started Session 63 of User zuul.
Jan 22 00:51:24 compute-0 sshd-session[249046]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 00:51:25 compute-0 sudo[249050]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 22 00:51:25 compute-0 sudo[249050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 00:51:26 compute-0 podman[249085]: 2026-01-22 00:51:26.071585314 +0000 UTC m=+0.064450194 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 00:51:26 compute-0 podman[249084]: 2026-01-22 00:51:26.082505204 +0000 UTC m=+0.087489475 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Jan 22 00:51:26 compute-0 nova_compute[182935]: 2026-01-22 00:51:26.821 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:28 compute-0 nova_compute[182935]: 2026-01-22 00:51:28.200 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:28 compute-0 nova_compute[182935]: 2026-01-22 00:51:28.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:29 compute-0 nova_compute[182935]: 2026-01-22 00:51:29.575 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:29 compute-0 ovs-vsctl[249259]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 22 00:51:29 compute-0 nova_compute[182935]: 2026-01-22 00:51:29.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:29 compute-0 nova_compute[182935]: 2026-01-22 00:51:29.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:30 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 249074 (sos)
Jan 22 00:51:30 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 22 00:51:30 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 22 00:51:30 compute-0 virtqemud[182477]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 22 00:51:30 compute-0 virtqemud[182477]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 22 00:51:30 compute-0 virtqemud[182477]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 00:51:31 compute-0 crontab[249665]: (root) LIST (root)
Jan 22 00:51:33 compute-0 nova_compute[182935]: 2026-01-22 00:51:33.202 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:33 compute-0 systemd[1]: Starting Hostname Service...
Jan 22 00:51:33 compute-0 systemd[1]: Started Hostname Service.
Jan 22 00:51:34 compute-0 nova_compute[182935]: 2026-01-22 00:51:34.580 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:34 compute-0 nova_compute[182935]: 2026-01-22 00:51:34.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:51:38 compute-0 nova_compute[182935]: 2026-01-22 00:51:38.204 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:39 compute-0 nova_compute[182935]: 2026-01-22 00:51:39.584 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:41 compute-0 ovs-appctl[251045]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 00:51:41 compute-0 ovs-appctl[251052]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 00:51:41 compute-0 ovs-appctl[251058]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 22 00:51:43 compute-0 nova_compute[182935]: 2026-01-22 00:51:43.208 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:44 compute-0 nova_compute[182935]: 2026-01-22 00:51:44.588 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:44 compute-0 podman[251924]: 2026-01-22 00:51:44.719598734 +0000 UTC m=+0.087896965 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:51:44 compute-0 podman[251927]: 2026-01-22 00:51:44.719903022 +0000 UTC m=+0.087811374 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:51:44 compute-0 podman[251926]: 2026-01-22 00:51:44.750617576 +0000 UTC m=+0.119242015 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:51:48 compute-0 nova_compute[182935]: 2026-01-22 00:51:48.210 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:48 compute-0 virtqemud[182477]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 00:51:49 compute-0 nova_compute[182935]: 2026-01-22 00:51:49.591 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:50 compute-0 systemd[1]: Starting Time & Date Service...
Jan 22 00:51:50 compute-0 systemd[1]: Started Time & Date Service.
Jan 22 00:51:51 compute-0 sshd-session[252542]: Invalid user developer from 188.166.69.60 port 52348
Jan 22 00:51:51 compute-0 podman[252544]: 2026-01-22 00:51:51.885553397 +0000 UTC m=+0.054357341 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 22 00:51:51 compute-0 sshd-session[252542]: Connection closed by invalid user developer 188.166.69.60 port 52348 [preauth]
Jan 22 00:51:53 compute-0 nova_compute[182935]: 2026-01-22 00:51:53.258 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:54 compute-0 nova_compute[182935]: 2026-01-22 00:51:54.646 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:56 compute-0 podman[252566]: 2026-01-22 00:51:56.696907729 +0000 UTC m=+0.066275819 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, version=9.6, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 00:51:56 compute-0 podman[252567]: 2026-01-22 00:51:56.696905999 +0000 UTC m=+0.064271920 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 00:51:58 compute-0 nova_compute[182935]: 2026-01-22 00:51:58.262 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:51:59 compute-0 nova_compute[182935]: 2026-01-22 00:51:59.649 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:52:03.252 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:52:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:52:03.252 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:52:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:52:03.252 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:52:03 compute-0 nova_compute[182935]: 2026-01-22 00:52:03.300 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:04 compute-0 nova_compute[182935]: 2026-01-22 00:52:04.652 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:08 compute-0 nova_compute[182935]: 2026-01-22 00:52:08.301 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:09 compute-0 nova_compute[182935]: 2026-01-22 00:52:09.655 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:12 compute-0 nova_compute[182935]: 2026-01-22 00:52:12.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:12 compute-0 nova_compute[182935]: 2026-01-22 00:52:12.839 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:52:12 compute-0 nova_compute[182935]: 2026-01-22 00:52:12.839 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:52:12 compute-0 nova_compute[182935]: 2026-01-22 00:52:12.840 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:52:12 compute-0 nova_compute[182935]: 2026-01-22 00:52:12.840 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:52:12 compute-0 nova_compute[182935]: 2026-01-22 00:52:12.966 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:52:12 compute-0 nova_compute[182935]: 2026-01-22 00:52:12.967 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5504MB free_disk=72.76094436645508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:52:12 compute-0 nova_compute[182935]: 2026-01-22 00:52:12.967 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:52:12 compute-0 nova_compute[182935]: 2026-01-22 00:52:12.968 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:52:13 compute-0 nova_compute[182935]: 2026-01-22 00:52:13.036 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:52:13 compute-0 nova_compute[182935]: 2026-01-22 00:52:13.036 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:52:13 compute-0 nova_compute[182935]: 2026-01-22 00:52:13.067 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:52:13 compute-0 nova_compute[182935]: 2026-01-22 00:52:13.085 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:52:13 compute-0 nova_compute[182935]: 2026-01-22 00:52:13.106 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:52:13 compute-0 nova_compute[182935]: 2026-01-22 00:52:13.107 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:52:13 compute-0 nova_compute[182935]: 2026-01-22 00:52:13.302 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:14 compute-0 nova_compute[182935]: 2026-01-22 00:52:14.699 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:15 compute-0 podman[252608]: 2026-01-22 00:52:15.679840938 +0000 UTC m=+0.046891455 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:52:15 compute-0 podman[252606]: 2026-01-22 00:52:15.68456649 +0000 UTC m=+0.055448829 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 00:52:15 compute-0 podman[252607]: 2026-01-22 00:52:15.708853421 +0000 UTC m=+0.078136882 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 00:52:17 compute-0 sudo[249050]: pam_unix(sudo:session): session closed for user root
Jan 22 00:52:17 compute-0 sshd-session[249049]: Received disconnect from 192.168.122.10 port 43262:11: disconnected by user
Jan 22 00:52:17 compute-0 sshd-session[249049]: Disconnected from user zuul 192.168.122.10 port 43262
Jan 22 00:52:17 compute-0 sshd-session[249046]: pam_unix(sshd:session): session closed for user zuul
Jan 22 00:52:17 compute-0 systemd[1]: session-63.scope: Deactivated successfully.
Jan 22 00:52:17 compute-0 systemd[1]: session-63.scope: Consumed 1min 23.850s CPU time, 788.1M memory peak, read 341.6M from disk, written 21.9M to disk.
Jan 22 00:52:17 compute-0 systemd-logind[784]: Session 63 logged out. Waiting for processes to exit.
Jan 22 00:52:17 compute-0 systemd-logind[784]: Removed session 63.
Jan 22 00:52:17 compute-0 sshd-session[252676]: Accepted publickey for zuul from 192.168.122.10 port 47572 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 22 00:52:17 compute-0 systemd-logind[784]: New session 64 of user zuul.
Jan 22 00:52:17 compute-0 systemd[1]: Started Session 64 of User zuul.
Jan 22 00:52:17 compute-0 sshd-session[252676]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 00:52:17 compute-0 sudo[252680]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2026-01-22-hxgkcrh.tar.xz
Jan 22 00:52:17 compute-0 sudo[252680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 00:52:17 compute-0 sudo[252680]: pam_unix(sudo:session): session closed for user root
Jan 22 00:52:17 compute-0 sshd-session[252679]: Received disconnect from 192.168.122.10 port 47572:11: disconnected by user
Jan 22 00:52:17 compute-0 sshd-session[252679]: Disconnected from user zuul 192.168.122.10 port 47572
Jan 22 00:52:17 compute-0 sshd-session[252676]: pam_unix(sshd:session): session closed for user zuul
Jan 22 00:52:17 compute-0 systemd[1]: session-64.scope: Deactivated successfully.
Jan 22 00:52:17 compute-0 systemd-logind[784]: Session 64 logged out. Waiting for processes to exit.
Jan 22 00:52:17 compute-0 systemd-logind[784]: Removed session 64.
Jan 22 00:52:17 compute-0 sshd-session[252705]: Accepted publickey for zuul from 192.168.122.10 port 47586 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 22 00:52:17 compute-0 systemd-logind[784]: New session 65 of user zuul.
Jan 22 00:52:17 compute-0 systemd[1]: Started Session 65 of User zuul.
Jan 22 00:52:17 compute-0 sshd-session[252705]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 00:52:17 compute-0 sudo[252709]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Jan 22 00:52:17 compute-0 sudo[252709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 00:52:17 compute-0 sudo[252709]: pam_unix(sudo:session): session closed for user root
Jan 22 00:52:17 compute-0 sshd-session[252708]: Received disconnect from 192.168.122.10 port 47586:11: disconnected by user
Jan 22 00:52:17 compute-0 sshd-session[252708]: Disconnected from user zuul 192.168.122.10 port 47586
Jan 22 00:52:17 compute-0 sshd-session[252705]: pam_unix(sshd:session): session closed for user zuul
Jan 22 00:52:17 compute-0 systemd[1]: session-65.scope: Deactivated successfully.
Jan 22 00:52:17 compute-0 systemd-logind[784]: Session 65 logged out. Waiting for processes to exit.
Jan 22 00:52:17 compute-0 systemd-logind[784]: Removed session 65.
Jan 22 00:52:18 compute-0 nova_compute[182935]: 2026-01-22 00:52:18.108 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:18 compute-0 nova_compute[182935]: 2026-01-22 00:52:18.108 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:52:18 compute-0 nova_compute[182935]: 2026-01-22 00:52:18.303 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:19 compute-0 nova_compute[182935]: 2026-01-22 00:52:19.703 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:19 compute-0 nova_compute[182935]: 2026-01-22 00:52:19.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:20 compute-0 nova_compute[182935]: 2026-01-22 00:52:20.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:20 compute-0 nova_compute[182935]: 2026-01-22 00:52:20.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:52:20 compute-0 nova_compute[182935]: 2026-01-22 00:52:20.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:52:20 compute-0 nova_compute[182935]: 2026-01-22 00:52:20.817 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:52:21 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 22 00:52:21 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 22 00:52:22 compute-0 podman[252738]: 2026-01-22 00:52:22.700449861 +0000 UTC m=+0.063366718 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:52:23 compute-0 nova_compute[182935]: 2026-01-22 00:52:23.304 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:52:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:52:24 compute-0 nova_compute[182935]: 2026-01-22 00:52:24.748 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:26 compute-0 nova_compute[182935]: 2026-01-22 00:52:26.812 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:27 compute-0 podman[252758]: 2026-01-22 00:52:27.724672158 +0000 UTC m=+0.086753198 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., config_id=openstack_network_exporter)
Jan 22 00:52:27 compute-0 podman[252759]: 2026-01-22 00:52:27.729510474 +0000 UTC m=+0.086229806 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:52:28 compute-0 nova_compute[182935]: 2026-01-22 00:52:28.307 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:28 compute-0 nova_compute[182935]: 2026-01-22 00:52:28.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:29 compute-0 nova_compute[182935]: 2026-01-22 00:52:29.751 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:29 compute-0 nova_compute[182935]: 2026-01-22 00:52:29.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:31 compute-0 nova_compute[182935]: 2026-01-22 00:52:31.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:33 compute-0 nova_compute[182935]: 2026-01-22 00:52:33.309 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:34 compute-0 sshd-session[252796]: Invalid user developer from 188.166.69.60 port 39818
Jan 22 00:52:34 compute-0 sshd-session[252796]: Connection closed by invalid user developer 188.166.69.60 port 39818 [preauth]
Jan 22 00:52:34 compute-0 nova_compute[182935]: 2026-01-22 00:52:34.754 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:34 compute-0 nova_compute[182935]: 2026-01-22 00:52:34.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:35 compute-0 nova_compute[182935]: 2026-01-22 00:52:35.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:52:38 compute-0 nova_compute[182935]: 2026-01-22 00:52:38.355 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:39 compute-0 nova_compute[182935]: 2026-01-22 00:52:39.757 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:43 compute-0 nova_compute[182935]: 2026-01-22 00:52:43.357 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:44 compute-0 nova_compute[182935]: 2026-01-22 00:52:44.760 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:46 compute-0 podman[252798]: 2026-01-22 00:52:46.705672316 +0000 UTC m=+0.074477165 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:52:46 compute-0 podman[252799]: 2026-01-22 00:52:46.741127514 +0000 UTC m=+0.111631884 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:52:46 compute-0 podman[252800]: 2026-01-22 00:52:46.741838751 +0000 UTC m=+0.100068107 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:52:48 compute-0 nova_compute[182935]: 2026-01-22 00:52:48.398 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:49 compute-0 nova_compute[182935]: 2026-01-22 00:52:49.806 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:53 compute-0 nova_compute[182935]: 2026-01-22 00:52:53.400 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:53 compute-0 podman[252870]: 2026-01-22 00:52:53.678565867 +0000 UTC m=+0.053286427 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:52:54 compute-0 nova_compute[182935]: 2026-01-22 00:52:54.809 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:58 compute-0 nova_compute[182935]: 2026-01-22 00:52:58.401 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:52:58 compute-0 podman[252891]: 2026-01-22 00:52:58.740830854 +0000 UTC m=+0.096572043 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:52:58 compute-0 podman[252890]: 2026-01-22 00:52:58.741788677 +0000 UTC m=+0.097413523 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, vendor=Red Hat, Inc.)
Jan 22 00:52:59 compute-0 nova_compute[182935]: 2026-01-22 00:52:59.813 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:53:03.253 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:53:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:53:03.253 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:53:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:53:03.253 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:53:03 compute-0 nova_compute[182935]: 2026-01-22 00:53:03.404 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:04 compute-0 nova_compute[182935]: 2026-01-22 00:53:04.817 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:08 compute-0 nova_compute[182935]: 2026-01-22 00:53:08.405 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:09 compute-0 nova_compute[182935]: 2026-01-22 00:53:09.820 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:13 compute-0 nova_compute[182935]: 2026-01-22 00:53:13.406 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:13 compute-0 nova_compute[182935]: 2026-01-22 00:53:13.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:13 compute-0 nova_compute[182935]: 2026-01-22 00:53:13.827 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:53:13 compute-0 nova_compute[182935]: 2026-01-22 00:53:13.828 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:53:13 compute-0 nova_compute[182935]: 2026-01-22 00:53:13.828 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:53:13 compute-0 nova_compute[182935]: 2026-01-22 00:53:13.828 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:53:13 compute-0 nova_compute[182935]: 2026-01-22 00:53:13.974 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:53:13 compute-0 nova_compute[182935]: 2026-01-22 00:53:13.975 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5683MB free_disk=73.11842727661133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:53:13 compute-0 nova_compute[182935]: 2026-01-22 00:53:13.976 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:53:13 compute-0 nova_compute[182935]: 2026-01-22 00:53:13.976 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:53:14 compute-0 nova_compute[182935]: 2026-01-22 00:53:14.062 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:53:14 compute-0 nova_compute[182935]: 2026-01-22 00:53:14.062 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:53:14 compute-0 nova_compute[182935]: 2026-01-22 00:53:14.088 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:53:14 compute-0 nova_compute[182935]: 2026-01-22 00:53:14.104 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:53:14 compute-0 nova_compute[182935]: 2026-01-22 00:53:14.124 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:53:14 compute-0 nova_compute[182935]: 2026-01-22 00:53:14.125 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:53:14 compute-0 nova_compute[182935]: 2026-01-22 00:53:14.823 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:15 compute-0 sshd-session[252930]: Invalid user developer from 188.166.69.60 port 55364
Jan 22 00:53:15 compute-0 sshd-session[252930]: Connection closed by invalid user developer 188.166.69.60 port 55364 [preauth]
Jan 22 00:53:17 compute-0 podman[252932]: 2026-01-22 00:53:17.68390524 +0000 UTC m=+0.053055722 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:53:17 compute-0 podman[252934]: 2026-01-22 00:53:17.691269776 +0000 UTC m=+0.052212191 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:53:17 compute-0 podman[252933]: 2026-01-22 00:53:17.727900763 +0000 UTC m=+0.085742154 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:53:18 compute-0 nova_compute[182935]: 2026-01-22 00:53:18.408 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:19 compute-0 nova_compute[182935]: 2026-01-22 00:53:19.126 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:19 compute-0 nova_compute[182935]: 2026-01-22 00:53:19.126 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:53:19 compute-0 nova_compute[182935]: 2026-01-22 00:53:19.826 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:20 compute-0 nova_compute[182935]: 2026-01-22 00:53:20.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:22 compute-0 nova_compute[182935]: 2026-01-22 00:53:22.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:22 compute-0 nova_compute[182935]: 2026-01-22 00:53:22.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:53:22 compute-0 nova_compute[182935]: 2026-01-22 00:53:22.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:53:22 compute-0 nova_compute[182935]: 2026-01-22 00:53:22.961 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:53:23 compute-0 nova_compute[182935]: 2026-01-22 00:53:23.410 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:24 compute-0 podman[253002]: 2026-01-22 00:53:24.687590109 +0000 UTC m=+0.058455241 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 22 00:53:24 compute-0 nova_compute[182935]: 2026-01-22 00:53:24.830 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:28 compute-0 nova_compute[182935]: 2026-01-22 00:53:28.417 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:28 compute-0 nova_compute[182935]: 2026-01-22 00:53:28.955 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:29 compute-0 podman[253022]: 2026-01-22 00:53:29.676705243 +0000 UTC m=+0.053620195 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.6, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350)
Jan 22 00:53:29 compute-0 podman[253023]: 2026-01-22 00:53:29.736851174 +0000 UTC m=+0.103991252 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:53:29 compute-0 nova_compute[182935]: 2026-01-22 00:53:29.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:29 compute-0 nova_compute[182935]: 2026-01-22 00:53:29.832 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:31 compute-0 nova_compute[182935]: 2026-01-22 00:53:31.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:33 compute-0 nova_compute[182935]: 2026-01-22 00:53:33.419 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:33 compute-0 nova_compute[182935]: 2026-01-22 00:53:33.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:34 compute-0 nova_compute[182935]: 2026-01-22 00:53:34.835 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:35 compute-0 sshd-session[253063]: Received disconnect from 45.148.10.151 port 29728:11:  [preauth]
Jan 22 00:53:35 compute-0 sshd-session[253063]: Disconnected from authenticating user root 45.148.10.151 port 29728 [preauth]
Jan 22 00:53:35 compute-0 nova_compute[182935]: 2026-01-22 00:53:35.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:36 compute-0 nova_compute[182935]: 2026-01-22 00:53:36.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:38 compute-0 nova_compute[182935]: 2026-01-22 00:53:38.420 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:39 compute-0 nova_compute[182935]: 2026-01-22 00:53:39.863 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:43 compute-0 nova_compute[182935]: 2026-01-22 00:53:43.471 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:44 compute-0 nova_compute[182935]: 2026-01-22 00:53:44.864 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:48 compute-0 nova_compute[182935]: 2026-01-22 00:53:48.475 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:48 compute-0 podman[253067]: 2026-01-22 00:53:48.700171925 +0000 UTC m=+0.058340688 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:53:48 compute-0 podman[253065]: 2026-01-22 00:53:48.719744143 +0000 UTC m=+0.082460865 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 00:53:48 compute-0 podman[253066]: 2026-01-22 00:53:48.766839601 +0000 UTC m=+0.123487838 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 00:53:49 compute-0 nova_compute[182935]: 2026-01-22 00:53:49.806 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:53:49 compute-0 nova_compute[182935]: 2026-01-22 00:53:49.806 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:53:49 compute-0 nova_compute[182935]: 2026-01-22 00:53:49.867 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:53 compute-0 nova_compute[182935]: 2026-01-22 00:53:53.476 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:54 compute-0 nova_compute[182935]: 2026-01-22 00:53:54.870 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:55 compute-0 podman[253139]: 2026-01-22 00:53:55.693986067 +0000 UTC m=+0.065577641 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 22 00:53:58 compute-0 nova_compute[182935]: 2026-01-22 00:53:58.479 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:53:58 compute-0 sshd-session[253159]: Invalid user developer from 188.166.69.60 port 50260
Jan 22 00:53:58 compute-0 sshd-session[253159]: Connection closed by invalid user developer 188.166.69.60 port 50260 [preauth]
Jan 22 00:53:59 compute-0 nova_compute[182935]: 2026-01-22 00:53:59.872 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:00 compute-0 podman[253161]: 2026-01-22 00:54:00.717726173 +0000 UTC m=+0.077488496 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64)
Jan 22 00:54:00 compute-0 podman[253162]: 2026-01-22 00:54:00.717856046 +0000 UTC m=+0.069655449 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 22 00:54:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:54:03.253 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:54:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:54:03.254 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:54:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:54:03.254 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:54:03 compute-0 nova_compute[182935]: 2026-01-22 00:54:03.480 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:03 compute-0 nova_compute[182935]: 2026-01-22 00:54:03.827 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:03 compute-0 nova_compute[182935]: 2026-01-22 00:54:03.827 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:54:03 compute-0 nova_compute[182935]: 2026-01-22 00:54:03.872 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:54:04 compute-0 nova_compute[182935]: 2026-01-22 00:54:04.875 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:08 compute-0 nova_compute[182935]: 2026-01-22 00:54:08.482 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:09 compute-0 nova_compute[182935]: 2026-01-22 00:54:09.920 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:13 compute-0 nova_compute[182935]: 2026-01-22 00:54:13.513 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:14 compute-0 nova_compute[182935]: 2026-01-22 00:54:14.923 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:15 compute-0 nova_compute[182935]: 2026-01-22 00:54:15.839 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:15 compute-0 nova_compute[182935]: 2026-01-22 00:54:15.876 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:54:15 compute-0 nova_compute[182935]: 2026-01-22 00:54:15.876 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:54:15 compute-0 nova_compute[182935]: 2026-01-22 00:54:15.877 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:54:15 compute-0 nova_compute[182935]: 2026-01-22 00:54:15.877 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:54:16 compute-0 nova_compute[182935]: 2026-01-22 00:54:16.066 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:54:16 compute-0 nova_compute[182935]: 2026-01-22 00:54:16.067 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5690MB free_disk=73.11836624145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:54:16 compute-0 nova_compute[182935]: 2026-01-22 00:54:16.067 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:54:16 compute-0 nova_compute[182935]: 2026-01-22 00:54:16.068 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:54:16 compute-0 nova_compute[182935]: 2026-01-22 00:54:16.171 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:54:16 compute-0 nova_compute[182935]: 2026-01-22 00:54:16.172 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:54:16 compute-0 nova_compute[182935]: 2026-01-22 00:54:16.213 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:54:16 compute-0 nova_compute[182935]: 2026-01-22 00:54:16.245 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:54:16 compute-0 nova_compute[182935]: 2026-01-22 00:54:16.246 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:54:16 compute-0 nova_compute[182935]: 2026-01-22 00:54:16.247 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:54:18 compute-0 nova_compute[182935]: 2026-01-22 00:54:18.515 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:19 compute-0 nova_compute[182935]: 2026-01-22 00:54:19.201 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:19 compute-0 nova_compute[182935]: 2026-01-22 00:54:19.201 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:54:19 compute-0 podman[253201]: 2026-01-22 00:54:19.685567709 +0000 UTC m=+0.060325845 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:54:19 compute-0 podman[253203]: 2026-01-22 00:54:19.690914447 +0000 UTC m=+0.055020207 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:54:19 compute-0 podman[253202]: 2026-01-22 00:54:19.777658054 +0000 UTC m=+0.144826888 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0)
Jan 22 00:54:19 compute-0 nova_compute[182935]: 2026-01-22 00:54:19.925 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:22 compute-0 nova_compute[182935]: 2026-01-22 00:54:22.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:54:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:54:23 compute-0 nova_compute[182935]: 2026-01-22 00:54:23.516 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:23 compute-0 nova_compute[182935]: 2026-01-22 00:54:23.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:23 compute-0 nova_compute[182935]: 2026-01-22 00:54:23.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:54:23 compute-0 nova_compute[182935]: 2026-01-22 00:54:23.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:54:23 compute-0 nova_compute[182935]: 2026-01-22 00:54:23.824 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:54:24 compute-0 nova_compute[182935]: 2026-01-22 00:54:24.927 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:26 compute-0 podman[253273]: 2026-01-22 00:54:26.690845475 +0000 UTC m=+0.064760941 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 22 00:54:28 compute-0 nova_compute[182935]: 2026-01-22 00:54:28.563 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:29 compute-0 nova_compute[182935]: 2026-01-22 00:54:29.819 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:29 compute-0 nova_compute[182935]: 2026-01-22 00:54:29.931 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:30 compute-0 nova_compute[182935]: 2026-01-22 00:54:30.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:31 compute-0 podman[253293]: 2026-01-22 00:54:31.687542354 +0000 UTC m=+0.060423629 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 00:54:31 compute-0 podman[253294]: 2026-01-22 00:54:31.715873542 +0000 UTC m=+0.076329979 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:54:32 compute-0 nova_compute[182935]: 2026-01-22 00:54:32.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:33 compute-0 nova_compute[182935]: 2026-01-22 00:54:33.564 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:33 compute-0 nova_compute[182935]: 2026-01-22 00:54:33.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:34 compute-0 nova_compute[182935]: 2026-01-22 00:54:34.934 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:35 compute-0 nova_compute[182935]: 2026-01-22 00:54:35.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:36 compute-0 nova_compute[182935]: 2026-01-22 00:54:36.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:54:38 compute-0 nova_compute[182935]: 2026-01-22 00:54:38.589 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:39 compute-0 nova_compute[182935]: 2026-01-22 00:54:39.980 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:41 compute-0 sshd-session[253334]: Invalid user developer from 188.166.69.60 port 36176
Jan 22 00:54:41 compute-0 sshd-session[253334]: Connection closed by invalid user developer 188.166.69.60 port 36176 [preauth]
Jan 22 00:54:43 compute-0 nova_compute[182935]: 2026-01-22 00:54:43.629 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:44 compute-0 nova_compute[182935]: 2026-01-22 00:54:44.984 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:48 compute-0 nova_compute[182935]: 2026-01-22 00:54:48.631 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:49 compute-0 nova_compute[182935]: 2026-01-22 00:54:49.987 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:50 compute-0 podman[253337]: 2026-01-22 00:54:50.688926569 +0000 UTC m=+0.063839194 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:54:50 compute-0 podman[253339]: 2026-01-22 00:54:50.729660661 +0000 UTC m=+0.084477847 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 00:54:50 compute-0 podman[253338]: 2026-01-22 00:54:50.755338645 +0000 UTC m=+0.114722959 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 22 00:54:53 compute-0 nova_compute[182935]: 2026-01-22 00:54:53.633 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:54 compute-0 nova_compute[182935]: 2026-01-22 00:54:54.989 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:57 compute-0 podman[253411]: 2026-01-22 00:54:57.679420351 +0000 UTC m=+0.049195935 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:54:58 compute-0 nova_compute[182935]: 2026-01-22 00:54:58.635 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:54:59 compute-0 nova_compute[182935]: 2026-01-22 00:54:59.994 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:02 compute-0 podman[253432]: 2026-01-22 00:55:02.690738784 +0000 UTC m=+0.059388078 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 00:55:02 compute-0 podman[253431]: 2026-01-22 00:55:02.698659793 +0000 UTC m=+0.069284075 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, version=9.6, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc.)
Jan 22 00:55:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:55:03.254 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:55:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:55:03.255 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:55:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:55:03.255 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:55:03 compute-0 nova_compute[182935]: 2026-01-22 00:55:03.637 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:05 compute-0 nova_compute[182935]: 2026-01-22 00:55:05.031 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:08 compute-0 nova_compute[182935]: 2026-01-22 00:55:08.676 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:10 compute-0 nova_compute[182935]: 2026-01-22 00:55:10.034 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:13 compute-0 nova_compute[182935]: 2026-01-22 00:55:13.721 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:15 compute-0 nova_compute[182935]: 2026-01-22 00:55:15.037 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:17 compute-0 nova_compute[182935]: 2026-01-22 00:55:17.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:17 compute-0 nova_compute[182935]: 2026-01-22 00:55:17.837 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:55:17 compute-0 nova_compute[182935]: 2026-01-22 00:55:17.837 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:55:17 compute-0 nova_compute[182935]: 2026-01-22 00:55:17.838 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:55:17 compute-0 nova_compute[182935]: 2026-01-22 00:55:17.838 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:55:18 compute-0 nova_compute[182935]: 2026-01-22 00:55:18.062 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:55:18 compute-0 nova_compute[182935]: 2026-01-22 00:55:18.064 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5702MB free_disk=73.11836624145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:55:18 compute-0 nova_compute[182935]: 2026-01-22 00:55:18.064 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:55:18 compute-0 nova_compute[182935]: 2026-01-22 00:55:18.065 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:55:18 compute-0 nova_compute[182935]: 2026-01-22 00:55:18.156 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:55:18 compute-0 nova_compute[182935]: 2026-01-22 00:55:18.157 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:55:18 compute-0 nova_compute[182935]: 2026-01-22 00:55:18.189 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:55:18 compute-0 nova_compute[182935]: 2026-01-22 00:55:18.206 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:55:18 compute-0 nova_compute[182935]: 2026-01-22 00:55:18.208 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:55:18 compute-0 nova_compute[182935]: 2026-01-22 00:55:18.208 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:55:18 compute-0 nova_compute[182935]: 2026-01-22 00:55:18.765 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:19 compute-0 nova_compute[182935]: 2026-01-22 00:55:19.209 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:19 compute-0 nova_compute[182935]: 2026-01-22 00:55:19.210 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:55:20 compute-0 nova_compute[182935]: 2026-01-22 00:55:20.040 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:21 compute-0 podman[253472]: 2026-01-22 00:55:21.71490982 +0000 UTC m=+0.078007122 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:55:21 compute-0 podman[253474]: 2026-01-22 00:55:21.73668333 +0000 UTC m=+0.087391386 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:55:21 compute-0 podman[253473]: 2026-01-22 00:55:21.806321473 +0000 UTC m=+0.157948372 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:55:23 compute-0 nova_compute[182935]: 2026-01-22 00:55:23.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:23 compute-0 nova_compute[182935]: 2026-01-22 00:55:23.801 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:24 compute-0 nova_compute[182935]: 2026-01-22 00:55:24.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:24 compute-0 nova_compute[182935]: 2026-01-22 00:55:24.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:55:24 compute-0 nova_compute[182935]: 2026-01-22 00:55:24.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:55:24 compute-0 nova_compute[182935]: 2026-01-22 00:55:24.811 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:55:25 compute-0 nova_compute[182935]: 2026-01-22 00:55:25.069 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:25 compute-0 sshd-session[253541]: Invalid user developer from 188.166.69.60 port 36772
Jan 22 00:55:25 compute-0 sshd-session[253541]: Connection closed by invalid user developer 188.166.69.60 port 36772 [preauth]
Jan 22 00:55:28 compute-0 podman[253543]: 2026-01-22 00:55:28.703089138 +0000 UTC m=+0.071405015 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 22 00:55:28 compute-0 nova_compute[182935]: 2026-01-22 00:55:28.835 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:30 compute-0 nova_compute[182935]: 2026-01-22 00:55:30.072 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:31 compute-0 nova_compute[182935]: 2026-01-22 00:55:31.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:31 compute-0 nova_compute[182935]: 2026-01-22 00:55:31.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:33 compute-0 podman[253562]: 2026-01-22 00:55:33.696546823 +0000 UTC m=+0.065327661 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=openstack_network_exporter, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git)
Jan 22 00:55:33 compute-0 podman[253563]: 2026-01-22 00:55:33.72493413 +0000 UTC m=+0.080493942 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:55:33 compute-0 nova_compute[182935]: 2026-01-22 00:55:33.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:33 compute-0 nova_compute[182935]: 2026-01-22 00:55:33.901 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:34 compute-0 nova_compute[182935]: 2026-01-22 00:55:34.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:35 compute-0 nova_compute[182935]: 2026-01-22 00:55:35.075 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:36 compute-0 nova_compute[182935]: 2026-01-22 00:55:36.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:55:38 compute-0 nova_compute[182935]: 2026-01-22 00:55:38.903 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:40 compute-0 nova_compute[182935]: 2026-01-22 00:55:40.104 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:43 compute-0 nova_compute[182935]: 2026-01-22 00:55:43.939 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:45 compute-0 nova_compute[182935]: 2026-01-22 00:55:45.108 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:48 compute-0 nova_compute[182935]: 2026-01-22 00:55:48.941 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:50 compute-0 nova_compute[182935]: 2026-01-22 00:55:50.111 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:52 compute-0 podman[253604]: 2026-01-22 00:55:52.691932003 +0000 UTC m=+0.062191556 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:55:52 compute-0 podman[253605]: 2026-01-22 00:55:52.735484922 +0000 UTC m=+0.103329637 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 00:55:52 compute-0 podman[253606]: 2026-01-22 00:55:52.754762053 +0000 UTC m=+0.105675994 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 00:55:53 compute-0 nova_compute[182935]: 2026-01-22 00:55:53.977 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:55 compute-0 nova_compute[182935]: 2026-01-22 00:55:55.117 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:58 compute-0 nova_compute[182935]: 2026-01-22 00:55:58.978 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:55:59 compute-0 podman[253674]: 2026-01-22 00:55:59.68555189 +0000 UTC m=+0.058997421 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 22 00:56:00 compute-0 nova_compute[182935]: 2026-01-22 00:56:00.157 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:56:03.255 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:56:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:56:03.256 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:56:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:56:03.256 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:56:03 compute-0 nova_compute[182935]: 2026-01-22 00:56:03.980 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:04 compute-0 podman[253693]: 2026-01-22 00:56:04.697165778 +0000 UTC m=+0.071755584 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter)
Jan 22 00:56:04 compute-0 podman[253694]: 2026-01-22 00:56:04.724404328 +0000 UTC m=+0.089017266 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:56:05 compute-0 nova_compute[182935]: 2026-01-22 00:56:05.160 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:08 compute-0 nova_compute[182935]: 2026-01-22 00:56:08.981 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:09 compute-0 sshd-session[253734]: Invalid user deploy from 188.166.69.60 port 48990
Jan 22 00:56:09 compute-0 sshd-session[253734]: Connection closed by invalid user deploy 188.166.69.60 port 48990 [preauth]
Jan 22 00:56:10 compute-0 nova_compute[182935]: 2026-01-22 00:56:10.165 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:14 compute-0 nova_compute[182935]: 2026-01-22 00:56:14.025 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:15 compute-0 nova_compute[182935]: 2026-01-22 00:56:15.167 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:18 compute-0 nova_compute[182935]: 2026-01-22 00:56:18.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:18 compute-0 nova_compute[182935]: 2026-01-22 00:56:18.928 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:56:18 compute-0 nova_compute[182935]: 2026-01-22 00:56:18.928 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:56:18 compute-0 nova_compute[182935]: 2026-01-22 00:56:18.929 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:56:18 compute-0 nova_compute[182935]: 2026-01-22 00:56:18.929 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.028 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.201 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.203 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5705MB free_disk=73.11836624145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.204 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.204 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.504 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.504 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.601 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.666 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.667 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.686 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.720 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.746 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.769 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.771 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:56:19 compute-0 nova_compute[182935]: 2026-01-22 00:56:19.771 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:56:20 compute-0 nova_compute[182935]: 2026-01-22 00:56:20.169 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:20 compute-0 nova_compute[182935]: 2026-01-22 00:56:20.772 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:20 compute-0 nova_compute[182935]: 2026-01-22 00:56:20.772 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:56:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:56:23 compute-0 podman[253736]: 2026-01-22 00:56:23.700848627 +0000 UTC m=+0.075903993 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:56:23 compute-0 podman[253738]: 2026-01-22 00:56:23.701057442 +0000 UTC m=+0.065994596 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:56:23 compute-0 podman[253737]: 2026-01-22 00:56:23.753058243 +0000 UTC m=+0.112125427 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 22 00:56:23 compute-0 nova_compute[182935]: 2026-01-22 00:56:23.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:24 compute-0 nova_compute[182935]: 2026-01-22 00:56:24.083 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:24 compute-0 nova_compute[182935]: 2026-01-22 00:56:24.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:24 compute-0 nova_compute[182935]: 2026-01-22 00:56:24.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:56:24 compute-0 nova_compute[182935]: 2026-01-22 00:56:24.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:56:24 compute-0 nova_compute[182935]: 2026-01-22 00:56:24.826 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:56:25 compute-0 nova_compute[182935]: 2026-01-22 00:56:25.213 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:29 compute-0 nova_compute[182935]: 2026-01-22 00:56:29.084 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:30 compute-0 nova_compute[182935]: 2026-01-22 00:56:30.216 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:30 compute-0 podman[253812]: 2026-01-22 00:56:30.712620087 +0000 UTC m=+0.078349651 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 00:56:32 compute-0 nova_compute[182935]: 2026-01-22 00:56:32.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:33 compute-0 nova_compute[182935]: 2026-01-22 00:56:33.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:34 compute-0 nova_compute[182935]: 2026-01-22 00:56:34.121 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:35 compute-0 nova_compute[182935]: 2026-01-22 00:56:35.268 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:35 compute-0 podman[253831]: 2026-01-22 00:56:35.683364241 +0000 UTC m=+0.059802039 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_id=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 22 00:56:35 compute-0 podman[253832]: 2026-01-22 00:56:35.683885253 +0000 UTC m=+0.059794869 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:56:35 compute-0 nova_compute[182935]: 2026-01-22 00:56:35.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:36 compute-0 nova_compute[182935]: 2026-01-22 00:56:36.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:36 compute-0 nova_compute[182935]: 2026-01-22 00:56:36.808 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:37 compute-0 nova_compute[182935]: 2026-01-22 00:56:37.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:56:39 compute-0 nova_compute[182935]: 2026-01-22 00:56:39.124 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:40 compute-0 nova_compute[182935]: 2026-01-22 00:56:40.312 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:44 compute-0 nova_compute[182935]: 2026-01-22 00:56:44.164 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:45 compute-0 nova_compute[182935]: 2026-01-22 00:56:45.315 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:49 compute-0 nova_compute[182935]: 2026-01-22 00:56:49.169 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:50 compute-0 nova_compute[182935]: 2026-01-22 00:56:50.320 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:52 compute-0 sshd-session[253869]: Invalid user deploy from 188.166.69.60 port 51830
Jan 22 00:56:52 compute-0 sshd-session[253869]: Connection closed by invalid user deploy 188.166.69.60 port 51830 [preauth]
Jan 22 00:56:54 compute-0 nova_compute[182935]: 2026-01-22 00:56:54.201 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:54 compute-0 podman[253871]: 2026-01-22 00:56:54.685644684 +0000 UTC m=+0.058264741 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:56:54 compute-0 podman[253873]: 2026-01-22 00:56:54.718668463 +0000 UTC m=+0.081671280 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:56:54 compute-0 podman[253872]: 2026-01-22 00:56:54.727621006 +0000 UTC m=+0.093791840 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Jan 22 00:56:55 compute-0 nova_compute[182935]: 2026-01-22 00:56:55.322 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:56:59 compute-0 nova_compute[182935]: 2026-01-22 00:56:59.203 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:00 compute-0 nova_compute[182935]: 2026-01-22 00:57:00.325 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:01 compute-0 podman[253942]: 2026-01-22 00:57:01.70644947 +0000 UTC m=+0.067931673 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 22 00:57:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:57:03.256 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:57:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:57:03.257 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:57:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:57:03.257 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:57:04 compute-0 nova_compute[182935]: 2026-01-22 00:57:04.205 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:05 compute-0 nova_compute[182935]: 2026-01-22 00:57:05.328 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:06 compute-0 podman[253961]: 2026-01-22 00:57:06.735010164 +0000 UTC m=+0.090923991 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Jan 22 00:57:06 compute-0 podman[253962]: 2026-01-22 00:57:06.776168717 +0000 UTC m=+0.120937908 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 22 00:57:09 compute-0 nova_compute[182935]: 2026-01-22 00:57:09.207 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:10 compute-0 nova_compute[182935]: 2026-01-22 00:57:10.331 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:14 compute-0 nova_compute[182935]: 2026-01-22 00:57:14.247 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:15 compute-0 nova_compute[182935]: 2026-01-22 00:57:15.378 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:19 compute-0 nova_compute[182935]: 2026-01-22 00:57:19.247 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:19 compute-0 nova_compute[182935]: 2026-01-22 00:57:19.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:19 compute-0 nova_compute[182935]: 2026-01-22 00:57:19.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:57:20 compute-0 nova_compute[182935]: 2026-01-22 00:57:20.382 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:20 compute-0 nova_compute[182935]: 2026-01-22 00:57:20.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:20 compute-0 nova_compute[182935]: 2026-01-22 00:57:20.820 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:57:20 compute-0 nova_compute[182935]: 2026-01-22 00:57:20.821 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:57:20 compute-0 nova_compute[182935]: 2026-01-22 00:57:20.821 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:57:20 compute-0 nova_compute[182935]: 2026-01-22 00:57:20.821 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:57:21 compute-0 nova_compute[182935]: 2026-01-22 00:57:21.002 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:57:21 compute-0 nova_compute[182935]: 2026-01-22 00:57:21.004 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5713MB free_disk=73.11836624145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:57:21 compute-0 nova_compute[182935]: 2026-01-22 00:57:21.005 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:57:21 compute-0 nova_compute[182935]: 2026-01-22 00:57:21.006 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:57:21 compute-0 nova_compute[182935]: 2026-01-22 00:57:21.085 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:57:21 compute-0 nova_compute[182935]: 2026-01-22 00:57:21.086 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:57:21 compute-0 nova_compute[182935]: 2026-01-22 00:57:21.120 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:57:21 compute-0 nova_compute[182935]: 2026-01-22 00:57:21.140 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:57:21 compute-0 nova_compute[182935]: 2026-01-22 00:57:21.142 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:57:21 compute-0 nova_compute[182935]: 2026-01-22 00:57:21.143 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:57:24 compute-0 nova_compute[182935]: 2026-01-22 00:57:24.143 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:24 compute-0 nova_compute[182935]: 2026-01-22 00:57:24.295 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:25 compute-0 nova_compute[182935]: 2026-01-22 00:57:25.428 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:25 compute-0 podman[254004]: 2026-01-22 00:57:25.68456798 +0000 UTC m=+0.059001069 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:57:25 compute-0 podman[254005]: 2026-01-22 00:57:25.728436238 +0000 UTC m=+0.096321270 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 00:57:25 compute-0 podman[254006]: 2026-01-22 00:57:25.730516677 +0000 UTC m=+0.095310826 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 00:57:26 compute-0 nova_compute[182935]: 2026-01-22 00:57:26.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:26 compute-0 nova_compute[182935]: 2026-01-22 00:57:26.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:57:26 compute-0 nova_compute[182935]: 2026-01-22 00:57:26.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:57:26 compute-0 nova_compute[182935]: 2026-01-22 00:57:26.823 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:57:29 compute-0 nova_compute[182935]: 2026-01-22 00:57:29.332 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:30 compute-0 nova_compute[182935]: 2026-01-22 00:57:30.464 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:32 compute-0 podman[254076]: 2026-01-22 00:57:32.671969798 +0000 UTC m=+0.049043961 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 22 00:57:32 compute-0 nova_compute[182935]: 2026-01-22 00:57:32.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:33 compute-0 sshd-session[254095]: Invalid user deploy from 188.166.69.60 port 57584
Jan 22 00:57:33 compute-0 sshd-session[254095]: Connection closed by invalid user deploy 188.166.69.60 port 57584 [preauth]
Jan 22 00:57:33 compute-0 nova_compute[182935]: 2026-01-22 00:57:33.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:34 compute-0 nova_compute[182935]: 2026-01-22 00:57:34.384 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:35 compute-0 nova_compute[182935]: 2026-01-22 00:57:35.509 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:37 compute-0 podman[254097]: 2026-01-22 00:57:37.701336671 +0000 UTC m=+0.073907865 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Jan 22 00:57:37 compute-0 podman[254098]: 2026-01-22 00:57:37.73941964 +0000 UTC m=+0.082071880 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 22 00:57:37 compute-0 nova_compute[182935]: 2026-01-22 00:57:37.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:37 compute-0 nova_compute[182935]: 2026-01-22 00:57:37.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:38 compute-0 nova_compute[182935]: 2026-01-22 00:57:38.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:57:39 compute-0 nova_compute[182935]: 2026-01-22 00:57:39.385 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:40 compute-0 nova_compute[182935]: 2026-01-22 00:57:40.511 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:44 compute-0 nova_compute[182935]: 2026-01-22 00:57:44.433 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:45 compute-0 nova_compute[182935]: 2026-01-22 00:57:45.553 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:49 compute-0 nova_compute[182935]: 2026-01-22 00:57:49.475 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:50 compute-0 nova_compute[182935]: 2026-01-22 00:57:50.557 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:54 compute-0 nova_compute[182935]: 2026-01-22 00:57:54.503 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:55 compute-0 nova_compute[182935]: 2026-01-22 00:57:55.559 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:57:56 compute-0 podman[254137]: 2026-01-22 00:57:56.69209475 +0000 UTC m=+0.066262423 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 00:57:56 compute-0 podman[254144]: 2026-01-22 00:57:56.692574532 +0000 UTC m=+0.051784967 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 00:57:56 compute-0 podman[254138]: 2026-01-22 00:57:56.742106684 +0000 UTC m=+0.099777253 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 00:57:59 compute-0 nova_compute[182935]: 2026-01-22 00:57:59.505 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:00 compute-0 nova_compute[182935]: 2026-01-22 00:58:00.562 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:58:03.258 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:58:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:58:03.258 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:58:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:58:03.259 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:58:03 compute-0 podman[254206]: 2026-01-22 00:58:03.711631066 +0000 UTC m=+0.084137348 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 00:58:04 compute-0 nova_compute[182935]: 2026-01-22 00:58:04.550 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:05 compute-0 nova_compute[182935]: 2026-01-22 00:58:05.564 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:08 compute-0 podman[254225]: 2026-01-22 00:58:08.712644193 +0000 UTC m=+0.079531750 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 00:58:08 compute-0 podman[254226]: 2026-01-22 00:58:08.734896783 +0000 UTC m=+0.092791566 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.796 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.797 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.798 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.798 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.799 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.799 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.831 182939 DEBUG nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.832 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.832 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.833 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.833 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.833 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.834 182939 WARNING nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.834 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Removable base files: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474 /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74 /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6 /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.835 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3e35ea3e3c2e0ae86eb8c4d225ee0c24002a1474
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.835 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/28d799b333bd7d52e5e892149f424e185effed74
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.836 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/5a5493d740bb49aad4d429bc7765118b91ad220a
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.836 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/bf25a60a05476d5b07338b195b9b51af2b6b007b
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.836 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/7f2508ebfd258f79131ef449d573cad936dd91f6
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.837 182939 INFO nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/f293700577693f64ada9e231fdffdcd8806f6455
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.837 182939 DEBUG nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.838 182939 DEBUG nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 22 00:58:08 compute-0 nova_compute[182935]: 2026-01-22 00:58:08.838 182939 DEBUG nova.virt.libvirt.imagecache [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 22 00:58:09 compute-0 nova_compute[182935]: 2026-01-22 00:58:09.552 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:10 compute-0 nova_compute[182935]: 2026-01-22 00:58:10.567 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:14 compute-0 nova_compute[182935]: 2026-01-22 00:58:14.591 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:14 compute-0 sshd-session[254264]: Invalid user deploy from 188.166.69.60 port 34764
Jan 22 00:58:15 compute-0 sshd-session[254264]: Connection closed by invalid user deploy 188.166.69.60 port 34764 [preauth]
Jan 22 00:58:15 compute-0 nova_compute[182935]: 2026-01-22 00:58:15.569 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:19 compute-0 nova_compute[182935]: 2026-01-22 00:58:19.624 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:19 compute-0 nova_compute[182935]: 2026-01-22 00:58:19.837 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:19 compute-0 nova_compute[182935]: 2026-01-22 00:58:19.838 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:58:20 compute-0 nova_compute[182935]: 2026-01-22 00:58:20.572 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:20 compute-0 nova_compute[182935]: 2026-01-22 00:58:20.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:20 compute-0 nova_compute[182935]: 2026-01-22 00:58:20.816 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:58:20 compute-0 nova_compute[182935]: 2026-01-22 00:58:20.816 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:58:20 compute-0 nova_compute[182935]: 2026-01-22 00:58:20.816 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:58:20 compute-0 nova_compute[182935]: 2026-01-22 00:58:20.816 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:58:20 compute-0 nova_compute[182935]: 2026-01-22 00:58:20.946 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:58:20 compute-0 nova_compute[182935]: 2026-01-22 00:58:20.947 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5708MB free_disk=73.11836624145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:58:20 compute-0 nova_compute[182935]: 2026-01-22 00:58:20.947 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:58:20 compute-0 nova_compute[182935]: 2026-01-22 00:58:20.948 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:58:21 compute-0 nova_compute[182935]: 2026-01-22 00:58:21.003 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:58:21 compute-0 nova_compute[182935]: 2026-01-22 00:58:21.003 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:58:21 compute-0 nova_compute[182935]: 2026-01-22 00:58:21.020 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:58:21 compute-0 nova_compute[182935]: 2026-01-22 00:58:21.032 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:58:21 compute-0 nova_compute[182935]: 2026-01-22 00:58:21.034 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:58:21 compute-0 nova_compute[182935]: 2026-01-22 00:58:21.034 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 00:58:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 00:58:24 compute-0 nova_compute[182935]: 2026-01-22 00:58:24.677 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:25 compute-0 nova_compute[182935]: 2026-01-22 00:58:25.034 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:25 compute-0 nova_compute[182935]: 2026-01-22 00:58:25.575 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:27 compute-0 podman[254268]: 2026-01-22 00:58:27.692288834 +0000 UTC m=+0.058548708 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:58:27 compute-0 podman[254266]: 2026-01-22 00:58:27.692321785 +0000 UTC m=+0.058343104 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:58:27 compute-0 podman[254267]: 2026-01-22 00:58:27.722538487 +0000 UTC m=+0.082755106 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:58:28 compute-0 nova_compute[182935]: 2026-01-22 00:58:28.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:28 compute-0 nova_compute[182935]: 2026-01-22 00:58:28.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:58:28 compute-0 nova_compute[182935]: 2026-01-22 00:58:28.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:58:28 compute-0 nova_compute[182935]: 2026-01-22 00:58:28.814 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:58:29 compute-0 nova_compute[182935]: 2026-01-22 00:58:29.732 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:30 compute-0 nova_compute[182935]: 2026-01-22 00:58:30.578 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:33 compute-0 nova_compute[182935]: 2026-01-22 00:58:33.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:33 compute-0 nova_compute[182935]: 2026-01-22 00:58:33.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:33 compute-0 podman[254337]: 2026-01-22 00:58:33.88359493 +0000 UTC m=+0.060100595 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 22 00:58:34 compute-0 nova_compute[182935]: 2026-01-22 00:58:34.733 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:35 compute-0 nova_compute[182935]: 2026-01-22 00:58:35.580 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:36 compute-0 nova_compute[182935]: 2026-01-22 00:58:36.064 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:38 compute-0 nova_compute[182935]: 2026-01-22 00:58:38.805 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:38 compute-0 nova_compute[182935]: 2026-01-22 00:58:38.820 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:39 compute-0 podman[254357]: 2026-01-22 00:58:39.701836993 +0000 UTC m=+0.080849151 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, release=1755695350, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal)
Jan 22 00:58:39 compute-0 podman[254358]: 2026-01-22 00:58:39.736975672 +0000 UTC m=+0.103863101 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 00:58:39 compute-0 nova_compute[182935]: 2026-01-22 00:58:39.779 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:39 compute-0 nova_compute[182935]: 2026-01-22 00:58:39.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:39 compute-0 nova_compute[182935]: 2026-01-22 00:58:39.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:40 compute-0 nova_compute[182935]: 2026-01-22 00:58:40.582 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:42 compute-0 nova_compute[182935]: 2026-01-22 00:58:42.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:44 compute-0 nova_compute[182935]: 2026-01-22 00:58:44.781 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:45 compute-0 nova_compute[182935]: 2026-01-22 00:58:45.585 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:49 compute-0 nova_compute[182935]: 2026-01-22 00:58:49.815 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:50 compute-0 nova_compute[182935]: 2026-01-22 00:58:50.587 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:51 compute-0 nova_compute[182935]: 2026-01-22 00:58:51.818 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:58:51 compute-0 nova_compute[182935]: 2026-01-22 00:58:51.819 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 00:58:54 compute-0 nova_compute[182935]: 2026-01-22 00:58:54.817 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:55 compute-0 sshd-session[254395]: Invalid user deploy from 188.166.69.60 port 51858
Jan 22 00:58:55 compute-0 sshd-session[254395]: Connection closed by invalid user deploy 188.166.69.60 port 51858 [preauth]
Jan 22 00:58:55 compute-0 nova_compute[182935]: 2026-01-22 00:58:55.589 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:58:58 compute-0 podman[254397]: 2026-01-22 00:58:58.720652331 +0000 UTC m=+0.082362987 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 00:58:58 compute-0 podman[254399]: 2026-01-22 00:58:58.736686863 +0000 UTC m=+0.089319562 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:58:58 compute-0 podman[254398]: 2026-01-22 00:58:58.758272879 +0000 UTC m=+0.111849451 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 00:58:59 compute-0 nova_compute[182935]: 2026-01-22 00:58:59.854 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:00 compute-0 nova_compute[182935]: 2026-01-22 00:59:00.591 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:59:03.260 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:59:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:59:03.260 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:59:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 00:59:03.261 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:59:04 compute-0 podman[254469]: 2026-01-22 00:59:04.720503087 +0000 UTC m=+0.087103739 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 22 00:59:04 compute-0 nova_compute[182935]: 2026-01-22 00:59:04.857 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:05 compute-0 nova_compute[182935]: 2026-01-22 00:59:05.593 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:09 compute-0 nova_compute[182935]: 2026-01-22 00:59:09.859 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:10 compute-0 nova_compute[182935]: 2026-01-22 00:59:10.596 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:10 compute-0 podman[254488]: 2026-01-22 00:59:10.778942532 +0000 UTC m=+0.102505247 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Jan 22 00:59:10 compute-0 podman[254489]: 2026-01-22 00:59:10.804245456 +0000 UTC m=+0.120121747 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 00:59:11 compute-0 nova_compute[182935]: 2026-01-22 00:59:11.809 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:11 compute-0 nova_compute[182935]: 2026-01-22 00:59:11.810 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 00:59:11 compute-0 nova_compute[182935]: 2026-01-22 00:59:11.829 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 00:59:14 compute-0 nova_compute[182935]: 2026-01-22 00:59:14.861 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:15 compute-0 nova_compute[182935]: 2026-01-22 00:59:15.599 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:19 compute-0 nova_compute[182935]: 2026-01-22 00:59:19.862 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:20 compute-0 nova_compute[182935]: 2026-01-22 00:59:20.603 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:20 compute-0 nova_compute[182935]: 2026-01-22 00:59:20.813 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:20 compute-0 nova_compute[182935]: 2026-01-22 00:59:20.814 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 00:59:20 compute-0 nova_compute[182935]: 2026-01-22 00:59:20.814 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:20 compute-0 nova_compute[182935]: 2026-01-22 00:59:20.845 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:59:20 compute-0 nova_compute[182935]: 2026-01-22 00:59:20.846 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:59:20 compute-0 nova_compute[182935]: 2026-01-22 00:59:20.846 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:59:20 compute-0 nova_compute[182935]: 2026-01-22 00:59:20.847 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 00:59:21 compute-0 nova_compute[182935]: 2026-01-22 00:59:21.068 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 00:59:21 compute-0 nova_compute[182935]: 2026-01-22 00:59:21.070 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5712MB free_disk=73.11836624145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 00:59:21 compute-0 nova_compute[182935]: 2026-01-22 00:59:21.071 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 00:59:21 compute-0 nova_compute[182935]: 2026-01-22 00:59:21.071 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 00:59:21 compute-0 nova_compute[182935]: 2026-01-22 00:59:21.150 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 00:59:21 compute-0 nova_compute[182935]: 2026-01-22 00:59:21.150 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 00:59:21 compute-0 nova_compute[182935]: 2026-01-22 00:59:21.171 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 00:59:21 compute-0 nova_compute[182935]: 2026-01-22 00:59:21.190 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 00:59:21 compute-0 nova_compute[182935]: 2026-01-22 00:59:21.192 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 00:59:21 compute-0 nova_compute[182935]: 2026-01-22 00:59:21.192 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 00:59:24 compute-0 nova_compute[182935]: 2026-01-22 00:59:24.864 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:25 compute-0 nova_compute[182935]: 2026-01-22 00:59:25.606 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:26 compute-0 nova_compute[182935]: 2026-01-22 00:59:26.173 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:28 compute-0 nova_compute[182935]: 2026-01-22 00:59:28.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:28 compute-0 nova_compute[182935]: 2026-01-22 00:59:28.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 00:59:28 compute-0 nova_compute[182935]: 2026-01-22 00:59:28.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 00:59:28 compute-0 nova_compute[182935]: 2026-01-22 00:59:28.811 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 00:59:29 compute-0 podman[254531]: 2026-01-22 00:59:29.724355419 +0000 UTC m=+0.073079305 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 00:59:29 compute-0 podman[254529]: 2026-01-22 00:59:29.727928725 +0000 UTC m=+0.086080826 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 00:59:29 compute-0 podman[254530]: 2026-01-22 00:59:29.811628783 +0000 UTC m=+0.166874435 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 00:59:29 compute-0 nova_compute[182935]: 2026-01-22 00:59:29.866 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:30 compute-0 nova_compute[182935]: 2026-01-22 00:59:30.608 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:34 compute-0 nova_compute[182935]: 2026-01-22 00:59:34.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:34 compute-0 nova_compute[182935]: 2026-01-22 00:59:34.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:34 compute-0 nova_compute[182935]: 2026-01-22 00:59:34.868 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:35 compute-0 nova_compute[182935]: 2026-01-22 00:59:35.611 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:35 compute-0 podman[254599]: 2026-01-22 00:59:35.692833157 +0000 UTC m=+0.058456595 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 22 00:59:36 compute-0 sshd-session[254618]: Invalid user deploy from 188.166.69.60 port 60508
Jan 22 00:59:36 compute-0 sshd-session[254618]: Connection closed by invalid user deploy 188.166.69.60 port 60508 [preauth]
Jan 22 00:59:39 compute-0 nova_compute[182935]: 2026-01-22 00:59:39.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:39 compute-0 nova_compute[182935]: 2026-01-22 00:59:39.894 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:40 compute-0 nova_compute[182935]: 2026-01-22 00:59:40.613 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:40 compute-0 nova_compute[182935]: 2026-01-22 00:59:40.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:41 compute-0 podman[254620]: 2026-01-22 00:59:41.737840202 +0000 UTC m=+0.103574443 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=)
Jan 22 00:59:41 compute-0 podman[254621]: 2026-01-22 00:59:41.73942217 +0000 UTC m=+0.099441995 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 22 00:59:41 compute-0 nova_compute[182935]: 2026-01-22 00:59:41.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 00:59:44 compute-0 nova_compute[182935]: 2026-01-22 00:59:44.939 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:45 compute-0 nova_compute[182935]: 2026-01-22 00:59:45.616 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:49 compute-0 nova_compute[182935]: 2026-01-22 00:59:49.940 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:50 compute-0 nova_compute[182935]: 2026-01-22 00:59:50.618 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:54 compute-0 nova_compute[182935]: 2026-01-22 00:59:54.942 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:55 compute-0 nova_compute[182935]: 2026-01-22 00:59:55.620 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 00:59:59 compute-0 nova_compute[182935]: 2026-01-22 00:59:59.980 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:00 compute-0 nova_compute[182935]: 2026-01-22 01:00:00.623 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:00 compute-0 podman[254662]: 2026-01-22 01:00:00.713688103 +0000 UTC m=+0.084091328 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 01:00:00 compute-0 podman[254663]: 2026-01-22 01:00:00.731825616 +0000 UTC m=+0.100035479 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:00:00 compute-0 podman[254668]: 2026-01-22 01:00:00.734673044 +0000 UTC m=+0.087591122 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:00:03.262 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:00:03.263 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:00:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:00:03.263 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:00:04 compute-0 nova_compute[182935]: 2026-01-22 01:00:04.983 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:05 compute-0 nova_compute[182935]: 2026-01-22 01:00:05.625 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:06 compute-0 podman[254735]: 2026-01-22 01:00:06.714111143 +0000 UTC m=+0.081879876 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 22 01:00:09 compute-0 nova_compute[182935]: 2026-01-22 01:00:09.985 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:10 compute-0 nova_compute[182935]: 2026-01-22 01:00:10.628 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:12 compute-0 podman[254757]: 2026-01-22 01:00:12.70098464 +0000 UTC m=+0.066242792 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 22 01:00:12 compute-0 podman[254756]: 2026-01-22 01:00:12.751128877 +0000 UTC m=+0.112938747 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 01:00:14 compute-0 nova_compute[182935]: 2026-01-22 01:00:14.987 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:15 compute-0 nova_compute[182935]: 2026-01-22 01:00:15.630 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:17 compute-0 sshd-session[254798]: Invalid user deploy from 188.166.69.60 port 53982
Jan 22 01:00:17 compute-0 sshd-session[254798]: Connection closed by invalid user deploy 188.166.69.60 port 53982 [preauth]
Jan 22 01:00:19 compute-0 nova_compute[182935]: 2026-01-22 01:00:19.989 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:20 compute-0 nova_compute[182935]: 2026-01-22 01:00:20.633 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:21 compute-0 nova_compute[182935]: 2026-01-22 01:00:21.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:21 compute-0 nova_compute[182935]: 2026-01-22 01:00:21.871 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:00:21 compute-0 nova_compute[182935]: 2026-01-22 01:00:21.871 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:00:21 compute-0 nova_compute[182935]: 2026-01-22 01:00:21.872 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:00:21 compute-0 nova_compute[182935]: 2026-01-22 01:00:21.872 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:00:22 compute-0 nova_compute[182935]: 2026-01-22 01:00:22.061 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:00:22 compute-0 nova_compute[182935]: 2026-01-22 01:00:22.062 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5706MB free_disk=73.11836624145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:00:22 compute-0 nova_compute[182935]: 2026-01-22 01:00:22.062 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:00:22 compute-0 nova_compute[182935]: 2026-01-22 01:00:22.062 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:00:22 compute-0 nova_compute[182935]: 2026-01-22 01:00:22.135 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:00:22 compute-0 nova_compute[182935]: 2026-01-22 01:00:22.136 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:00:22 compute-0 nova_compute[182935]: 2026-01-22 01:00:22.176 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:00:22 compute-0 nova_compute[182935]: 2026-01-22 01:00:22.188 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:00:22 compute-0 nova_compute[182935]: 2026-01-22 01:00:22.189 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:00:22 compute-0 nova_compute[182935]: 2026-01-22 01:00:22.190 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:00:23 compute-0 nova_compute[182935]: 2026-01-22 01:00:23.190 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:23 compute-0 nova_compute[182935]: 2026-01-22 01:00:23.191 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:00:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:00:24 compute-0 nova_compute[182935]: 2026-01-22 01:00:24.994 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:25 compute-0 nova_compute[182935]: 2026-01-22 01:00:25.635 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:26 compute-0 nova_compute[182935]: 2026-01-22 01:00:26.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:29 compute-0 nova_compute[182935]: 2026-01-22 01:00:29.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:29 compute-0 nova_compute[182935]: 2026-01-22 01:00:29.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:00:29 compute-0 nova_compute[182935]: 2026-01-22 01:00:29.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:00:29 compute-0 nova_compute[182935]: 2026-01-22 01:00:29.873 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:00:30 compute-0 nova_compute[182935]: 2026-01-22 01:00:30.017 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:30 compute-0 nova_compute[182935]: 2026-01-22 01:00:30.637 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:31 compute-0 podman[254800]: 2026-01-22 01:00:31.720489472 +0000 UTC m=+0.076014425 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 01:00:31 compute-0 podman[254802]: 2026-01-22 01:00:31.745100649 +0000 UTC m=+0.093907122 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 01:00:31 compute-0 podman[254801]: 2026-01-22 01:00:31.747901506 +0000 UTC m=+0.110165030 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 01:00:35 compute-0 nova_compute[182935]: 2026-01-22 01:00:35.051 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:35 compute-0 nova_compute[182935]: 2026-01-22 01:00:35.639 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:35 compute-0 nova_compute[182935]: 2026-01-22 01:00:35.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:36 compute-0 nova_compute[182935]: 2026-01-22 01:00:36.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:37 compute-0 podman[254873]: 2026-01-22 01:00:37.693633211 +0000 UTC m=+0.065621577 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 01:00:40 compute-0 nova_compute[182935]: 2026-01-22 01:00:40.093 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:40 compute-0 nova_compute[182935]: 2026-01-22 01:00:40.642 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:40 compute-0 nova_compute[182935]: 2026-01-22 01:00:40.787 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:40 compute-0 nova_compute[182935]: 2026-01-22 01:00:40.854 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:41 compute-0 nova_compute[182935]: 2026-01-22 01:00:41.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:41 compute-0 nova_compute[182935]: 2026-01-22 01:00:41.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:00:43 compute-0 podman[254893]: 2026-01-22 01:00:43.715924604 +0000 UTC m=+0.082291085 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 22 01:00:43 compute-0 podman[254892]: 2026-01-22 01:00:43.741841223 +0000 UTC m=+0.110332735 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7)
Jan 22 01:00:45 compute-0 nova_compute[182935]: 2026-01-22 01:00:45.095 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:45 compute-0 nova_compute[182935]: 2026-01-22 01:00:45.643 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:48 compute-0 nova_compute[182935]: 2026-01-22 01:00:48.204 182939 DEBUG oslo_concurrency.processutils [None req-3383b229-0645-4d84-a161-f817197299df c798bde61dce4297a27213eac66acb7f 43b70c4e837343859ac97b6b2397ba1b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 22 01:00:48 compute-0 nova_compute[182935]: 2026-01-22 01:00:48.222 182939 DEBUG oslo_concurrency.processutils [None req-3383b229-0645-4d84-a161-f817197299df c798bde61dce4297a27213eac66acb7f 43b70c4e837343859ac97b6b2397ba1b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 22 01:00:50 compute-0 nova_compute[182935]: 2026-01-22 01:00:50.095 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:50 compute-0 nova_compute[182935]: 2026-01-22 01:00:50.645 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:55 compute-0 nova_compute[182935]: 2026-01-22 01:00:55.100 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:55 compute-0 nova_compute[182935]: 2026-01-22 01:00:55.647 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:00:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:00:59.122 104408 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '16:a4:fb', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fa:c2:bb:50:eb:27'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 22 01:00:59 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:00:59.123 104408 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 22 01:00:59 compute-0 nova_compute[182935]: 2026-01-22 01:00:59.125 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:00 compute-0 nova_compute[182935]: 2026-01-22 01:01:00.132 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:00 compute-0 nova_compute[182935]: 2026-01-22 01:01:00.650 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:01 compute-0 CROND[254934]: (root) CMD (run-parts /etc/cron.hourly)
Jan 22 01:01:01 compute-0 run-parts[254937]: (/etc/cron.hourly) starting 0anacron
Jan 22 01:01:01 compute-0 anacron[254945]: Anacron started on 2026-01-22
Jan 22 01:01:01 compute-0 anacron[254945]: Normal exit (0 jobs run)
Jan 22 01:01:01 compute-0 run-parts[254947]: (/etc/cron.hourly) finished 0anacron
Jan 22 01:01:01 compute-0 CROND[254933]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 22 01:01:02 compute-0 podman[254948]: 2026-01-22 01:01:02.696706905 +0000 UTC m=+0.056625112 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 01:01:02 compute-0 podman[254949]: 2026-01-22 01:01:02.717980233 +0000 UTC m=+0.082027609 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Jan 22 01:01:02 compute-0 podman[254950]: 2026-01-22 01:01:02.719702214 +0000 UTC m=+0.081108917 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 01:01:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:01:03.263 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:01:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:01:03.264 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:01:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:01:03.264 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:01:05 compute-0 nova_compute[182935]: 2026-01-22 01:01:05.133 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:05 compute-0 nova_compute[182935]: 2026-01-22 01:01:05.652 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:08 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:01:08.125 104408 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7f404a2f-20ba-4b9b-88d6-fa3588630efa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 22 01:01:08 compute-0 podman[255020]: 2026-01-22 01:01:08.715629337 +0000 UTC m=+0.081364683 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 22 01:01:10 compute-0 nova_compute[182935]: 2026-01-22 01:01:10.137 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:10 compute-0 nova_compute[182935]: 2026-01-22 01:01:10.655 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:14 compute-0 podman[255041]: 2026-01-22 01:01:14.735041421 +0000 UTC m=+0.095991472 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 01:01:14 compute-0 podman[255040]: 2026-01-22 01:01:14.735593364 +0000 UTC m=+0.103075131 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter)
Jan 22 01:01:15 compute-0 nova_compute[182935]: 2026-01-22 01:01:15.179 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:15 compute-0 nova_compute[182935]: 2026-01-22 01:01:15.657 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:20 compute-0 nova_compute[182935]: 2026-01-22 01:01:20.182 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:20 compute-0 nova_compute[182935]: 2026-01-22 01:01:20.659 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:23 compute-0 nova_compute[182935]: 2026-01-22 01:01:23.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:23 compute-0 nova_compute[182935]: 2026-01-22 01:01:23.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:01:23 compute-0 nova_compute[182935]: 2026-01-22 01:01:23.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:23 compute-0 nova_compute[182935]: 2026-01-22 01:01:23.840 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:01:23 compute-0 nova_compute[182935]: 2026-01-22 01:01:23.841 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:01:23 compute-0 nova_compute[182935]: 2026-01-22 01:01:23.841 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:01:23 compute-0 nova_compute[182935]: 2026-01-22 01:01:23.841 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.011 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.012 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5706MB free_disk=73.11836624145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.012 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.012 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.202 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.202 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.341 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.482 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.483 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.525 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.571 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.602 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.647 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.650 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:01:24 compute-0 nova_compute[182935]: 2026-01-22 01:01:24.650 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:01:25 compute-0 nova_compute[182935]: 2026-01-22 01:01:25.229 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:25 compute-0 nova_compute[182935]: 2026-01-22 01:01:25.661 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:27 compute-0 nova_compute[182935]: 2026-01-22 01:01:27.652 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:30 compute-0 nova_compute[182935]: 2026-01-22 01:01:30.232 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:30 compute-0 nova_compute[182935]: 2026-01-22 01:01:30.663 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:30 compute-0 nova_compute[182935]: 2026-01-22 01:01:30.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:30 compute-0 nova_compute[182935]: 2026-01-22 01:01:30.793 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:01:30 compute-0 nova_compute[182935]: 2026-01-22 01:01:30.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:01:30 compute-0 nova_compute[182935]: 2026-01-22 01:01:30.831 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:01:33 compute-0 podman[255081]: 2026-01-22 01:01:33.693189241 +0000 UTC m=+0.059327897 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 01:01:33 compute-0 podman[255083]: 2026-01-22 01:01:33.717595224 +0000 UTC m=+0.067202595 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:01:33 compute-0 podman[255082]: 2026-01-22 01:01:33.735353718 +0000 UTC m=+0.096329301 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 22 01:01:35 compute-0 nova_compute[182935]: 2026-01-22 01:01:35.235 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:35 compute-0 nova_compute[182935]: 2026-01-22 01:01:35.709 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:36 compute-0 sshd-session[255152]: Received disconnect from 45.227.254.170 port 64668:11:  [preauth]
Jan 22 01:01:36 compute-0 sshd-session[255152]: Disconnected from authenticating user root 45.227.254.170 port 64668 [preauth]
Jan 22 01:01:36 compute-0 nova_compute[182935]: 2026-01-22 01:01:36.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:38 compute-0 nova_compute[182935]: 2026-01-22 01:01:38.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:39 compute-0 podman[255154]: 2026-01-22 01:01:39.710790753 +0000 UTC m=+0.081108706 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 22 01:01:40 compute-0 nova_compute[182935]: 2026-01-22 01:01:40.242 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:40 compute-0 nova_compute[182935]: 2026-01-22 01:01:40.712 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:40 compute-0 nova_compute[182935]: 2026-01-22 01:01:40.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:42 compute-0 nova_compute[182935]: 2026-01-22 01:01:42.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:43 compute-0 nova_compute[182935]: 2026-01-22 01:01:43.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:01:45 compute-0 nova_compute[182935]: 2026-01-22 01:01:45.275 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:45 compute-0 nova_compute[182935]: 2026-01-22 01:01:45.714 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:45 compute-0 podman[255174]: 2026-01-22 01:01:45.744754574 +0000 UTC m=+0.097334314 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Jan 22 01:01:45 compute-0 podman[255175]: 2026-01-22 01:01:45.756682129 +0000 UTC m=+0.101635317 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 22 01:01:50 compute-0 nova_compute[182935]: 2026-01-22 01:01:50.278 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:50 compute-0 nova_compute[182935]: 2026-01-22 01:01:50.715 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:55 compute-0 nova_compute[182935]: 2026-01-22 01:01:55.281 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:01:55 compute-0 nova_compute[182935]: 2026-01-22 01:01:55.717 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:00 compute-0 nova_compute[182935]: 2026-01-22 01:02:00.283 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:00 compute-0 nova_compute[182935]: 2026-01-22 01:02:00.722 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:02:03.265 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:02:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:02:03.265 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:02:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:02:03.266 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:02:04 compute-0 podman[255217]: 2026-01-22 01:02:04.725383362 +0000 UTC m=+0.085844820 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:02:04 compute-0 podman[255219]: 2026-01-22 01:02:04.7425048 +0000 UTC m=+0.085512642 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 01:02:04 compute-0 podman[255218]: 2026-01-22 01:02:04.754139649 +0000 UTC m=+0.115758745 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 22 01:02:05 compute-0 nova_compute[182935]: 2026-01-22 01:02:05.286 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:05 compute-0 nova_compute[182935]: 2026-01-22 01:02:05.724 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:10 compute-0 nova_compute[182935]: 2026-01-22 01:02:10.327 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:10 compute-0 nova_compute[182935]: 2026-01-22 01:02:10.727 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:10 compute-0 podman[255290]: 2026-01-22 01:02:10.752774566 +0000 UTC m=+0.118240093 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 01:02:15 compute-0 nova_compute[182935]: 2026-01-22 01:02:15.328 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:15 compute-0 nova_compute[182935]: 2026-01-22 01:02:15.730 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:16 compute-0 podman[255310]: 2026-01-22 01:02:16.735190509 +0000 UTC m=+0.098432090 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7, version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 22 01:02:16 compute-0 podman[255311]: 2026-01-22 01:02:16.752634015 +0000 UTC m=+0.110219641 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 22 01:02:20 compute-0 nova_compute[182935]: 2026-01-22 01:02:20.362 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:20 compute-0 nova_compute[182935]: 2026-01-22 01:02:20.734 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:02:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:02:23 compute-0 nova_compute[182935]: 2026-01-22 01:02:23.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:23 compute-0 nova_compute[182935]: 2026-01-22 01:02:23.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:02:23 compute-0 nova_compute[182935]: 2026-01-22 01:02:23.796 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:23 compute-0 nova_compute[182935]: 2026-01-22 01:02:23.833 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:02:23 compute-0 nova_compute[182935]: 2026-01-22 01:02:23.833 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:02:23 compute-0 nova_compute[182935]: 2026-01-22 01:02:23.834 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:02:23 compute-0 nova_compute[182935]: 2026-01-22 01:02:23.834 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:02:24 compute-0 nova_compute[182935]: 2026-01-22 01:02:24.074 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:02:24 compute-0 nova_compute[182935]: 2026-01-22 01:02:24.076 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5712MB free_disk=73.11836624145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:02:24 compute-0 nova_compute[182935]: 2026-01-22 01:02:24.076 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:02:24 compute-0 nova_compute[182935]: 2026-01-22 01:02:24.077 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:02:24 compute-0 nova_compute[182935]: 2026-01-22 01:02:24.183 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:02:24 compute-0 nova_compute[182935]: 2026-01-22 01:02:24.184 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:02:24 compute-0 nova_compute[182935]: 2026-01-22 01:02:24.214 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:02:24 compute-0 nova_compute[182935]: 2026-01-22 01:02:24.236 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:02:24 compute-0 nova_compute[182935]: 2026-01-22 01:02:24.238 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:02:24 compute-0 nova_compute[182935]: 2026-01-22 01:02:24.239 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:02:25 compute-0 nova_compute[182935]: 2026-01-22 01:02:25.365 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:25 compute-0 nova_compute[182935]: 2026-01-22 01:02:25.736 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:27 compute-0 nova_compute[182935]: 2026-01-22 01:02:27.238 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:30 compute-0 nova_compute[182935]: 2026-01-22 01:02:30.367 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:30 compute-0 nova_compute[182935]: 2026-01-22 01:02:30.739 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:32 compute-0 nova_compute[182935]: 2026-01-22 01:02:32.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:32 compute-0 nova_compute[182935]: 2026-01-22 01:02:32.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:02:32 compute-0 nova_compute[182935]: 2026-01-22 01:02:32.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:02:32 compute-0 nova_compute[182935]: 2026-01-22 01:02:32.815 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:02:35 compute-0 nova_compute[182935]: 2026-01-22 01:02:35.390 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:35 compute-0 podman[255350]: 2026-01-22 01:02:35.724950073 +0000 UTC m=+0.084226942 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 01:02:35 compute-0 podman[255352]: 2026-01-22 01:02:35.727441822 +0000 UTC m=+0.075682857 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 01:02:35 compute-0 nova_compute[182935]: 2026-01-22 01:02:35.742 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:35 compute-0 podman[255351]: 2026-01-22 01:02:35.774493375 +0000 UTC m=+0.127089494 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 22 01:02:37 compute-0 nova_compute[182935]: 2026-01-22 01:02:37.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:38 compute-0 nova_compute[182935]: 2026-01-22 01:02:38.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:40 compute-0 nova_compute[182935]: 2026-01-22 01:02:40.435 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:40 compute-0 nova_compute[182935]: 2026-01-22 01:02:40.743 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:41 compute-0 podman[255424]: 2026-01-22 01:02:41.728099098 +0000 UTC m=+0.090310356 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 22 01:02:41 compute-0 nova_compute[182935]: 2026-01-22 01:02:41.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:42 compute-0 nova_compute[182935]: 2026-01-22 01:02:42.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:42 compute-0 nova_compute[182935]: 2026-01-22 01:02:42.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:43 compute-0 nova_compute[182935]: 2026-01-22 01:02:43.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:02:45 compute-0 nova_compute[182935]: 2026-01-22 01:02:45.484 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:45 compute-0 nova_compute[182935]: 2026-01-22 01:02:45.746 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:47 compute-0 podman[255443]: 2026-01-22 01:02:47.746779543 +0000 UTC m=+0.103729567 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, release=1755695350)
Jan 22 01:02:47 compute-0 podman[255444]: 2026-01-22 01:02:47.752417537 +0000 UTC m=+0.100362196 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 22 01:02:50 compute-0 nova_compute[182935]: 2026-01-22 01:02:50.535 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:50 compute-0 nova_compute[182935]: 2026-01-22 01:02:50.749 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:55 compute-0 nova_compute[182935]: 2026-01-22 01:02:55.579 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:02:55 compute-0 nova_compute[182935]: 2026-01-22 01:02:55.751 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:00 compute-0 nova_compute[182935]: 2026-01-22 01:03:00.614 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:00 compute-0 nova_compute[182935]: 2026-01-22 01:03:00.753 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:03:03.267 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:03:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:03:03.267 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:03:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:03:03.267 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:03:05 compute-0 nova_compute[182935]: 2026-01-22 01:03:05.669 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:05 compute-0 nova_compute[182935]: 2026-01-22 01:03:05.756 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:06 compute-0 podman[255483]: 2026-01-22 01:03:06.725030264 +0000 UTC m=+0.083544024 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 01:03:06 compute-0 podman[255485]: 2026-01-22 01:03:06.734432879 +0000 UTC m=+0.075842501 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 01:03:06 compute-0 podman[255484]: 2026-01-22 01:03:06.855040018 +0000 UTC m=+0.205090736 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0)
Jan 22 01:03:10 compute-0 nova_compute[182935]: 2026-01-22 01:03:10.709 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:10 compute-0 nova_compute[182935]: 2026-01-22 01:03:10.758 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:12 compute-0 podman[255557]: 2026-01-22 01:03:12.724086724 +0000 UTC m=+0.087650314 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 01:03:15 compute-0 nova_compute[182935]: 2026-01-22 01:03:15.711 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:15 compute-0 nova_compute[182935]: 2026-01-22 01:03:15.759 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:18 compute-0 podman[255577]: 2026-01-22 01:03:18.737582747 +0000 UTC m=+0.099317292 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, version=9.6, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible)
Jan 22 01:03:18 compute-0 podman[255578]: 2026-01-22 01:03:18.7498845 +0000 UTC m=+0.104684689 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 01:03:20 compute-0 nova_compute[182935]: 2026-01-22 01:03:20.714 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:20 compute-0 nova_compute[182935]: 2026-01-22 01:03:20.760 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:23 compute-0 nova_compute[182935]: 2026-01-22 01:03:23.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:23 compute-0 nova_compute[182935]: 2026-01-22 01:03:23.905 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:03:23 compute-0 nova_compute[182935]: 2026-01-22 01:03:23.906 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:03:23 compute-0 nova_compute[182935]: 2026-01-22 01:03:23.906 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:03:23 compute-0 nova_compute[182935]: 2026-01-22 01:03:23.906 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:03:24 compute-0 nova_compute[182935]: 2026-01-22 01:03:24.191 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:03:24 compute-0 nova_compute[182935]: 2026-01-22 01:03:24.192 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5718MB free_disk=73.11836624145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:03:24 compute-0 nova_compute[182935]: 2026-01-22 01:03:24.193 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:03:24 compute-0 nova_compute[182935]: 2026-01-22 01:03:24.193 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:03:24 compute-0 nova_compute[182935]: 2026-01-22 01:03:24.287 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:03:24 compute-0 nova_compute[182935]: 2026-01-22 01:03:24.287 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:03:24 compute-0 nova_compute[182935]: 2026-01-22 01:03:24.331 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:03:24 compute-0 nova_compute[182935]: 2026-01-22 01:03:24.357 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:03:24 compute-0 nova_compute[182935]: 2026-01-22 01:03:24.360 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:03:24 compute-0 nova_compute[182935]: 2026-01-22 01:03:24.360 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:03:25 compute-0 nova_compute[182935]: 2026-01-22 01:03:25.748 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:25 compute-0 nova_compute[182935]: 2026-01-22 01:03:25.762 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:26 compute-0 nova_compute[182935]: 2026-01-22 01:03:26.360 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:26 compute-0 nova_compute[182935]: 2026-01-22 01:03:26.361 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:03:26 compute-0 nova_compute[182935]: 2026-01-22 01:03:26.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:30 compute-0 nova_compute[182935]: 2026-01-22 01:03:30.750 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:30 compute-0 nova_compute[182935]: 2026-01-22 01:03:30.763 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:33 compute-0 nova_compute[182935]: 2026-01-22 01:03:33.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:33 compute-0 nova_compute[182935]: 2026-01-22 01:03:33.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:03:33 compute-0 nova_compute[182935]: 2026-01-22 01:03:33.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:03:33 compute-0 nova_compute[182935]: 2026-01-22 01:03:33.817 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:03:35 compute-0 nova_compute[182935]: 2026-01-22 01:03:35.765 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:35 compute-0 nova_compute[182935]: 2026-01-22 01:03:35.768 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:35 compute-0 nova_compute[182935]: 2026-01-22 01:03:35.768 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:03:35 compute-0 nova_compute[182935]: 2026-01-22 01:03:35.769 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:35 compute-0 nova_compute[182935]: 2026-01-22 01:03:35.792 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:35 compute-0 nova_compute[182935]: 2026-01-22 01:03:35.793 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:37 compute-0 podman[255615]: 2026-01-22 01:03:37.753382044 +0000 UTC m=+0.100141271 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 01:03:37 compute-0 podman[255617]: 2026-01-22 01:03:37.764357306 +0000 UTC m=+0.095866769 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 22 01:03:37 compute-0 podman[255616]: 2026-01-22 01:03:37.818280564 +0000 UTC m=+0.156451386 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 22 01:03:39 compute-0 nova_compute[182935]: 2026-01-22 01:03:39.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:39 compute-0 nova_compute[182935]: 2026-01-22 01:03:39.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:40 compute-0 nova_compute[182935]: 2026-01-22 01:03:40.795 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:40 compute-0 nova_compute[182935]: 2026-01-22 01:03:40.797 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:40 compute-0 nova_compute[182935]: 2026-01-22 01:03:40.797 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:03:40 compute-0 nova_compute[182935]: 2026-01-22 01:03:40.797 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:40 compute-0 nova_compute[182935]: 2026-01-22 01:03:40.863 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:40 compute-0 nova_compute[182935]: 2026-01-22 01:03:40.864 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:42 compute-0 nova_compute[182935]: 2026-01-22 01:03:42.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:43 compute-0 podman[255685]: 2026-01-22 01:03:43.722891987 +0000 UTC m=+0.091952606 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 22 01:03:44 compute-0 nova_compute[182935]: 2026-01-22 01:03:44.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:44 compute-0 nova_compute[182935]: 2026-01-22 01:03:44.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:45 compute-0 nova_compute[182935]: 2026-01-22 01:03:45.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:45 compute-0 nova_compute[182935]: 2026-01-22 01:03:45.864 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:45 compute-0 nova_compute[182935]: 2026-01-22 01:03:45.866 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:03:45 compute-0 nova_compute[182935]: 2026-01-22 01:03:45.866 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:03:45 compute-0 nova_compute[182935]: 2026-01-22 01:03:45.866 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:45 compute-0 nova_compute[182935]: 2026-01-22 01:03:45.887 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:45 compute-0 nova_compute[182935]: 2026-01-22 01:03:45.888 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:03:49 compute-0 podman[255706]: 2026-01-22 01:03:49.732765003 +0000 UTC m=+0.097981900 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 01:03:49 compute-0 podman[255707]: 2026-01-22 01:03:49.738035459 +0000 UTC m=+0.102658452 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 01:03:50 compute-0 nova_compute[182935]: 2026-01-22 01:03:50.888 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:55 compute-0 nova_compute[182935]: 2026-01-22 01:03:55.890 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:03:58 compute-0 nova_compute[182935]: 2026-01-22 01:03:58.854 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:03:58 compute-0 nova_compute[182935]: 2026-01-22 01:03:58.854 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 01:04:00 compute-0 nova_compute[182935]: 2026-01-22 01:04:00.891 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:04:03.268 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:04:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:04:03.269 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:04:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:04:03.270 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:04:05 compute-0 nova_compute[182935]: 2026-01-22 01:04:05.893 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:08 compute-0 podman[255746]: 2026-01-22 01:04:08.729241618 +0000 UTC m=+0.094500857 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 01:04:08 compute-0 podman[255748]: 2026-01-22 01:04:08.733373307 +0000 UTC m=+0.093171625 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 22 01:04:08 compute-0 podman[255747]: 2026-01-22 01:04:08.779848306 +0000 UTC m=+0.140213047 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 22 01:04:10 compute-0 nova_compute[182935]: 2026-01-22 01:04:10.895 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:14 compute-0 podman[255817]: 2026-01-22 01:04:14.689856878 +0000 UTC m=+0.060599408 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 01:04:15 compute-0 nova_compute[182935]: 2026-01-22 01:04:15.897 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:20 compute-0 podman[255836]: 2026-01-22 01:04:20.703546796 +0000 UTC m=+0.072886351 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 22 01:04:20 compute-0 podman[255837]: 2026-01-22 01:04:20.740135489 +0000 UTC m=+0.097804396 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 22 01:04:20 compute-0 nova_compute[182935]: 2026-01-22 01:04:20.898 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:20 compute-0 nova_compute[182935]: 2026-01-22 01:04:20.899 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:20 compute-0 nova_compute[182935]: 2026-01-22 01:04:20.900 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:04:20 compute-0 nova_compute[182935]: 2026-01-22 01:04:20.900 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:20 compute-0 nova_compute[182935]: 2026-01-22 01:04:20.900 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:20 compute-0 nova_compute[182935]: 2026-01-22 01:04:20.902 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:21 compute-0 nova_compute[182935]: 2026-01-22 01:04:21.809 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:21 compute-0 nova_compute[182935]: 2026-01-22 01:04:21.809 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 01:04:21 compute-0 nova_compute[182935]: 2026-01-22 01:04:21.826 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.330 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:04:23.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:04:25 compute-0 nova_compute[182935]: 2026-01-22 01:04:25.811 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:25 compute-0 nova_compute[182935]: 2026-01-22 01:04:25.874 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:04:25 compute-0 nova_compute[182935]: 2026-01-22 01:04:25.874 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:04:25 compute-0 nova_compute[182935]: 2026-01-22 01:04:25.875 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:04:25 compute-0 nova_compute[182935]: 2026-01-22 01:04:25.875 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:04:25 compute-0 nova_compute[182935]: 2026-01-22 01:04:25.901 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:26 compute-0 nova_compute[182935]: 2026-01-22 01:04:26.086 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:04:26 compute-0 nova_compute[182935]: 2026-01-22 01:04:26.088 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5709MB free_disk=73.11844635009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:04:26 compute-0 nova_compute[182935]: 2026-01-22 01:04:26.088 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:04:26 compute-0 nova_compute[182935]: 2026-01-22 01:04:26.089 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:04:26 compute-0 nova_compute[182935]: 2026-01-22 01:04:26.157 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:04:26 compute-0 nova_compute[182935]: 2026-01-22 01:04:26.157 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:04:26 compute-0 nova_compute[182935]: 2026-01-22 01:04:26.177 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:04:26 compute-0 nova_compute[182935]: 2026-01-22 01:04:26.190 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:04:26 compute-0 nova_compute[182935]: 2026-01-22 01:04:26.192 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:04:26 compute-0 nova_compute[182935]: 2026-01-22 01:04:26.192 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:04:27 compute-0 nova_compute[182935]: 2026-01-22 01:04:27.175 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:27 compute-0 nova_compute[182935]: 2026-01-22 01:04:27.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:27 compute-0 nova_compute[182935]: 2026-01-22 01:04:27.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:04:30 compute-0 nova_compute[182935]: 2026-01-22 01:04:30.903 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:30 compute-0 nova_compute[182935]: 2026-01-22 01:04:30.905 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:30 compute-0 nova_compute[182935]: 2026-01-22 01:04:30.905 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:04:30 compute-0 nova_compute[182935]: 2026-01-22 01:04:30.905 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:30 compute-0 nova_compute[182935]: 2026-01-22 01:04:30.929 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:30 compute-0 nova_compute[182935]: 2026-01-22 01:04:30.931 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:33 compute-0 nova_compute[182935]: 2026-01-22 01:04:33.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:33 compute-0 nova_compute[182935]: 2026-01-22 01:04:33.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:04:33 compute-0 nova_compute[182935]: 2026-01-22 01:04:33.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:04:33 compute-0 nova_compute[182935]: 2026-01-22 01:04:33.810 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:04:35 compute-0 nova_compute[182935]: 2026-01-22 01:04:35.930 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:39 compute-0 podman[255874]: 2026-01-22 01:04:39.677832843 +0000 UTC m=+0.047749631 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 01:04:39 compute-0 podman[255876]: 2026-01-22 01:04:39.687596785 +0000 UTC m=+0.048105788 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:04:39 compute-0 podman[255875]: 2026-01-22 01:04:39.707503181 +0000 UTC m=+0.073048924 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 22 01:04:39 compute-0 nova_compute[182935]: 2026-01-22 01:04:39.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:40 compute-0 nova_compute[182935]: 2026-01-22 01:04:40.787 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:40 compute-0 nova_compute[182935]: 2026-01-22 01:04:40.932 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:40 compute-0 nova_compute[182935]: 2026-01-22 01:04:40.934 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:40 compute-0 nova_compute[182935]: 2026-01-22 01:04:40.934 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:04:40 compute-0 nova_compute[182935]: 2026-01-22 01:04:40.934 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:40 compute-0 nova_compute[182935]: 2026-01-22 01:04:40.986 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:40 compute-0 nova_compute[182935]: 2026-01-22 01:04:40.987 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:04:44 compute-0 nova_compute[182935]: 2026-01-22 01:04:44.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:44 compute-0 nova_compute[182935]: 2026-01-22 01:04:44.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:45 compute-0 podman[255944]: 2026-01-22 01:04:45.676494281 +0000 UTC m=+0.054201525 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 22 01:04:45 compute-0 nova_compute[182935]: 2026-01-22 01:04:45.987 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:04:46 compute-0 nova_compute[182935]: 2026-01-22 01:04:46.789 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:46 compute-0 nova_compute[182935]: 2026-01-22 01:04:46.810 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:04:51 compute-0 nova_compute[182935]: 2026-01-22 01:04:51.020 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:04:51 compute-0 podman[255964]: 2026-01-22 01:04:51.712333736 +0000 UTC m=+0.079391947 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 01:04:51 compute-0 podman[255965]: 2026-01-22 01:04:51.734850803 +0000 UTC m=+0.090003479 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 22 01:04:56 compute-0 nova_compute[182935]: 2026-01-22 01:04:56.022 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:01 compute-0 nova_compute[182935]: 2026-01-22 01:05:01.023 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:01 compute-0 nova_compute[182935]: 2026-01-22 01:05:01.025 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:01 compute-0 nova_compute[182935]: 2026-01-22 01:05:01.025 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:01 compute-0 nova_compute[182935]: 2026-01-22 01:05:01.025 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:01 compute-0 nova_compute[182935]: 2026-01-22 01:05:01.065 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:01 compute-0 nova_compute[182935]: 2026-01-22 01:05:01.065 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:05:03.269 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:05:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:05:03.270 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:05:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:05:03.270 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:05:06 compute-0 nova_compute[182935]: 2026-01-22 01:05:06.066 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:06 compute-0 nova_compute[182935]: 2026-01-22 01:05:06.068 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:06 compute-0 nova_compute[182935]: 2026-01-22 01:05:06.068 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:06 compute-0 nova_compute[182935]: 2026-01-22 01:05:06.069 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:06 compute-0 nova_compute[182935]: 2026-01-22 01:05:06.069 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:06 compute-0 nova_compute[182935]: 2026-01-22 01:05:06.070 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:10 compute-0 podman[256005]: 2026-01-22 01:05:10.701721792 +0000 UTC m=+0.072482601 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 01:05:10 compute-0 podman[256007]: 2026-01-22 01:05:10.702556992 +0000 UTC m=+0.065294469 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 22 01:05:10 compute-0 podman[256006]: 2026-01-22 01:05:10.754152314 +0000 UTC m=+0.113464429 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, container_name=ovn_controller)
Jan 22 01:05:11 compute-0 nova_compute[182935]: 2026-01-22 01:05:11.069 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:11 compute-0 nova_compute[182935]: 2026-01-22 01:05:11.072 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:16 compute-0 nova_compute[182935]: 2026-01-22 01:05:16.073 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:16 compute-0 nova_compute[182935]: 2026-01-22 01:05:16.075 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:16 compute-0 nova_compute[182935]: 2026-01-22 01:05:16.075 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:16 compute-0 nova_compute[182935]: 2026-01-22 01:05:16.075 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:16 compute-0 nova_compute[182935]: 2026-01-22 01:05:16.095 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:16 compute-0 nova_compute[182935]: 2026-01-22 01:05:16.096 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:16 compute-0 podman[256080]: 2026-01-22 01:05:16.722452197 +0000 UTC m=+0.093426390 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 22 01:05:21 compute-0 nova_compute[182935]: 2026-01-22 01:05:21.097 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:22 compute-0 podman[256099]: 2026-01-22 01:05:22.728180985 +0000 UTC m=+0.095192093 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, managed_by=edpm_ansible, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 22 01:05:22 compute-0 podman[256100]: 2026-01-22 01:05:22.750845735 +0000 UTC m=+0.102995379 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 22 01:05:25 compute-0 nova_compute[182935]: 2026-01-22 01:05:25.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:25 compute-0 nova_compute[182935]: 2026-01-22 01:05:25.838 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:05:25 compute-0 nova_compute[182935]: 2026-01-22 01:05:25.838 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:05:25 compute-0 nova_compute[182935]: 2026-01-22 01:05:25.839 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:05:25 compute-0 nova_compute[182935]: 2026-01-22 01:05:25.839 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:05:26 compute-0 nova_compute[182935]: 2026-01-22 01:05:26.022 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:05:26 compute-0 nova_compute[182935]: 2026-01-22 01:05:26.029 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5707MB free_disk=73.11844635009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:05:26 compute-0 nova_compute[182935]: 2026-01-22 01:05:26.030 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:05:26 compute-0 nova_compute[182935]: 2026-01-22 01:05:26.030 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:05:26 compute-0 nova_compute[182935]: 2026-01-22 01:05:26.099 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:26 compute-0 nova_compute[182935]: 2026-01-22 01:05:26.114 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:05:26 compute-0 nova_compute[182935]: 2026-01-22 01:05:26.114 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:05:26 compute-0 nova_compute[182935]: 2026-01-22 01:05:26.150 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:05:26 compute-0 nova_compute[182935]: 2026-01-22 01:05:26.166 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:05:26 compute-0 nova_compute[182935]: 2026-01-22 01:05:26.169 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:05:26 compute-0 nova_compute[182935]: 2026-01-22 01:05:26.170 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:05:28 compute-0 nova_compute[182935]: 2026-01-22 01:05:28.169 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:28 compute-0 nova_compute[182935]: 2026-01-22 01:05:28.171 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:28 compute-0 nova_compute[182935]: 2026-01-22 01:05:28.171 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:05:31 compute-0 nova_compute[182935]: 2026-01-22 01:05:31.101 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:33 compute-0 nova_compute[182935]: 2026-01-22 01:05:33.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:33 compute-0 nova_compute[182935]: 2026-01-22 01:05:33.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:05:33 compute-0 nova_compute[182935]: 2026-01-22 01:05:33.796 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:05:33 compute-0 nova_compute[182935]: 2026-01-22 01:05:33.818 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:05:36 compute-0 nova_compute[182935]: 2026-01-22 01:05:36.102 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:41 compute-0 nova_compute[182935]: 2026-01-22 01:05:41.102 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:41 compute-0 nova_compute[182935]: 2026-01-22 01:05:41.105 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:41 compute-0 podman[256143]: 2026-01-22 01:05:41.702677464 +0000 UTC m=+0.062799630 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:05:41 compute-0 podman[256141]: 2026-01-22 01:05:41.723149732 +0000 UTC m=+0.093773669 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 22 01:05:41 compute-0 podman[256142]: 2026-01-22 01:05:41.75991758 +0000 UTC m=+0.114700259 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 01:05:41 compute-0 nova_compute[182935]: 2026-01-22 01:05:41.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:41 compute-0 nova_compute[182935]: 2026-01-22 01:05:41.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:44 compute-0 nova_compute[182935]: 2026-01-22 01:05:44.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:46 compute-0 nova_compute[182935]: 2026-01-22 01:05:46.107 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:46 compute-0 nova_compute[182935]: 2026-01-22 01:05:46.109 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:46 compute-0 nova_compute[182935]: 2026-01-22 01:05:46.109 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:46 compute-0 nova_compute[182935]: 2026-01-22 01:05:46.109 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:46 compute-0 nova_compute[182935]: 2026-01-22 01:05:46.148 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:46 compute-0 nova_compute[182935]: 2026-01-22 01:05:46.149 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:46 compute-0 nova_compute[182935]: 2026-01-22 01:05:46.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:46 compute-0 nova_compute[182935]: 2026-01-22 01:05:46.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:05:47 compute-0 podman[256212]: 2026-01-22 01:05:47.738664743 +0000 UTC m=+0.098940743 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 22 01:05:51 compute-0 nova_compute[182935]: 2026-01-22 01:05:51.149 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:51 compute-0 nova_compute[182935]: 2026-01-22 01:05:51.151 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:51 compute-0 nova_compute[182935]: 2026-01-22 01:05:51.151 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:51 compute-0 nova_compute[182935]: 2026-01-22 01:05:51.151 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:51 compute-0 nova_compute[182935]: 2026-01-22 01:05:51.199 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:51 compute-0 nova_compute[182935]: 2026-01-22 01:05:51.199 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:53 compute-0 podman[256232]: 2026-01-22 01:05:53.690769271 +0000 UTC m=+0.065256038 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container)
Jan 22 01:05:53 compute-0 podman[256233]: 2026-01-22 01:05:53.690780351 +0000 UTC m=+0.057527483 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 01:05:56 compute-0 nova_compute[182935]: 2026-01-22 01:05:56.200 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:05:56 compute-0 nova_compute[182935]: 2026-01-22 01:05:56.202 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:05:56 compute-0 nova_compute[182935]: 2026-01-22 01:05:56.202 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:05:56 compute-0 nova_compute[182935]: 2026-01-22 01:05:56.203 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:56 compute-0 nova_compute[182935]: 2026-01-22 01:05:56.203 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:05:56 compute-0 nova_compute[182935]: 2026-01-22 01:05:56.204 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:01 compute-0 nova_compute[182935]: 2026-01-22 01:06:01.205 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:01 compute-0 nova_compute[182935]: 2026-01-22 01:06:01.207 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:01 compute-0 nova_compute[182935]: 2026-01-22 01:06:01.207 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:06:01 compute-0 nova_compute[182935]: 2026-01-22 01:06:01.207 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:01 compute-0 nova_compute[182935]: 2026-01-22 01:06:01.241 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:01 compute-0 nova_compute[182935]: 2026-01-22 01:06:01.242 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:06:03.270 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:06:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:06:03.270 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:06:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:06:03.271 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:06:06 compute-0 nova_compute[182935]: 2026-01-22 01:06:06.243 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:06 compute-0 nova_compute[182935]: 2026-01-22 01:06:06.245 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:06 compute-0 nova_compute[182935]: 2026-01-22 01:06:06.245 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:06:06 compute-0 nova_compute[182935]: 2026-01-22 01:06:06.245 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:06 compute-0 nova_compute[182935]: 2026-01-22 01:06:06.280 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:06 compute-0 nova_compute[182935]: 2026-01-22 01:06:06.280 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:11 compute-0 nova_compute[182935]: 2026-01-22 01:06:11.281 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:11 compute-0 nova_compute[182935]: 2026-01-22 01:06:11.283 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:11 compute-0 nova_compute[182935]: 2026-01-22 01:06:11.284 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:06:11 compute-0 nova_compute[182935]: 2026-01-22 01:06:11.284 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:11 compute-0 nova_compute[182935]: 2026-01-22 01:06:11.321 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:11 compute-0 nova_compute[182935]: 2026-01-22 01:06:11.322 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:12 compute-0 podman[256276]: 2026-01-22 01:06:12.698079255 +0000 UTC m=+0.058303213 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 01:06:12 compute-0 podman[256274]: 2026-01-22 01:06:12.746367267 +0000 UTC m=+0.106940983 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 01:06:12 compute-0 podman[256275]: 2026-01-22 01:06:12.77835111 +0000 UTC m=+0.133926747 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 01:06:16 compute-0 nova_compute[182935]: 2026-01-22 01:06:16.322 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:18 compute-0 podman[256350]: 2026-01-22 01:06:18.723122122 +0000 UTC m=+0.083007162 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 01:06:21 compute-0 nova_compute[182935]: 2026-01-22 01:06:21.324 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:06:23.337 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:06:24 compute-0 podman[256370]: 2026-01-22 01:06:24.708448852 +0000 UTC m=+0.073231659 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 22 01:06:24 compute-0 podman[256369]: 2026-01-22 01:06:24.755951926 +0000 UTC m=+0.115203221 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Jan 22 01:06:25 compute-0 nova_compute[182935]: 2026-01-22 01:06:25.798 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:25 compute-0 nova_compute[182935]: 2026-01-22 01:06:25.826 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:06:25 compute-0 nova_compute[182935]: 2026-01-22 01:06:25.826 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:06:25 compute-0 nova_compute[182935]: 2026-01-22 01:06:25.826 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:06:25 compute-0 nova_compute[182935]: 2026-01-22 01:06:25.826 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.012 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.013 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5708MB free_disk=73.11844635009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.014 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.014 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.190 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.190 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.262 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing inventories for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.382 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating ProviderTree inventory for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.383 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Updating inventory in ProviderTree for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.386 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.405 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing aggregate associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.431 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Refreshing trait associations for resource provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535, traits: COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.454 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.470 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.472 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:06:26 compute-0 nova_compute[182935]: 2026-01-22 01:06:26.472 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.458s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:06:29 compute-0 nova_compute[182935]: 2026-01-22 01:06:29.467 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:29 compute-0 nova_compute[182935]: 2026-01-22 01:06:29.468 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:06:29 compute-0 nova_compute[182935]: 2026-01-22 01:06:29.795 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:31 compute-0 nova_compute[182935]: 2026-01-22 01:06:31.388 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:31 compute-0 nova_compute[182935]: 2026-01-22 01:06:31.390 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:31 compute-0 nova_compute[182935]: 2026-01-22 01:06:31.390 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:06:31 compute-0 nova_compute[182935]: 2026-01-22 01:06:31.391 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:31 compute-0 nova_compute[182935]: 2026-01-22 01:06:31.432 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:31 compute-0 nova_compute[182935]: 2026-01-22 01:06:31.433 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:34 compute-0 nova_compute[182935]: 2026-01-22 01:06:34.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:34 compute-0 nova_compute[182935]: 2026-01-22 01:06:34.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:06:34 compute-0 nova_compute[182935]: 2026-01-22 01:06:34.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:06:34 compute-0 nova_compute[182935]: 2026-01-22 01:06:34.812 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:06:36 compute-0 nova_compute[182935]: 2026-01-22 01:06:36.434 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:36 compute-0 nova_compute[182935]: 2026-01-22 01:06:36.436 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:36 compute-0 nova_compute[182935]: 2026-01-22 01:06:36.436 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:06:36 compute-0 nova_compute[182935]: 2026-01-22 01:06:36.437 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:36 compute-0 nova_compute[182935]: 2026-01-22 01:06:36.468 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:36 compute-0 nova_compute[182935]: 2026-01-22 01:06:36.469 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:41 compute-0 nova_compute[182935]: 2026-01-22 01:06:41.469 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:41 compute-0 nova_compute[182935]: 2026-01-22 01:06:41.471 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:41 compute-0 nova_compute[182935]: 2026-01-22 01:06:41.471 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:06:41 compute-0 nova_compute[182935]: 2026-01-22 01:06:41.471 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:41 compute-0 nova_compute[182935]: 2026-01-22 01:06:41.509 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:41 compute-0 nova_compute[182935]: 2026-01-22 01:06:41.510 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:41 compute-0 nova_compute[182935]: 2026-01-22 01:06:41.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:42 compute-0 nova_compute[182935]: 2026-01-22 01:06:42.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:43 compute-0 podman[256407]: 2026-01-22 01:06:43.752140146 +0000 UTC m=+0.118603680 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 22 01:06:43 compute-0 podman[256409]: 2026-01-22 01:06:43.758777495 +0000 UTC m=+0.118123378 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 01:06:43 compute-0 podman[256408]: 2026-01-22 01:06:43.767267108 +0000 UTC m=+0.136873046 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 01:06:44 compute-0 nova_compute[182935]: 2026-01-22 01:06:44.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:46 compute-0 nova_compute[182935]: 2026-01-22 01:06:46.510 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:46 compute-0 nova_compute[182935]: 2026-01-22 01:06:46.512 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:46 compute-0 nova_compute[182935]: 2026-01-22 01:06:46.512 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:06:46 compute-0 nova_compute[182935]: 2026-01-22 01:06:46.512 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:46 compute-0 nova_compute[182935]: 2026-01-22 01:06:46.571 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:46 compute-0 nova_compute[182935]: 2026-01-22 01:06:46.572 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:46 compute-0 nova_compute[182935]: 2026-01-22 01:06:46.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:46 compute-0 nova_compute[182935]: 2026-01-22 01:06:46.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:49 compute-0 podman[256477]: 2026-01-22 01:06:49.730308336 +0000 UTC m=+0.090928581 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 22 01:06:50 compute-0 nova_compute[182935]: 2026-01-22 01:06:50.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:06:51 compute-0 nova_compute[182935]: 2026-01-22 01:06:51.573 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:55 compute-0 podman[256501]: 2026-01-22 01:06:55.733749589 +0000 UTC m=+0.097118140 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 22 01:06:55 compute-0 podman[256500]: 2026-01-22 01:06:55.735113441 +0000 UTC m=+0.096606037 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 22 01:06:56 compute-0 nova_compute[182935]: 2026-01-22 01:06:56.574 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:06:56 compute-0 nova_compute[182935]: 2026-01-22 01:06:56.575 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:06:56 compute-0 nova_compute[182935]: 2026-01-22 01:06:56.575 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:06:56 compute-0 nova_compute[182935]: 2026-01-22 01:06:56.575 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:56 compute-0 nova_compute[182935]: 2026-01-22 01:06:56.576 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:06:56 compute-0 nova_compute[182935]: 2026-01-22 01:06:56.577 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:01 compute-0 nova_compute[182935]: 2026-01-22 01:07:01.578 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:01 compute-0 nova_compute[182935]: 2026-01-22 01:07:01.580 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:01 compute-0 nova_compute[182935]: 2026-01-22 01:07:01.580 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:07:01 compute-0 nova_compute[182935]: 2026-01-22 01:07:01.581 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:01 compute-0 nova_compute[182935]: 2026-01-22 01:07:01.645 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:01 compute-0 nova_compute[182935]: 2026-01-22 01:07:01.646 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:07:03.271 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:07:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:07:03.272 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:07:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:07:03.272 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:07:06 compute-0 nova_compute[182935]: 2026-01-22 01:07:06.647 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:06 compute-0 nova_compute[182935]: 2026-01-22 01:07:06.648 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:06 compute-0 nova_compute[182935]: 2026-01-22 01:07:06.649 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:07:06 compute-0 nova_compute[182935]: 2026-01-22 01:07:06.649 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:06 compute-0 nova_compute[182935]: 2026-01-22 01:07:06.678 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:06 compute-0 nova_compute[182935]: 2026-01-22 01:07:06.679 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:11 compute-0 nova_compute[182935]: 2026-01-22 01:07:11.680 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:11 compute-0 nova_compute[182935]: 2026-01-22 01:07:11.682 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:11 compute-0 nova_compute[182935]: 2026-01-22 01:07:11.683 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:07:11 compute-0 nova_compute[182935]: 2026-01-22 01:07:11.683 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:11 compute-0 nova_compute[182935]: 2026-01-22 01:07:11.698 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:11 compute-0 nova_compute[182935]: 2026-01-22 01:07:11.698 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:14 compute-0 podman[256541]: 2026-01-22 01:07:14.691572579 +0000 UTC m=+0.066574110 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:07:14 compute-0 podman[256543]: 2026-01-22 01:07:14.708514694 +0000 UTC m=+0.070516435 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:07:14 compute-0 podman[256542]: 2026-01-22 01:07:14.745696091 +0000 UTC m=+0.109520965 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 22 01:07:16 compute-0 nova_compute[182935]: 2026-01-22 01:07:16.699 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:16 compute-0 nova_compute[182935]: 2026-01-22 01:07:16.701 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:16 compute-0 nova_compute[182935]: 2026-01-22 01:07:16.701 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:07:16 compute-0 nova_compute[182935]: 2026-01-22 01:07:16.702 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:16 compute-0 nova_compute[182935]: 2026-01-22 01:07:16.744 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:16 compute-0 nova_compute[182935]: 2026-01-22 01:07:16.744 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:20 compute-0 podman[256614]: 2026-01-22 01:07:20.685006562 +0000 UTC m=+0.063218690 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 22 01:07:21 compute-0 nova_compute[182935]: 2026-01-22 01:07:21.745 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:26 compute-0 podman[256634]: 2026-01-22 01:07:26.704557919 +0000 UTC m=+0.074893328 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:07:26 compute-0 podman[256633]: 2026-01-22 01:07:26.71593724 +0000 UTC m=+0.079390045 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_id=openstack_network_exporter, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Jan 22 01:07:26 compute-0 nova_compute[182935]: 2026-01-22 01:07:26.747 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:26 compute-0 nova_compute[182935]: 2026-01-22 01:07:26.748 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:26 compute-0 nova_compute[182935]: 2026-01-22 01:07:26.749 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:07:26 compute-0 nova_compute[182935]: 2026-01-22 01:07:26.749 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:26 compute-0 nova_compute[182935]: 2026-01-22 01:07:26.782 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:26 compute-0 nova_compute[182935]: 2026-01-22 01:07:26.783 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:27 compute-0 nova_compute[182935]: 2026-01-22 01:07:27.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:27 compute-0 nova_compute[182935]: 2026-01-22 01:07:27.841 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:07:27 compute-0 nova_compute[182935]: 2026-01-22 01:07:27.842 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:07:27 compute-0 nova_compute[182935]: 2026-01-22 01:07:27.842 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:07:27 compute-0 nova_compute[182935]: 2026-01-22 01:07:27.843 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:07:28 compute-0 nova_compute[182935]: 2026-01-22 01:07:28.025 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:07:28 compute-0 nova_compute[182935]: 2026-01-22 01:07:28.026 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5720MB free_disk=73.11844635009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:07:28 compute-0 nova_compute[182935]: 2026-01-22 01:07:28.026 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:07:28 compute-0 nova_compute[182935]: 2026-01-22 01:07:28.027 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:07:28 compute-0 nova_compute[182935]: 2026-01-22 01:07:28.111 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:07:28 compute-0 nova_compute[182935]: 2026-01-22 01:07:28.111 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:07:28 compute-0 nova_compute[182935]: 2026-01-22 01:07:28.144 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:07:28 compute-0 nova_compute[182935]: 2026-01-22 01:07:28.160 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:07:28 compute-0 nova_compute[182935]: 2026-01-22 01:07:28.162 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:07:28 compute-0 nova_compute[182935]: 2026-01-22 01:07:28.162 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:07:29 compute-0 nova_compute[182935]: 2026-01-22 01:07:29.162 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:29 compute-0 nova_compute[182935]: 2026-01-22 01:07:29.163 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:07:30 compute-0 nova_compute[182935]: 2026-01-22 01:07:30.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:31 compute-0 nova_compute[182935]: 2026-01-22 01:07:31.783 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:35 compute-0 nova_compute[182935]: 2026-01-22 01:07:35.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:35 compute-0 nova_compute[182935]: 2026-01-22 01:07:35.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:07:35 compute-0 nova_compute[182935]: 2026-01-22 01:07:35.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:07:35 compute-0 nova_compute[182935]: 2026-01-22 01:07:35.813 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:07:36 compute-0 nova_compute[182935]: 2026-01-22 01:07:36.786 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:41 compute-0 nova_compute[182935]: 2026-01-22 01:07:41.787 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:41 compute-0 nova_compute[182935]: 2026-01-22 01:07:41.789 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:41 compute-0 nova_compute[182935]: 2026-01-22 01:07:41.789 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:07:41 compute-0 nova_compute[182935]: 2026-01-22 01:07:41.789 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:41 compute-0 nova_compute[182935]: 2026-01-22 01:07:41.804 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:41 compute-0 nova_compute[182935]: 2026-01-22 01:07:41.804 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:42 compute-0 nova_compute[182935]: 2026-01-22 01:07:42.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:42 compute-0 nova_compute[182935]: 2026-01-22 01:07:42.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:45 compute-0 podman[256669]: 2026-01-22 01:07:45.720348775 +0000 UTC m=+0.085443841 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 22 01:07:45 compute-0 podman[256671]: 2026-01-22 01:07:45.735994258 +0000 UTC m=+0.086848994 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:07:45 compute-0 podman[256670]: 2026-01-22 01:07:45.768127115 +0000 UTC m=+0.123740705 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:07:46 compute-0 nova_compute[182935]: 2026-01-22 01:07:46.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:46 compute-0 nova_compute[182935]: 2026-01-22 01:07:46.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:46 compute-0 nova_compute[182935]: 2026-01-22 01:07:46.805 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:46 compute-0 nova_compute[182935]: 2026-01-22 01:07:46.806 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:48 compute-0 nova_compute[182935]: 2026-01-22 01:07:48.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:07:51 compute-0 podman[256737]: 2026-01-22 01:07:51.722396643 +0000 UTC m=+0.084402055 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Jan 22 01:07:51 compute-0 nova_compute[182935]: 2026-01-22 01:07:51.807 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:56 compute-0 nova_compute[182935]: 2026-01-22 01:07:56.845 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:07:56 compute-0 nova_compute[182935]: 2026-01-22 01:07:56.847 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:56 compute-0 nova_compute[182935]: 2026-01-22 01:07:56.847 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:07:56 compute-0 nova_compute[182935]: 2026-01-22 01:07:56.847 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:56 compute-0 nova_compute[182935]: 2026-01-22 01:07:56.848 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:07:56 compute-0 nova_compute[182935]: 2026-01-22 01:07:56.849 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:07:57 compute-0 podman[256757]: 2026-01-22 01:07:57.738727023 +0000 UTC m=+0.091407732 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 22 01:07:57 compute-0 podman[256756]: 2026-01-22 01:07:57.73940716 +0000 UTC m=+0.099496086 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, config_id=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Jan 22 01:08:01 compute-0 nova_compute[182935]: 2026-01-22 01:08:01.850 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:08:03.272 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:08:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:08:03.273 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:08:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:08:03.273 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:08:04 compute-0 sshd-session[256797]: Received disconnect from 45.227.254.170 port 55230:11:  [preauth]
Jan 22 01:08:04 compute-0 sshd-session[256797]: Disconnected from authenticating user root 45.227.254.170 port 55230 [preauth]
Jan 22 01:08:06 compute-0 nova_compute[182935]: 2026-01-22 01:08:06.852 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:06 compute-0 nova_compute[182935]: 2026-01-22 01:08:06.854 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:06 compute-0 nova_compute[182935]: 2026-01-22 01:08:06.854 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:08:06 compute-0 nova_compute[182935]: 2026-01-22 01:08:06.855 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:06 compute-0 nova_compute[182935]: 2026-01-22 01:08:06.894 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:06 compute-0 nova_compute[182935]: 2026-01-22 01:08:06.895 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:11 compute-0 nova_compute[182935]: 2026-01-22 01:08:11.895 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:11 compute-0 nova_compute[182935]: 2026-01-22 01:08:11.897 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:11 compute-0 nova_compute[182935]: 2026-01-22 01:08:11.897 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:08:11 compute-0 nova_compute[182935]: 2026-01-22 01:08:11.898 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:11 compute-0 nova_compute[182935]: 2026-01-22 01:08:11.899 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:11 compute-0 nova_compute[182935]: 2026-01-22 01:08:11.901 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:16 compute-0 podman[256799]: 2026-01-22 01:08:16.71296137 +0000 UTC m=+0.070981575 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 22 01:08:16 compute-0 podman[256801]: 2026-01-22 01:08:16.739786741 +0000 UTC m=+0.093281457 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:08:16 compute-0 podman[256800]: 2026-01-22 01:08:16.758526088 +0000 UTC m=+0.115151520 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, org.label-schema.vendor=CentOS)
Jan 22 01:08:16 compute-0 nova_compute[182935]: 2026-01-22 01:08:16.897 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:16 compute-0 nova_compute[182935]: 2026-01-22 01:08:16.900 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:21 compute-0 nova_compute[182935]: 2026-01-22 01:08:21.901 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:22 compute-0 podman[256866]: 2026-01-22 01:08:22.710247396 +0000 UTC m=+0.077572612 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.332 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.334 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.335 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:23 compute-0 ceilometer_agent_compute[192638]: 2026-01-22 01:08:23.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 22 01:08:26 compute-0 nova_compute[182935]: 2026-01-22 01:08:26.902 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:26 compute-0 nova_compute[182935]: 2026-01-22 01:08:26.904 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:26 compute-0 nova_compute[182935]: 2026-01-22 01:08:26.905 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:08:26 compute-0 nova_compute[182935]: 2026-01-22 01:08:26.905 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:26 compute-0 nova_compute[182935]: 2026-01-22 01:08:26.939 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:26 compute-0 nova_compute[182935]: 2026-01-22 01:08:26.940 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:27 compute-0 nova_compute[182935]: 2026-01-22 01:08:27.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:27 compute-0 nova_compute[182935]: 2026-01-22 01:08:27.830 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:08:27 compute-0 nova_compute[182935]: 2026-01-22 01:08:27.831 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:08:27 compute-0 nova_compute[182935]: 2026-01-22 01:08:27.831 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:08:27 compute-0 nova_compute[182935]: 2026-01-22 01:08:27.831 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:08:28 compute-0 nova_compute[182935]: 2026-01-22 01:08:28.096 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:08:28 compute-0 nova_compute[182935]: 2026-01-22 01:08:28.097 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5703MB free_disk=73.11844635009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:08:28 compute-0 nova_compute[182935]: 2026-01-22 01:08:28.098 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:08:28 compute-0 nova_compute[182935]: 2026-01-22 01:08:28.098 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:08:28 compute-0 nova_compute[182935]: 2026-01-22 01:08:28.161 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:08:28 compute-0 nova_compute[182935]: 2026-01-22 01:08:28.162 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:08:28 compute-0 nova_compute[182935]: 2026-01-22 01:08:28.189 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:08:28 compute-0 nova_compute[182935]: 2026-01-22 01:08:28.206 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:08:28 compute-0 nova_compute[182935]: 2026-01-22 01:08:28.207 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:08:28 compute-0 nova_compute[182935]: 2026-01-22 01:08:28.208 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:08:28 compute-0 podman[256886]: 2026-01-22 01:08:28.741133925 +0000 UTC m=+0.093685608 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 01:08:28 compute-0 podman[256885]: 2026-01-22 01:08:28.742747153 +0000 UTC m=+0.102886607 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vendor=Red Hat, Inc., release=1755695350)
Jan 22 01:08:29 compute-0 nova_compute[182935]: 2026-01-22 01:08:29.208 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:29 compute-0 nova_compute[182935]: 2026-01-22 01:08:29.209 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:08:30 compute-0 nova_compute[182935]: 2026-01-22 01:08:30.794 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:31 compute-0 nova_compute[182935]: 2026-01-22 01:08:31.941 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:31 compute-0 nova_compute[182935]: 2026-01-22 01:08:31.942 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:31 compute-0 nova_compute[182935]: 2026-01-22 01:08:31.942 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:08:31 compute-0 nova_compute[182935]: 2026-01-22 01:08:31.942 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:31 compute-0 nova_compute[182935]: 2026-01-22 01:08:31.943 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:31 compute-0 nova_compute[182935]: 2026-01-22 01:08:31.945 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:36 compute-0 nova_compute[182935]: 2026-01-22 01:08:36.943 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:37 compute-0 nova_compute[182935]: 2026-01-22 01:08:37.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:37 compute-0 nova_compute[182935]: 2026-01-22 01:08:37.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:08:37 compute-0 nova_compute[182935]: 2026-01-22 01:08:37.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:08:37 compute-0 nova_compute[182935]: 2026-01-22 01:08:37.812 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:08:41 compute-0 nova_compute[182935]: 2026-01-22 01:08:41.946 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:41 compute-0 nova_compute[182935]: 2026-01-22 01:08:41.948 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:41 compute-0 nova_compute[182935]: 2026-01-22 01:08:41.949 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:08:41 compute-0 nova_compute[182935]: 2026-01-22 01:08:41.949 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:41 compute-0 nova_compute[182935]: 2026-01-22 01:08:41.989 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:41 compute-0 nova_compute[182935]: 2026-01-22 01:08:41.990 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:42 compute-0 nova_compute[182935]: 2026-01-22 01:08:42.807 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:43 compute-0 nova_compute[182935]: 2026-01-22 01:08:43.792 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:46 compute-0 nova_compute[182935]: 2026-01-22 01:08:46.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:46 compute-0 nova_compute[182935]: 2026-01-22 01:08:46.991 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:46 compute-0 nova_compute[182935]: 2026-01-22 01:08:46.993 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:46 compute-0 nova_compute[182935]: 2026-01-22 01:08:46.994 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:08:46 compute-0 nova_compute[182935]: 2026-01-22 01:08:46.994 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:47 compute-0 nova_compute[182935]: 2026-01-22 01:08:47.029 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:47 compute-0 nova_compute[182935]: 2026-01-22 01:08:47.030 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:47 compute-0 podman[256924]: 2026-01-22 01:08:47.673726516 +0000 UTC m=+0.048419308 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:08:47 compute-0 podman[256925]: 2026-01-22 01:08:47.717377597 +0000 UTC m=+0.084490057 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:08:47 compute-0 podman[256926]: 2026-01-22 01:08:47.724879946 +0000 UTC m=+0.089678111 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 22 01:08:47 compute-0 nova_compute[182935]: 2026-01-22 01:08:47.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:48 compute-0 nova_compute[182935]: 2026-01-22 01:08:48.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:52 compute-0 nova_compute[182935]: 2026-01-22 01:08:52.031 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:52 compute-0 nova_compute[182935]: 2026-01-22 01:08:52.033 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:52 compute-0 nova_compute[182935]: 2026-01-22 01:08:52.034 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:08:52 compute-0 nova_compute[182935]: 2026-01-22 01:08:52.034 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:52 compute-0 nova_compute[182935]: 2026-01-22 01:08:52.079 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:52 compute-0 nova_compute[182935]: 2026-01-22 01:08:52.080 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:53 compute-0 podman[256998]: 2026-01-22 01:08:53.715640787 +0000 UTC m=+0.068323382 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 22 01:08:53 compute-0 nova_compute[182935]: 2026-01-22 01:08:53.788 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:57 compute-0 nova_compute[182935]: 2026-01-22 01:08:57.081 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:57 compute-0 nova_compute[182935]: 2026-01-22 01:08:57.082 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:08:57 compute-0 nova_compute[182935]: 2026-01-22 01:08:57.082 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:08:57 compute-0 nova_compute[182935]: 2026-01-22 01:08:57.082 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:57 compute-0 nova_compute[182935]: 2026-01-22 01:08:57.108 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:08:57 compute-0 nova_compute[182935]: 2026-01-22 01:08:57.109 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:08:58 compute-0 nova_compute[182935]: 2026-01-22 01:08:58.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:59 compute-0 nova_compute[182935]: 2026-01-22 01:08:59.080 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:08:59 compute-0 podman[257019]: 2026-01-22 01:08:59.730653145 +0000 UTC m=+0.088405091 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Jan 22 01:08:59 compute-0 podman[257020]: 2026-01-22 01:08:59.731993937 +0000 UTC m=+0.084117778 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 22 01:09:02 compute-0 nova_compute[182935]: 2026-01-22 01:09:02.110 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:02 compute-0 nova_compute[182935]: 2026-01-22 01:09:02.112 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:02 compute-0 nova_compute[182935]: 2026-01-22 01:09:02.112 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:09:02 compute-0 nova_compute[182935]: 2026-01-22 01:09:02.113 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:02 compute-0 nova_compute[182935]: 2026-01-22 01:09:02.146 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:02 compute-0 nova_compute[182935]: 2026-01-22 01:09:02.148 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:09:03.273 104408 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:09:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:09:03.273 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:09:03 compute-0 ovn_metadata_agent[104403]: 2026-01-22 01:09:03.273 104408 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:09:07 compute-0 nova_compute[182935]: 2026-01-22 01:09:07.149 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:07 compute-0 nova_compute[182935]: 2026-01-22 01:09:07.150 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:07 compute-0 nova_compute[182935]: 2026-01-22 01:09:07.150 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:09:07 compute-0 nova_compute[182935]: 2026-01-22 01:09:07.151 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:07 compute-0 nova_compute[182935]: 2026-01-22 01:09:07.152 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:07 compute-0 nova_compute[182935]: 2026-01-22 01:09:07.153 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:09 compute-0 nova_compute[182935]: 2026-01-22 01:09:09.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:09 compute-0 nova_compute[182935]: 2026-01-22 01:09:09.794 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 22 01:09:12 compute-0 nova_compute[182935]: 2026-01-22 01:09:12.154 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:12 compute-0 nova_compute[182935]: 2026-01-22 01:09:12.156 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:12 compute-0 nova_compute[182935]: 2026-01-22 01:09:12.156 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:09:12 compute-0 nova_compute[182935]: 2026-01-22 01:09:12.157 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:12 compute-0 nova_compute[182935]: 2026-01-22 01:09:12.189 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:12 compute-0 nova_compute[182935]: 2026-01-22 01:09:12.190 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:17 compute-0 nova_compute[182935]: 2026-01-22 01:09:17.191 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:18 compute-0 podman[257056]: 2026-01-22 01:09:18.720997885 +0000 UTC m=+0.087792097 container health_status 06d9f0aceb8da94f130505e23f673f543c54575e857a6c52892ba3a0248146c5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 22 01:09:18 compute-0 podman[257063]: 2026-01-22 01:09:18.722602503 +0000 UTC m=+0.068805763 container health_status ffe8ce68bf2eb6caeabf1e1b2aec6003b21b34f14869743d311fd6e3b54d8c88 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 22 01:09:18 compute-0 podman[257057]: 2026-01-22 01:09:18.744078677 +0000 UTC m=+0.102296863 container health_status 5aa10f71a7f3cb8a6adc16033fcc2ff9bf1f5182ff9825aa9f0581cf1207ca61 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 22 01:09:22 compute-0 nova_compute[182935]: 2026-01-22 01:09:22.194 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:22 compute-0 nova_compute[182935]: 2026-01-22 01:09:22.196 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:22 compute-0 nova_compute[182935]: 2026-01-22 01:09:22.196 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:09:22 compute-0 nova_compute[182935]: 2026-01-22 01:09:22.196 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:22 compute-0 nova_compute[182935]: 2026-01-22 01:09:22.245 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:22 compute-0 nova_compute[182935]: 2026-01-22 01:09:22.246 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:24 compute-0 podman[257131]: 2026-01-22 01:09:24.70314864 +0000 UTC m=+0.070837222 container health_status 86fc4ea490792e79908e04ce901bd3a7138eb3000793d6e6a209ad6769c0318c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 22 01:09:24 compute-0 nova_compute[182935]: 2026-01-22 01:09:24.810 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:24 compute-0 nova_compute[182935]: 2026-01-22 01:09:24.811 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 22 01:09:24 compute-0 nova_compute[182935]: 2026-01-22 01:09:24.827 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 22 01:09:27 compute-0 nova_compute[182935]: 2026-01-22 01:09:27.246 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:27 compute-0 nova_compute[182935]: 2026-01-22 01:09:27.248 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:27 compute-0 nova_compute[182935]: 2026-01-22 01:09:27.248 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:09:27 compute-0 nova_compute[182935]: 2026-01-22 01:09:27.248 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:27 compute-0 nova_compute[182935]: 2026-01-22 01:09:27.248 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:27 compute-0 nova_compute[182935]: 2026-01-22 01:09:27.249 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:28 compute-0 nova_compute[182935]: 2026-01-22 01:09:28.810 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:28 compute-0 nova_compute[182935]: 2026-01-22 01:09:28.810 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 22 01:09:29 compute-0 nova_compute[182935]: 2026-01-22 01:09:29.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:29 compute-0 nova_compute[182935]: 2026-01-22 01:09:29.827 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:09:29 compute-0 nova_compute[182935]: 2026-01-22 01:09:29.828 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:09:29 compute-0 nova_compute[182935]: 2026-01-22 01:09:29.828 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:09:29 compute-0 nova_compute[182935]: 2026-01-22 01:09:29.829 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 22 01:09:30 compute-0 nova_compute[182935]: 2026-01-22 01:09:30.072 182939 WARNING nova.virt.libvirt.driver [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 22 01:09:30 compute-0 nova_compute[182935]: 2026-01-22 01:09:30.075 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5710MB free_disk=73.11844635009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 22 01:09:30 compute-0 nova_compute[182935]: 2026-01-22 01:09:30.075 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 22 01:09:30 compute-0 nova_compute[182935]: 2026-01-22 01:09:30.076 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 22 01:09:30 compute-0 nova_compute[182935]: 2026-01-22 01:09:30.235 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 22 01:09:30 compute-0 nova_compute[182935]: 2026-01-22 01:09:30.235 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 22 01:09:30 compute-0 nova_compute[182935]: 2026-01-22 01:09:30.263 182939 DEBUG nova.compute.provider_tree [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed in ProviderTree for provider: 5f09a77c-505f-4bd3-ac26-41f43ebdf535 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 22 01:09:30 compute-0 nova_compute[182935]: 2026-01-22 01:09:30.282 182939 DEBUG nova.scheduler.client.report [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Inventory has not changed for provider 5f09a77c-505f-4bd3-ac26-41f43ebdf535 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 22 01:09:30 compute-0 nova_compute[182935]: 2026-01-22 01:09:30.283 182939 DEBUG nova.compute.resource_tracker [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 22 01:09:30 compute-0 nova_compute[182935]: 2026-01-22 01:09:30.283 182939 DEBUG oslo_concurrency.lockutils [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 22 01:09:30 compute-0 podman[257153]: 2026-01-22 01:09:30.695553811 +0000 UTC m=+0.060293010 container health_status 6ed30c7dd50ab44ab8426e0f63c14c69efa003107e42e3f443305001c445e89d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '7a1458d90194082253c8c3286824b124909978da03e47c58e7f56809fdd5366e-f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 22 01:09:30 compute-0 podman[257152]: 2026-01-22 01:09:30.727117514 +0000 UTC m=+0.084276432 container health_status 40d214d69e1c296e473b237e797a6a9f37af92a6520054ced7198a571267ee53 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'f8605786ec50a28e92a630f334db69d5713a8ff36a168e44f80e610c31743850-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 22 01:09:32 compute-0 nova_compute[182935]: 2026-01-22 01:09:32.292 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:32 compute-0 nova_compute[182935]: 2026-01-22 01:09:32.293 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:32 compute-0 nova_compute[182935]: 2026-01-22 01:09:32.293 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:32 compute-0 nova_compute[182935]: 2026-01-22 01:09:32.293 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:09:32 compute-0 nova_compute[182935]: 2026-01-22 01:09:32.294 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:32 compute-0 nova_compute[182935]: 2026-01-22 01:09:32.294 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:32 compute-0 nova_compute[182935]: 2026-01-22 01:09:32.295 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:32 compute-0 sshd-session[257193]: Accepted publickey for zuul from 192.168.122.10 port 55196 ssh2: ECDSA SHA256:yVd9eaNlUxI8uClDfn5gH9n5gqPIh4AgIE0cN4DTPFE
Jan 22 01:09:32 compute-0 systemd-logind[784]: New session 66 of user zuul.
Jan 22 01:09:32 compute-0 systemd[1]: Started Session 66 of User zuul.
Jan 22 01:09:32 compute-0 sshd-session[257193]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 22 01:09:32 compute-0 sudo[257197]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 22 01:09:32 compute-0 sudo[257197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 22 01:09:36 compute-0 ovs-vsctl[257371]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 22 01:09:37 compute-0 nova_compute[182935]: 2026-01-22 01:09:37.295 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:37 compute-0 virtqemud[182477]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 22 01:09:37 compute-0 virtqemud[182477]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 22 01:09:37 compute-0 virtqemud[182477]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 22 01:09:38 compute-0 crontab[257777]: (root) LIST (root)
Jan 22 01:09:38 compute-0 nova_compute[182935]: 2026-01-22 01:09:38.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:38 compute-0 nova_compute[182935]: 2026-01-22 01:09:38.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 22 01:09:38 compute-0 nova_compute[182935]: 2026-01-22 01:09:38.795 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 22 01:09:38 compute-0 nova_compute[182935]: 2026-01-22 01:09:38.823 182939 DEBUG nova.compute.manager [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 22 01:09:40 compute-0 systemd[1]: Starting Hostname Service...
Jan 22 01:09:40 compute-0 systemd[1]: Started Hostname Service.
Jan 22 01:09:42 compute-0 nova_compute[182935]: 2026-01-22 01:09:42.126 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:42 compute-0 nova_compute[182935]: 2026-01-22 01:09:42.298 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:42 compute-0 nova_compute[182935]: 2026-01-22 01:09:42.300 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 22 01:09:42 compute-0 nova_compute[182935]: 2026-01-22 01:09:42.300 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 22 01:09:42 compute-0 nova_compute[182935]: 2026-01-22 01:09:42.300 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:42 compute-0 nova_compute[182935]: 2026-01-22 01:09:42.341 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 29 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 22 01:09:42 compute-0 nova_compute[182935]: 2026-01-22 01:09:42.342 182939 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 22 01:09:43 compute-0 nova_compute[182935]: 2026-01-22 01:09:43.791 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 22 01:09:45 compute-0 nova_compute[182935]: 2026-01-22 01:09:45.793 182939 DEBUG oslo_service.periodic_task [None req-1bf9a4d3-9245-4dcf-b2bf-dda0f139ddd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
